This is my last official post in the role of News Editor at StepForth Placement. As of the middle of the month, I will be a free agent. As of the end of this post, I am on vacation time. It's a strange feeling.

It has been an incredible adventure and a true privilege working here for the past six years. In that time, the website optimization and search engine marketing sector has grown from a tiny cottage industry to become one of the most significant sectors operating on the Internet.

Thursday, June 29th, 2006

Google Iceberg (beta)

Ever seen an iceberg? They are magnificent ice mountains, frozen floating islands bobbing around the most northern and southern oceans. Aside from the fact they are frozen, the coolest thing about an iceberg is that only about 1/10th of the mass of the berg is visible above the water. Knowing the other 9/10ths of the mass exists below the surface adds oomph to the awe.

Google is like an iceberg. There is so much happening beneath the surface that even the most well-informed observers can find themselves confounded and confused when contemplating Google’s full spectrum of services. Apparently, a similar sensation is felt around the Googleplex, where an initiative to refocus on the core mission, “… to organize the world’s information and make it universally accessible and useful,” is said to be underway.

Google has grown and diversified as rapidly as the web environment around it has, often placing itself on or even beyond the cutting edge of communication technologies. Its impact on our society and economy is hugely helpful and distressingly disruptive at the same time. Through its own innovation and a series of acquisitions, Google has managed to make an entry in most, if not all, major online marketing venues and is in the business of creating an ongoing stream of online marketing assets. It also has ambitions to venture into the traditional print, radio, and video ad markets, anticipating the inevitable migration to digital delivery of these mediums.

Much of Google’s tremendous growth was spurred by its wildly successful stock offerings. The company went public in August 2004. Before their initial public offering, Google was the most important search engine on the Internet. Slightly less than two years later, Google has become one of the three most important and influential media companies in the world. Google is going where the big-media money is, a place known for its dramatic effect on the attitudes of those who inhabit it.

Regardless of what Google is, or might actually become, the general public still thinks of it primarily as a free-for-use search engine. That perception is important to Google because information accessibility, the heart of Google’s core mission, is facilitated through some form of search function. Being known as the world’s favourite search engine gives Google a significant advantage when it comes to attracting advertisers. This is the reason Google, like its rivals at Yahoo, MSN and Ask, will invest the bulk of its resources in facilitating and improving search functions. Reliable search makes loyal users.

Conversely, a significant loss of public confidence in Google’s credibility is a real risk. Over the past two years, Google has faced a steady stream of criticism for many of the choices it has made. Investment types criticise Google’s secretive and sometimes bizarre ways of communicating financial matters. Social activists criticise Google’s compliance with Chinese government censorship of “sensitive” search results. Ironically (but necessarily), content creators and copyright holders criticise Google for living up to its stated mission goals.

Google changes the methods it uses to rank websites on a relatively constant basis. As new technologies, design methods or social trends accumulate users, the body of information those users eventually compile will have an effect on rankings in Google’s index. That’s because Google’s spiders are designed to follow links, and practically every document that exists has a link directed towards it. When blogs were first popularized about two years ago, they had a massive effect on Google rankings because of the interlinked social nature of blogs, the tendency of some to abuse the comment sections, and the sudden emergence of millions of automated or out-sourced blogs.

As a search engine, Google is facing a great deal of criticism from webmasters and online merchants who contend that Google hasn’t paid enough attention to the relevance of its organic search results. When it does pay attention, some suggest that Google’s cure for spam is often worse than the symptoms its engineers were trying to correct. The recent, six-month long algorithm updates are a case in point. Many search engine optimizers continue to scratch their heads at the strange state of Google’s search results.

The organic, or free, search results at Google have been in a constant state of flux for almost a year now. Along with the Jagger algorithm update and the Bigdaddy infrastructure upgrade, Google has been subtly introducing a few new factors to the search results most users see. Keen observers, like those at Search Engine Watch, often spot experimental insertions of content pulled from services such as Google Base, News, Maps and Google Images.

These appearances indicate a high degree of research and development around local search and attempts at providing a degree of personalized search. Google invests approximately 70% of staff resources on search functionality. While its stated, public goal involves making the world’s information accessible, its unstated, internal goal is to make its various features, functions and tools work together as a system.

Google is moving forward to integrate its various tools and functions into a stronger set of search tools, all of which are expected to be monetized primarily through AdWords advertising though some will present direct costs to consumers.

Later this week, Google is expected to release an online payment system that has been nicknamed Gbuy. While there are few confirmed details about the system, the clear intent would be to develop an online payment system that works with Google Base listings. This would give Google and its users an ecommerce platform that could seriously challenge the space eBay currently occupies.

Google is also actively assembling the components for a server-side suite of collaborative office-use software as a challenge to Microsoft’s Office suite. Last year it purchased online word processor software called Writely. Earlier this year, Google acquired an online spreadsheet application, naming its new service Google Spreadsheets. Combine an online word processor and spreadsheet application with Gmail and Google Calendar. Add Google’s photo-sharing suite Picasa, in the role of a rudimentary PowerPoint, and Google Desktop’s ability to locate any file on a computer or shared network, and you have the background tools of a useful operating system.

Google is an advertising company built on providing free space to frame paid advertising around. Its first priority is to its core mission of making information available, especially if there is a way to tie advertising to it. Its other mission is to position itself for battle against Microsoft, Yahoo, News Corp., Ask, and any other contender that might happen along its path. It is aggressive, overly confident, and often perceived as arrogant. It is also one of the world’s largest media companies and responsible for providing the results of approximately half the searches made each day.

Google’s continued growth is virtually guaranteed but it is a guarantee that is entirely theirs to squander.

In their favour, they have been far more open and communicative with their constituents than their rivals have. Though unorthodox, Google’s mature CEO and mostly mature co-founders have a habit of shooting from the hip when it comes to making public comments. For search engine optimizers, Google’s quality czar, Matt Cutts, provides a forum for information exchange on his blog and is the A-list celebrity at any search-related function he appears at. His blog is a daily must-read.

Working against Google is the fact that the real world of big business is wild, scandalous and fraught with difficult, literally world-altering decisions. They, and their primary audience, are grassroots sorts of people suddenly working on the biggest of international stages. No matter what decisions Google makes, there will be a controversial outcome for someone.

A few weeks ago, StepForth’s sales manager, Bill Stroll, took a well deserved holiday. That gave me the opportunity to sit in his chair for a few days, monitoring emails from clients and queries from potential clients. My primary focus was to answer client questions and respond to information requests that simply couldn’t wait until Bill was back at his desk.

Sitting in front of his computer gave me a chance to take another look at a random sampling of websites interested in SEO. From time to time, I tabbed over to see some of the site-review questionnaire responses our system had recently handled.

Search engine optimization is obviously becoming more popular. We’re handling a lot more review requests. Many of the sites processed by our review system were already well-designed sites ready for optimization. Many others, however, were simply not up to a standard of design or topical clarity at which our SEO services would help. It’s a hard thing to tell someone, but someone has to do it: the website needs to be redesigned.

Online competition has increased dramatically year after year. Today there are more websites doing business in every commercial sector than there were yesterday. Though the search engines are better able to sort information and rank relevant sites against keyword queries, achieving search engine placements for smaller sites has gotten more difficult as the environment evolves.

Recent changes to the ranking algorithms at Google and Yahoo place increased importance on site user experience, making people-friendly web design an important component of SEO. Because the search engines want to gauge the usefulness of any given document or link, they track the movement of visitors from one web document to another. When larger numbers of visitors travel to a site and spend more time gathering information and following its links, the search engines tend to think favourably about that site. Similarly, when visitors click in and quickly click out, their leaving is noted and the action is somewhat scored against the site. It’s nothing personal, it’s just technology judging technology.
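
To picture the kind of engagement signal being described, consider a toy calculation of bounce rate and average visit length from a log of visitor sessions. This is purely a hypothetical Python sketch; the visit data is invented and nothing here reflects how Google or Yahoo actually measure engagement.

    from datetime import datetime

    # Hypothetical visitor sessions: (entry time, exit time, pages viewed).
    visits = [
        (datetime(2006, 6, 29, 10, 0), datetime(2006, 6, 29, 10, 6), 4),
        (datetime(2006, 6, 29, 10, 2), datetime(2006, 6, 29, 10, 2, 15), 1),
        (datetime(2006, 6, 29, 11, 30), datetime(2006, 6, 29, 11, 41), 7),
    ]

    # A "bounce" here is a session that viewed only one page before leaving.
    bounces = sum(1 for _, _, pages in visits if pages == 1)
    avg_seconds = sum((end - start).total_seconds() for start, end, _ in visits) / len(visits)

    print(f"Bounce rate: {bounces / len(visits):.0%}")
    print(f"Average time on site: {avg_seconds:.0f} seconds")

A site whose visitors stay longer and view more pages simply looks more useful than one where most sessions end after a single quick glance.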

When a website is somehow unprepared to meet the standards we believe search spiders or human visitors are looking for, we call it not-ready-for-primetime. It’s a much gentler term to use than others we’ve tossed about. Not-ready-for-primetime sites come in all shapes and sizes and represent all sorts of businesses. The one thing they have in common is that, in their current condition, their chances of achieving strong search rankings are dim. They are often constructed as if they were afterthoughts, as brochures by people focused squarely on doing business in the real world.

When we come across sites that are not quite ready for primetime, we tend to recommend site redesign. The problem with recommending redesign as a prerequisite of SEO work is that it needs to be factored into a preset marketing budget. Often, site owners are unable or unwilling to invest in site redesign and either go seeking help or affirmation from other search marketing shops, or give up altogether.

The easiest way to avoid presenting an unfriendly face to search engine spiders is to start from the basics and work your way up. Here are a few quick tips on spotting elements of your website that might not be as search engine friendly as they could or should be.

Every website, good or bad, begins with a site structure. Some structures are better for search spiders than others. There are two areas to consider when thinking about site structure, regardless of the eventual size of the site. The first is how the overall site file and directory tree will look. The second is how the first few levels of that tree will be laid out.

The overall site should be structured to allow for long-term growth. As more files or documents are added to the site, the designer will want to ensure that search spiders will find those files without too much trouble. That means limiting the number of directory levels as much as practically possible.
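
To get a feel for how deep a site’s documents sit, a short script can count the directory levels in each URL. The sketch below is a hypothetical Python illustration; the example.com addresses are made up and would, in practice, come from a crawl or a site map.

    from urllib.parse import urlparse

    # Hypothetical URLs; in practice these would come from a crawl or a site map.
    urls = [
        "http://www.example.com/",
        "http://www.example.com/services/seo.html",
        "http://www.example.com/archive/2006/06/old-press-release.html",
    ]

    for url in urls:
        path = urlparse(url).path
        # Count the directory levels between the domain root and the document.
        depth = max(len([part for part in path.split("/") if part]) - 1, 0)
        print(f"{depth} level(s) deep: {url}")

Anything reported several levels deep is a candidate for being moved closer to the root, or at least for being linked to from a higher-level page.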

The first few levels of a site are extremely important for strong search rankings. Documents or files found on the first few pages of any site tend to contain the most unique, descriptive content. These documents are most likely to receive incoming links from other sites and are most likely to be entrance points to specific products or services offered on pages found deeper in the site. Establishing easily followed paths for search spiders and for live visitors is important.

The next thing that makes a site not-ready-for-primetime is topical focus and clarity of message. In a competitive search engine environment, choosing a theme and sticking to it is generally good advice.

We often see sites that try to sell hundreds of unrelated consumer items or travel services. These types of sites pose two problems. First, there is no overall theme to think about when determining keywords to target. Secondly, much of the content on sites like this is lifted from other online sources, likely already existing in Google’s index.

If these businesses were to segment their sites into niches or facets of the industries they are trying to represent and build a number of sites dedicated to those facets, chances are those sites would perform much better on the search engines.

Another series of elements that can make a site not-ready-for-primetime is found in previous attempts at search engine or network marketing. A reality of web-based business is that a little information can be extremely dangerous if applied incorrectly. We often come across sites that have joined web-rings or link-exchanges, or have remnants of spammy SEO techniques left over from a previous run-in with less ethical SEOs. We tend to see these sites just after they have been delisted or have seen their rankings degrade over time.

A site redesign is a serious commitment. Once it is undertaken, a whole range of planning, copywriting and meetings are in order. This process is often good for an online business as it forces the business to focus on how it conducts business online, and how to make that business better.

Perhaps the truest measure of the need to redesign a website comes not from the needs of website marketers but from the experience of the website owners themselves. Is the site producing revenues or attracting business of some form or another? Is it capable of returning some if not all of the money invested in it? If not, the best search engine placements on the Internet are not going to be much use.

The need for search-friendly design is obvious; the demand is real.

Wednesday, June 21st, 2006

Dear Friends…

A couple of years ago, when I was the head SEO here at StepForth Placement, that was the way I would open all letters, notices or bulletins to our clients: “Dear Friends.”

While “friends” might not be the most appropriate business-like greeting, it was the one that I felt best suited the relationship I wanted to establish between the company and its most important assets (and renewable resources), our customers.

One of the questions readers and clients most frequently email to StepForth Placement’s SEO staff revolves around how websites can best be optimized to meet the algorithmic needs of each of the four major search engines: Google, Yahoo, MSN and Ask.

The more things change, the more they stay the same. Though there have been wide-sweeping changes in the organic search engine landscape over the past six months, the fundamental ways search engines operate remain the same.

Founder Bill Gates will be stepping away from the bulk of his duties at Microsoft in two years’ time. Starting in July 2008, Gates, aged 50, will devote the majority of his time to the management of the Bill and Melinda Gates Foundation, the world’s largest charitable organization.

Gates made the announcement yesterday afternoon at a news conference held after the markets had closed for the day. Gates will remain involved as Chairman of the corporation and will retain his vast stock holdings. “I always see myself as being the largest shareholder in Microsoft,” Gates told reporters. He holds about 9.5% of all Microsoft stock, estimated to be worth $21.6 billion.

Search engine optimization, as a practicing sector of the greater search engine marketing industry, has seen an upswing in business over the past few months. This trend is fueled by a number of concurrent factors, not the least of which is the actual effectiveness of organic search placement.

Today’s search marketing metaverse is made up of a multiplicity of mash-ups. Paid and organic results now appear in any number of venues beyond the traditional SERPs that are directly or indirectly associated with a branded search engine or social network. While search has been big business for over five years, a growing sophistication is entering the marketing space as creative people find intelligent and interesting ways to get a growing number of applications to work together.

It is a strange phenomenon of North American society that the fight for Freedom of Information should so often come down to money. Those who have it tend to get freer access to information than those who don’t. A small group of former monopolists (who have lots of it) can even control how information moves across the ‘net, even to the point of placing virtual toll booths across the formerly free-flowing information superhighway.

That’s the impression left after reading and rereading the 62 pages of legislation named the Communications Opportunity, Promotion and Enhancement Act of 2006 (HR5252) and the attendant controversy surrounding its voyage through the US Congress.

It took a little while to start to figure it out. Such things almost always do. After months of observation, research, discussion and debate, Search Engine Optimization experts appear to be getting a better handle on the effects of Google’s Bigdaddy infrastructure upgrades.

From mid-winter until this week, StepForth has strongly advised our clients to be conservative with any changes to their sites until enough time has passed for us, along with many others in the SEO community, to observe, analyze and articulate our impressions of the upgrade. About ten days ago, the light at the end of the intellectual tunnel became clearly visible, and SEO discussion forums are abuzz with productive and proactive conversations regarding how to deal with a post-Bigdaddy Google environment.

To make the long back-story short: in September 2005, Google began implementation of a three-part algorithm update that became known as the Jagger Update. Shortly after completing the algo update in late November, Google began upgrading its server and data-storage network, an effort dubbed the Bigdaddy Infrastructure Upgrade. The Bigdaddy upgrade took several months to roll out completely across all of Google’s data centers, which are rumoured to number in the hundreds.

In other words, the world’s most popular search engine has, in one way or another, been in a constant state of flux since September. The only solid information SEOs had to pass on to curious clients amounted to time-tested truisms about good content, links and site structure. Being the responsible sort we are, no good SEO wanted to say anything definite for fear of being downright wrong and misdirecting others.

Starting in the middle of May and increasing towards the end of the month, ideas and theories that had been thrown around SEO related forums and discussion groups started to solidify into the functional knowledge that makes up the intellectual inventory of good SEO firms.

Guided by the timely information leaks from Google’s quality control czar Matt Cutts, discussion in the SEO community surrounding Bigdaddy related issues has been led by members of the forums WebMasterWorld, SEW (1) (2) (3), Threadwatch, SERoundtable (1) (2) (3), and Cre8asite (1) (2). SEO writers Aaron Wall, Jarrod Hunt, and Mark Daoust have also added their observations to the conversation in a number of separate articles.

By now, most good SEOs should be able to put their fingers on issues related to Bigdaddy fairly quickly and help work out a strategy for sites that were adversely affected by the upgrade. The first thing to note about the cumulative effects of Jagger and Bigdaddy is the intent of Google engineers to remove much of the poor quality or outright spammy commercial content that was clogging up their search results.

The intended targets of the Bigdaddy update go beyond sites that commit simple violations against Google’s webmaster guidelines to include affiliate sites, results gleaned from other search tools, duplicate content, poor quality sites and sites with obviously gamey link networks. In some cases, Google was targeting sites designed primarily to attract users to click on paid-search advertisements.

After the implementation of the algo update and infrastructure upgrades, SEOs have seen changes in the following areas: Site/Document Quality Scoring, Duplicate Content Filtering and Link Intention Analysis.

The first area noted is site or document quality scoring. Did you know there are now more web documents online than there are people on the planet? Many, if not most, of those documents are highly professional, and some are sort of scrappy. While Google is not looking for perfection, it is trying to assess which pages are more useful than others, and attention to quality design and content is one of the criteria.

Quality design simply means giving Google full access to all areas of the site webmasters want spidered. Smart site and directory structures tend to place spiderable information as high in the directory tree as possible. While Google is capable of spidering deeply into database sites, it appears to prefer to visit higher level directories much more frequently. We have noted that Google agent visits do tend to correspond with update times set via Google Sitemaps.
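
For those using Google Sitemaps, the sitemap file itself is just a short XML listing of URLs with optional last-modified dates and change frequencies. Below is a minimal, hypothetical Python sketch that writes such a file for an imaginary example.com site; the namespace shown is the current published sitemap schema, and a real site would list its own URLs.

    import xml.etree.ElementTree as ET

    # Hypothetical pages for an imaginary example.com site.
    pages = [
        ("http://www.example.com/", "2006-06-29", "daily"),
        ("http://www.example.com/services/seo.html", "2006-06-15", "weekly"),
    ]

    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc, lastmod, changefreq in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc              # the page address
        ET.SubElement(url, "lastmod").text = lastmod      # when it last changed
        ET.SubElement(url, "changefreq").text = changefreq

    ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)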

Accessibility and usability issues are thought to make up elements of the Jagger algorithm update, making the way visitors use a site or document, and the amount of time they spend engaged in a user session associated with it, important factors in ranking and placement outcomes. Internal and outbound links should be placed with care in order to make navigating through and away from a site as easy as possible for site visitors.

Another quality design issue involves letting Google know which “version” of your site is the correct one. For most sites, a website can be accessed with or without typing the “www” part of the URL (e.g.: http://stepforth.com , http://www.stepforth.com , http://www.stepforth.com/index.shtml ). This presents a rather funny problem for Google. Because links directed into a site might vary in the way they are written, sometimes it doesn’t know which “version” of a site is the correct one to keep in its cache. To Google, each of the variations of the URL above could be perceived as a unique website, an issue known as “canonicalization”, a subject Matt Cutts addressed on his blog in early January.

“Suppose you want your default URL to be http://www.example.com . You can make your webserver so that if someone requests http://example.com/ , it does a 301 (permanent) redirect to http://www.example.com/ . That helps Google know which url you prefer to be canonical. Adding a 301 redirect can be an especially good idea if your site changes often (e.g. dynamic content, a blog, etc.).”
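
A quick way to confirm the redirect is behaving as described is to request the bare (non-www) address and look at the status code and Location header returned. Here is a rough Python sketch using the example.com addresses from the quote above; it is an illustration, not a StepForth tool.

    import http.client

    # Ask the non-www host for its home page; a well-configured server should
    # answer with a 301 pointing at the preferred (canonical) www address.
    conn = http.client.HTTPConnection("example.com")
    conn.request("GET", "/")
    response = conn.getresponse()

    print("Status:", response.status)                    # expect 301 for a permanent redirect
    print("Location:", response.getheader("Location"))   # expect http://www.example.com/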

Quality content is a bit harder to manage and a lot harder to define. Content is a word used to describe the stuff in or on a web document and could include everything from text and images to Flash files, audio or video, and links.

There are two basic rules in regards to content. It should be there to inform and assist the site user’s experience, and it should be, as much as possible, original.

Making an easy-to-use site that provides visitors with the information they are looking for is the responsibility of webmasters, but there are a few simple ways to show you are serious about its presentation.

Focus on your topic and stick to it. Many of the sites and documents that have found themselves demoted or de-listed during the Bigdaddy upgrade were sites that delved across several topics at the same time without presenting a clear theme. Given the option between documents with clear themes and documents without clear themes, Google’s choice is obvious.

Google is working to weed out duplicate content. Google appears to be looking for incidents of duplicate content in order to try to promote the originator of that content over the replications. This has hit sites in the vertical search and affiliate marketing sectors, real estate sites, and even retail sites that carry brand-name products especially hard. Several shopping or product-focused database sites have seen hundreds or even thousands of pages falling out of Google’s main index.

In many cases, there is little or nothing to do for this except to start writing unique content for products listed in the databases of such sites. Many real estate sites, for example, use the same local information sources as their competitors do and all tend to draw content from the same selection of MLS listings. It’s not that Google thinks this content is “useless”, it’s that Google already has several other examples of the same content and is not interested in displaying duplicate listings.
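
As a rough illustration of why shared listing feeds are a problem, the overlap between two descriptions can be estimated by comparing their word “shingles”. The Python sketch below is a toy example with invented listing text; it says nothing about how Google actually detects duplicates.

    def shingles(text, size=4):
        """Break text into overlapping runs of `size` words."""
        words = text.lower().split()
        return {tuple(words[i:i + size]) for i in range(max(len(words) - size + 1, 1))}

    def overlap(a, b):
        """Jaccard similarity of two texts: 1.0 means identical shingle sets."""
        sa, sb = shingles(a), shingles(b)
        return len(sa & sb) / len(sa | sb)

    # Two hypothetical listing descriptions drawn from the same MLS feed.
    original = "Charming three bedroom home with ocean views and a newly renovated kitchen"
    copy = "Charming three bedroom home with ocean views and a newly renovated kitchen plus garage"
    print(f"Estimated overlap: {overlap(original, copy):.0%}")

When two listings score this close to identical, a search engine has no reason to show both; unique, hand-written descriptions are the only real remedy.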

Many of the listings previously enjoyed by large database-driven sites have fallen into a Google index known as the supplemental listings database. Supplemental listings are introduced to the general listings shown to Google users when there are no better examples to choose from to meet the user’s search query. This is the same index that is often referred to as the “Google Sandbox”.

The last major element noted in the discussions surrounding Bigdaddy is how much more robust Google’s link analysis has become. Aside from site quality and duplicate content issues, most webmasters will find answers to riddles posed by Bigdaddy in their network of links.

In order to ferret out the intent of webmasters, Google has increased the importance of links, both inbound and outbound. Before the updates, an overused tactic for strong placement at Google saw webmasters trying to bulk up on incoming links from wherever they could. This practice saw the rise of link farms, link exchanges and poorly planned reciprocal link networks.

One of the ways Google tries to judge the intent of webmasters is by mapping the network of incoming and outgoing links associated with a domain. Links, according to the gospel of Google, should exist only as navigation or information references that benefit the site visitor. Google examines the outbound links from a page or document and compares them against its list of inbound links, checking how many match up and how many are directed towards, or coming from, pages featuring unrelated or irrelevant content. Links pointing to or from irrelevant content, or reciprocal links between topically unrelated sites, are easily spotted and their value to the overall site ranking downgraded or even eliminated.
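
To picture what that mapping involves in miniature, the sketch below compares a site’s outbound link targets against its inbound link sources and flags the domains that appear in both lists. The domains are invented, and this is a toy Python illustration rather than Google’s actual analysis.

    # Hypothetical link data for one site: domains it links out to, and domains
    # that link in to it (from a backlink report).
    outbound_links = {"widget-reviews.example", "blue-widgets.example", "casino-offers.example"}
    inbound_links = {"blue-widgets.example", "casino-offers.example", "local-news.example"}

    # Domains that both link to the site and receive a link back are reciprocal partners.
    reciprocal = outbound_links & inbound_links
    one_way_in = inbound_links - outbound_links

    print("Reciprocal link partners:", sorted(reciprocal))
    print("One-way inbound links:", sorted(one_way_in))

A long list of reciprocal partners on unrelated topics is exactly the sort of pattern the post-Bigdaddy Google is said to discount.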

The subject of links brings up an uncomfortable flaw in Google’s inbound link analysis that is being referred to as Google Bowling. As part of its scan of the network of links associated with a document or URL, Google keeps a detailed record of who links to whom, how long each link has been established, and whether there is a reciprocal link back, along with several other items.

One of those items appears to be an examination of how and why webmasters might purchase links from another site. While bought links are not technically a rankings killer, a bulk of such links purchased from irrelevant sites in a short period of time can effectively destroy a site or document’s current or potential rankings. In an article published at WebProNews last autumn, Michael Perdone from e-TrafficJams speculates on the issue.

Google has tried to deal with the predatory practice of “Google Bowling” by considering the behaviour of webmasters whose sites have seen a number of inbound links from “bad neighbourhoods” suddenly appear. If a site that has incoming links from bad places also has or creates links directed outbound to bad places, the incoming links are judged more harshly. If, on the other hand, a website has a sudden influx of bad-neighbourhood links but does not contain outbound links directed to bad places, the inbound ones might not be judged as harshly.

The combination of the Bigdaddy upgrade and the Jagger algorithm update has made Google a better search engine and is a precursor to the integration of video content and other information pulled from other Google services, such as Google Base, Google Maps and Google Groups, into the general search results.

Before the completion of both, Google’s search results were increasingly displaying a number of poor-quality results stemming from a legion of scraped-content “splog” sites and phoney directories that had sprung up in efforts to exploit the AdSense payment system. Bigdaddy and Jagger are a combined effort to offer improved, more accurate rankings while, at the same time, expanding Google’s ability to draw and distribute content across its multiple arms and networks.

Moving forward, that is what users should expect Google to do. Google is no longer a set of static results updated on a timed schedule. It is constantly updating and rethinking its rankings, especially in light of the number of people trying to use those rankings for their own commercial gain.

The effect of the dual upgrades seems to be settling out. Credible, informative sites should have nothing to worry about in the post-Bigdaddy environment. As Google is trying to move into the most mainstream areas of modern marketing, credibility is its chief concern. The greatest threat to Google’s dominance does not come from other tech firms. It comes from the users themselves who, if displeased with the results being shown by Google, could migrate en masse to another search engine.

They say it started in the autumn of 2004 in an online chat room. It ended on Saturday with the arrest of twelve adults and five minors by police and security services across southern Ontario. In a massive show of brute force and imaginative investigative cooperation between law enforcement agencies, Canadian security officials shut down a homegrown group of terrorists allegedly planning one or more attacks similar to the one that destroyed the Federal Building in Oklahoma City eleven years ago.

Information beyond the names and addresses of those arrested is obviously difficult to obtain at this time, though media speculation suggests one of the intended targets was the office complex beside the Metro Toronto Convention Center (the location of the annual Toronto Search Engine Strategies conference), which houses the Toronto branches of CSIS (Canadian Security Intelligence Service) and the RCMP (Royal Canadian Mounted Police).
