Wednesday, April 13th, 2005

The Web We Weave, Linking for Google: April 2005

Over the past week, SEOs and SEMs have noted some significant changes in the search engine results delivered by Google. Google appears to be actively cleaning its listings by targeting sites using suspicious link-building techniques. A couple of well-known search engine marketing sites have vanished from Google results under keyword phrases they dominated just last week.

The sudden disappearance of these sites, along with a notable difference in search results under other highly competitive phrases has led many in the SEO/SEM industry to conclude Google has implemented some of the spam-link busting filters outlined in their 63-point patent document published two weeks ago. After examining results displayed at Google since Friday April 8, we too are drawn to this conclusion. In other words, something has changed in the way Google ranks sites. Given a lack of any other credible information, we are looking toward the sorting methods and ranking techniques Google has protected under U.S. and international patent laws to provide details.

As stated in previous articles, one thing to be very clear about is that nobody except a very small number of Google engineers can claim to know the exact variables Google uses to populate its ranking algorithms. We do know how Google and other spider-driven search engines operate, how they operated in previous months or years, and the outcomes those operations have produced historically over time. Having watched search engines for years, experienced SEO and SEM firms can make such predictions and assumptions with some degree of accuracy. After all is said and done, the proof is always in the pudding, so to speak, and our predictive assumptions are either proven or shown false in the search engine results pages.

This time, the big “trigger target” for Google appears to be links. As anyone who has followed search engine optimization techniques knows, Google puts a lot of stock in the value of links between documents. PageRank remains the core concept of Google’s general algorithm though the weights and measures used to determine “page rank” as we understand it have changed radically over the years.

Back in the earliest days, one link equaled one positive vote, a rather clean sorting concept that worked extremely well in a much cleaner Internet environment. As Google rose to become the dominant search engine, the search marketing industry focused its attention on Google. An amazingly vast pool of brainpower started to deconstruct every nuance of the basic algorithm, making changes, shifts or additions to the algorithm cause for lively discussion and analysis at any one of a dozen search marketing discussion forums. A very small number of Google search engineers, no matter how extraordinarily intelligent they are individually or collectively, simply can’t keep up with the SEO/SEM industry without periodically making sweeping changes to the core algorithm. If Google ever loses its dominance in the sector, the next search firm to dominate will, without question, face similar concerns. We have seen similar algorithm updates in the past, the greatest being the Florida Update of November 2003. This week’s update was not nearly as severe as Florida, at least not yet. Given that this suspected update is based on measuring the value of specific links, it might be weeks or even months before we see the full results.
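That original “one link, one positive vote” idea can be illustrated with a minimal power-iteration sketch. To be clear, this is a textbook simplification, not Google’s actual implementation; the example graph and the damping factor are purely illustrative:

```python
# Minimal PageRank sketch: each outbound link shares a page's rank
# among its targets as a weighted "vote". Illustrative only --
# Google's production ranking is vastly more complex than this.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping page -> list of pages it links to."""
    pages = set(links) | {t for targets in links.values() for t in targets}
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, targets in links.items():
            if targets:
                share = damping * rank[page] / len(targets)
                for t in targets:
                    new_rank[t] += share
        rank = new_rank
    return rank

# Bob and a news page both vote for Jane, so Jane ends up ranked highest.
web = {"Bob": ["Jane"], "Jane": ["Bob", "News"], "News": ["Jane"]}
ranks = pagerank(web)
```

Even in this toy graph, the page receiving the most independent votes accumulates the most rank, which is exactly the property the SEO industry learned to exploit.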

If you or someone you know has been engaged in a link-building plan that relies on link trading between multiple sites that don’t actually relate to or do business with each other, you might want to take a few hours to examine your link-building strategies.

About four weeks ago, an article appeared in Wired Magazine telling the world how simple it was to game Google by bulking up on links. The article became a focal point for discussion in many circles and may have been inadvertently responsible for a notable rise in the number of link-trading email spam offers. It may also have alerted Google that it was high time to implement a number of new link-evaluation filters designed to separate the good from the bad. This idea has been the subject of a few recent articles and is backed up by several sections of the 63-point patent document.

To recap the central theme of the patent document, Google compiles document profiles based on the historic data of several elements relating to every URL in its index. The historic data included in that profile plays a determining role in the various scores, or points, Google assigns documents when generating keyword-driven search results. It is therefore easy to extrapolate that the recent update is based on historic data regarding links.
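One way to picture such a history-aware document profile is a record of dated link events that a scoring pass can consult. This sketch is purely speculative; the class and field names are invented for illustration, and the patent describes the idea, not any implementation:

```python
# Hypothetical sketch of a history-aware document profile.
# All names and the burst heuristic are invented for illustration.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class LinkEvent:
    source_url: str
    anchor_text: str
    first_seen: date   # benchmark date: when the crawler noticed the link
    last_seen: date

@dataclass
class DocumentProfile:
    url: str
    link_events: list = field(default_factory=list)

    def links_added_within(self, start: date, end: date) -> int:
        """Count links first seen inside a window. A sudden burst of
        new links in a short window could look unnatural to a filter."""
        return sum(1 for e in self.link_events
                   if start <= e.first_seen <= end)

profile = DocumentProfile("http://example.com/widgets")
profile.link_events.append(
    LinkEvent("http://bob.example", "blue widgets",
              date(2005, 3, 1), date(2005, 4, 10)))
march_burst = profile.links_added_within(date(2005, 3, 1), date(2005, 3, 31))
```

The point of the structure is simply that every link carries dates, so any of the time-based questions discussed below can be answered by querying the profile.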

It is also easy to extrapolate another assumption, though this one is a bit of a stretch. There has not been a visible backlink or subsequent “PageRank” update in months. Together these two thoughts might indicate that Google’s index has become a lot more fluid with micro-updates that affect unique sets of document profiles as opposed to massive updates that could put the entire index in flux for weeks at a time.

Link building is, and should always be, an essential part of the search engine optimization process. All spider-driven search engines find new documents by following links. This is the underlying concept of the “world wide web” analogy. Linkage between documents is what the web was built for, and Google will value these links as long as the web exists. In this way, Google is a victim of its own success. It is the world’s most popular search engine and it values links more than any other search engine. It stands to reason that the hyper-brainiac forces of the SEO/SEM world spent a lot of time figuring out elaborate link-generation schemes. These schemes, by the way, are pretty far from the spirit of the evolving web as I understood it a decade ago. Good links made a useful web. Links designed primarily to get attention under multiple keyword phrases are not so good. Perhaps the “O” in SEO should also stand for “organic”. Google really appreciates links that develop ORGANICALLY.

Bob links to Jane because Bob thinks Jane has information relevant to viewers of Bob’s document. As it turns out, Bob was right and the anchor text he used to phrase the link accurately represented the content found on Jane’s document. Bob was not paid to link to Jane. As a matter of fact, Bob expects nothing in return except perhaps a better environment for his site-visitors. Both Bob and Jane score good points in their document profiles and everyone lives happily ever after in a naive representation of an intellectual nirvana.

As the web works today, Jane is almost certainly selling something to pay for the high cost of providing good information while retaining the ability to pay her mortgage. Jane therefore benefits from a link provided by Bob and wants to get as many as she possibly can, knowing that if she ranks higher than anyone else, she will likely make more sales. The moment Bob sees Jane building links for financial benefit, he starts to think of what he can get in return for a link. An industry built to game Google is thus born, and Google engineers start to worry about how their link-driven results are perceived by the search-surfing public.

Google is using a number of logical measures to both predetermine and actively determine the value of each and every link it follows. Google is interested in the long-term behaviour of links and compiles a life-cycle analysis of links as part of the document profiles associated with all documents in its index.

Here are a few observations and questions for performing link analysis. While there is no proof Google will or will not consider these points in relation to a document at any given time, there is plenty of evidence that webmasters and search marketers should consider them at all times. According to a number of sections of the patent (particularly those numbered in the 50s), Google is capable of a much wider analysis of links and their purpose than previously thought.

When new links are added, Google examines how their appearance or disappearance affects other links associated with the document. The timing of a link’s appearance is important to Google. If a number of links to a new or existing document appear at once, Google would like to be able to easily gauge the value of each of those links. One of the ways it does that is by date. When did the link appear? What other links were present on the document the link came from when the link appeared? How does the presence or disappearance of various links on that document affect the relevancy of the document, or the perceived relevance of the document it points to?

How do documents networked by links relate to each other over time?
Links can change over time. Google wants to be able to judge if a link is seasonal or time-driven as part of its weighting criteria. One of the ways it judges time-driven, seasonal or event-driven linkage is by trends associated with documents connected by links. Are there similar link-trends shared by documents that are linked together?

What date did the fresh link appear?
When did Google notice a fresh link exists? The date Google becomes aware of a link is a benchmark date. Google compares a number of other factors against that date in the profiles of documents associated with that link.

What anchor text was associated with the link?
Google uses anchor text as a relevancy determinant. A link using “blue widgets” as its anchor text should therefore link to a document directly associated with blue widgets.
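A crude way to check that kind of anchor/content agreement is a simple term-overlap test. This is a deliberately naive sketch of the general idea; real relevancy scoring would be far more sophisticated than vocabulary matching:

```python
# Naive anchor-text relevancy check: does every term in the anchor's
# vocabulary appear in the target document's text? Illustrative only.

def anchor_matches_target(anchor_text: str, target_text: str) -> bool:
    anchor_terms = set(anchor_text.lower().split())
    target_terms = set(target_text.lower().split())
    # A subset test: the anchor promises nothing the target lacks.
    return anchor_terms <= target_terms

relevant = anchor_matches_target(
    "blue widgets", "we sell blue widgets and red gadgets")
irrelevant = anchor_matches_target(
    "green widgets", "we sell blue widgets and red gadgets")
```

An anchor reading “blue widgets” pointing at a page that never mentions blue widgets would fail even this simple test, which is the mismatch the article suggests Google is watching for.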

When did links directed to a document start using specific keyword phrases as anchor text?
Again, Google refers to a benchmark date. In this case, it compares the benchmark start date against those of other links in the document profile.

Does that anchor text change?
The next two obvious questions are, when and to what. Google uses this information to track link-campaigns and to distinguish link-spam advertising from active, organic links. For instance, a link with anchor text that remains static might be judged harshly if other links on the page are also static. If that same link was found on a page where other links changed from time to time, Google would take a brighter view of the value of that link.

When the anchor text of a link changes, was that change relevant to changes in document content?
If the anchor text of a link changes in relation to content on the document linked to, chances are the link was placed with care and consideration. Google would then assign a higher score. If, however, the anchor text is noted to change without any relation to the text on the document linked to, there is a chance the link is part of a keyword-link branding campaign.

Google is using a number of other factors to determine the validity of links, some of which involve the behaviours of those who follow links to documents in Google’s index. Determining the value of a link also means considering if human-users think the link is valuable.

The concept of document profiles is very real. Google is making a list and checking it more than twice when determining the value of links and the webs they weave. Google examines these link-webs as they relate to both individual documents and the sites they are associated with. When building, buying, placing or otherwise acquiring links in an SEO or SEM campaign, it is wise to think about what Google is going to think about that link. One thing you know for certain is that Google is going to think quite a bit about it and every other link associated with it.

