Monday, January 29th, 2007

Correction for: The Most Common Reason for Dropped Rankings


Hello all, I wanted to clear up a significant issue with my recent article “The Most Common Reason for Dropped Rankings: Duplication”. It was edited closely, but apparently not closely enough. Please note the following change in a question within the article:

Q: “Which website should I shut down? Is there anything I should consider first?”

A: Yes, it is very important that you choose to keep the website that has the most backlinks and has been online the longest. The reason I say this is that Google tends to favour entrenched websites; they have been around a while, are well backlinked and overall appear to have a positive history.

Whatever your decision is, it is vital you understand that switching a website to a new domain is a dangerous step. This is because of Google’s famed ‘sandbox’. The ‘sandbox’ is really only an overused turn of phrase for a portion of the Google algorithm that treats the age of a domain as a signifier of trust. Generally, new websites will require six months to a year before substantial rankings are evident; this is a kind of rite of passage that Google appears to be enforcing on the average website. Sites that are obviously popular and quickly gain a load of legitimate link popularity will easily avoid the sandbox (because Google cannot afford to miss a ‘great’ website), but this is not the common scenario.

Sorry for the confusion! I hope the rest of that sentence made the mistake irrelevant, but I didn’t want to take the chance.

THANK YOU FOR YOUR QUESTIONS:
Well, I guess we were right in thinking this was a hot topic! I have received a deluge of questions that I will respond to as soon as possible. Instead of answering them all one-on-one, I will try to find some common issues or questions that I can address in a quick follow-up article, which would be great for both of us.

Cheers, Ross Dunn


One Response to “Correction for: The Most Common Reason for Dropped Rankings”

  1. Anonymous

    Duplicate content is part of the web. It serves the community when content is duplicated and placed in various places. The search engines are the ones that have to rank sites based on all the factors they deem important. So, for that reason, duplicate content is not a “bad” thing for a commercial enterprise, just a non-helpful thing. If I duplicate my entire website and, except for domain-specific internal links, there is no tie (no link) between the two sites, then the rankings for site #2 will be determined by how much weight the other factors carry.

    All other things being equal, if site #2 is better optimized than site #1 in everything except duplicate content, then it should and probably will rank better.

    As for using replicated content from other sources: if those sources allow it, then it is a good thing to provide that content. I don’t believe for a second that Google is penalizing a site for having a copy of an article from another website. That would be detrimental to the nature of the web and unfair, as Google cannot possibly know whether that content is there with permission.

    This is why Google says that they factor in 100 or so things to determine rank. They have to, because no single factor, or even ten, is enough by itself to do it accurately. You can speculate all you want about duplicate content and the value of links, but the truth is that no one really knows for sure.

    If I have ten websites on the net with all the same content, then I don’t expect all ten of them to be in the first ten results. I hope that one or two of them will make it. I notice, however, that my sites with the same content will all rank differently for different terms. How do you explain that? I explain it this way: each of those sites has gained its own identity simply by being on the net. Therefore each one has a different set of keywords attached to it, assigned by the search engines, and each has a rank within each of those keywords. You are most likely correct in your assessment that in cases of duplicate content the site with the most age ranks higher. So what? Neither site is penalized; they are ranked just where they should be.

    One of my competitors went live on the web at almost the same time I did. We use the same shopping cart and the same structure. Most of our content is duplicated or very close. We trade spots in the top three results for every important keyword. A few other competitors that are much younger have achieved top-five rank and occasionally rank above us in the top three. This is because they have optimized out the ass in other factors. We still have seniority in our category, however, which is what keeps us up there, I believe.

    I do not advocate that commercial sites looking for rank should delete their duplicate websites. These are like stores across the landscape. While they are probably no help for rankings, they are definitely “open for business” and people will find them and use them. The only way I can see there being a penalty is if Google can draw a definite line between the sites and then determines that a website exists for the sole purpose of manipulating Google’s (or others’) rankings. Failing that, there is no reason to penalize, just to categorize and rank accordingly.

