This SEO answer is a follow-up to a common question I received from my recent article "The Most Common Reason for Dropped Rankings: Duplication".

To be as clear as possible with my answer I am going to break it down into two sections: one for those who syndicate their own content and the other for those who publish syndicated content.

Please keep in mind, however, that duplicate content is not an exact science or anything close to it. As I wrote this article, I often imagined exceptions where a penalty would or would not take place. That said, the following answers are based on what my experience dictates to be the most common scenarios.

Tuesday, February 6th, 2007

A Few Questions and Answers

Online forums are always a great place to find bits and pieces of information. Below are variations of questions I have found posted in some popular SEO forums, along with my answers.

1 – Q. The last Google update saw a number of my site's pages find their way into the supplemental index. Most of these pages have a near-duplicate version for "print this page" options. How do I fix this?

A. The answer is relatively straightforward, but it will most likely take some time to see any of these pages removed from the supplemental index. The best bet is to first block these printer friendly versions from the search engines. You really don't want these pages ranking well in the first place: they will likely have much if not all of your main site navigation removed, so having a visitor land on one of them would not be of much use anyway. There are a number of ways to block search engine spiders from viewing and indexing a page. Here are two of the most commonly used:


Robots Meta tag
Using the robots meta tag is very simple. To prevent search engine spiders from indexing a given page, add the following meta tag to the start of your <head> section:

<meta name="robots" content="noindex, nofollow">

Robots.txt file

You can also use the robots.txt file to block spiders. If your printer friendly pages are in a specific folder, you can use this code in your robots.txt file to block Googlebot:

User-Agent: Googlebot
Disallow: /printer-folder/

I also recommend adding the rel="nofollow" attribute to all links that point to the printer friendly versions. This tells the spiders to ignore the link, which will not only help prevent the printer friendly page from being indexed but will also slightly reduce the PageRank leak. Even if you do use this method, I still highly recommend using one of the two blocking methods above to ensure that these pages do not become indexed.
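
For illustration, a "print this page" link with the nofollow attribute applied might look like this (the file path is just a placeholder for this example):

<a href="/printer-folder/page-print.html" rel="nofollow">Print this page</a>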

Ultimately, assuming that the original HTML version of these pages has substantial original content, you will hopefully start to see the supplemental status stripped away. Blocking the spiders should also help prevent new printer friendly pages from causing you more grief in the future.

Taking these steps may help, but nothing is guaranteed. The reason pages become supplemental is essentially that you have other pages on your site that are better suited for the related rankings. If you have two pages about a specific topic, and page A is highly targeted while page B is only loosely targeted, then you stand the chance of page B becoming supplemental. Add original content, and work on increasing links to this page, to help with the supplemental issue.

2 – Q. Does Google use WHOIS to help eliminate spam from those webmasters with dozens or even hundreds of sites?

A. Google certainly has the ability to read through WHOIS records and flag multiple sites with the same owner. While it has yet to be proven that Google uses WHOIS data to connect spam websites, it is certainly within the realm of possibility, and if they do not use it today, they likely will in the future.

It is also known that a site's age can help in terms of rankings. Where does Google get this age? It could be either the day the site was first indexed or the registration date in the WHOIS data. The longer a site has been online, the better its chances of ranking successfully, assuming of course that a number of other factors such as links, relevancy, etc., all hold true.

We have seen examples where registering a new domain for no less than two years can (sometimes) help reduce the time spent in the "sandbox", as it signals to Google that the site is less likely to be spam. Keep in mind, of course, that a two-year-plus registration is not enough on its own.

3 – Q. How do I get my site indexed by Google, Yahoo, and MSN? Should I regularly submit my URL?

A. While the answer to this question is fairly simple, it is surprising how many do not have a clear answer. I see this and similar questions in the forums quite often and thought it was pertinent to mention it here.

First things first: do NOT regularly submit your site to the engines. When it comes down to it, this is something you will never need to do (nor should you ever pay anyone to do it for you). There is only one instance where a submission to the major engines is okay, and that is after the official launch of a brand new site on a brand new domain.

Before you submit your site, check to make sure you are not already indexed. You may be surprised how quickly the major engines can find you. If you are not indexed, then one free site submission is fine. After you have made this submission, forget the option even exists, as you will never need to do it again.

A one-time submission will typically get the site indexed in time, but the best way to have your site not only indexed but also ranked is to work on your incoming links and to consider creating and submitting an XML sitemap. Google, Yahoo and MSN are good at finding sites, and they will index you on their own even if you only have a few inbound links.
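
If you decide to create an XML sitemap, a bare-bones file follows the format below (the URL and date are placeholders; in practice you would list one <url> entry for each page you want the engines to know about):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2007-02-06</lastmod>
  </url>
</urlset>

Upload the finished file to the root of your site and submit it through each engine's sitemap submission tools.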

Hello all, I wanted to clear up a significant issue with my recent article "The Most Common Reason for Dropped Rankings: Duplication". It was edited closely, but apparently not quite closely enough. Please note the following change in a question within the article:

QUESTION: My client originally promoted a single .co.uk domain that he owned. Recently he purchased a .com and pointed that domain to his current website. Since this change we have noticed his “pages from the UK” content has been dropped from Google UK but the .com is performing well on Google.com under the client’s target keywords. What is going on? – G.S.V.

ANSWER: I cannot give a sure answer as to why this has happened without more information. First things first, the .com website will get attention from Google.com simply because non-regional TLDs are favoured at Google.com. Also, the fact that your client's site got excellent rankings there is a testament to the quality optimization of the site (even if you do not want those rankings); so kudos to you if you were the one who optimized it.

Understanding why the .co.uk dropped in the UK regional rankings is the tougher question. Here are some things to check on:

  1. Was the .com 301 redirected to the .co.uk? A 301 redirect effectively tells the search engines to pay attention to the destination domain (.co.uk) rather than the domain the spider originally entered at (.com). If you were to enable a 301 redirect now you might save yourself a lot of confusion and potential pain in the future, since this technique undeniably states which domain represents the flagship website and will limit duplicate content penalties (a sample redirect rule is sketched just after this list).
  2. Did the .com have a prior history? Perhaps, before it was bought, the .com had a significant number of backlinks or a history that outweighed the .co.uk domain. I expect that when Google is presented with two domains pointing at the same content it will choose to rank the domain with the most positive history; that is, of course, if no other directives have been stated (i.e. 301s). One way to see if the domain had a history before it was bought is to use the Wayback Machine and check whether a prior site existed. Next, do a backlink check on the domain to see if any links came with the 'new' domain.
  3. Is the website hosted in the USA or the UK? If the .co.uk and the .com are both hosted on an American server then achieving a ranking on Google.com will be significantly easier than on Google.co.uk, and vice versa. In other words, host in the UK and use a .co.uk domain if you want to be sure of regional UK rankings.
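
To expand on point 1, here is roughly what such a 301 redirect might look like in an .htaccess file, assuming an Apache server with mod_rewrite available; example.com and example.co.uk are placeholders for the client's actual domains:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.co.uk/$1 [R=301,L]

The condition only matches visits arriving on the .com, so the .co.uk continues to serve normally while any spider or visitor hitting the .com is permanently redirected to the equivalent .co.uk page.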

At the moment these are the most prominent possibilities that come to mind, but there are likely more. The fact is, if all else fails and everything appears normal, I find that issues like oddly missing rankings tend to fix themselves over time. I hope your outcome is extremely positive and I do hope you keep me up to date.

If anyone else has experienced this issue or has some educated feedback please post a comment within this posting on The SEO Blog.

PS. Here is a great forum thread at Search Engine Watch discussing Google.com vs. Google.co.uk rankings.

QUESTION: How do I edit my website description on Google? Please direct me to the correct place. – Barb C.

ANSWER: There are three ways your website description might have been created by Google and, fortunately, each method has a solution, which I have outlined in the full article.

QUESTION: When a high PR page within a 3rd party website links to a page within my website where is the benefit placed… on my home page or my page that was linked to? – Jose U.

ANSWER: Both the home page and the linked page benefit from the link… but to different degrees. The majority of the weight is applied to the linked page because it is the page that effectively earned the vote of confidence, but the link also counts positively towards the integrity and credibility of your whole website, which in essence is represented by your home page.

Please note, this answer is totally dependent upon the quality of the backlink you received. For example, links from websites that are unrelated or have poor credibility will offer little or no benefit. For more information on what constitutes a ‘good’ backlink see my answer to a recent question from a reader: “What exactly are good backlinks?”.

Three SEO questions are answered in this Q&A article:

  1. Do search engines ignore stop words in domain names?
  2. I created a duplicate website to target my services to a different state. My intention is not to dupe Google but I don’t know what else to do. How would you approach this?
  3. What can I do to increase the number of backlinks I am getting from articles?

Question 1) "I know search engines ignore stop words in meta tags and title tags. Do search engines ignore stop or common words in domain names? For example, www.therealestate.com or www.arealestate.com" – Corey M.

Answer: Yes; to the degree that keywords in a domain name have any effect at all, the more common stop words would be ignored. It is, however, important to keep in mind that search engines only place a limited amount of weight on keywords in a domain name. In my opinion, the only time a keyword within a domain name wins a ranking war is when all other elements are equal between you and a near-ranked competitor.

Additional Info: I like to do my due diligence before answering any question because, frankly, I need to be sure the rules have not changed overnight. Here are a couple of links to pages that relate to this question:

Question 2) "Hi Ross – I just finished reading your blog post 'SEO Answers #7: What Determines Duplicate Content SPAM?' and I have a question for you. Allow me to give you a little background: one of my clients has expanded their business into another state with a different name, but it is ultimately the same business. In order to develop a Web presence for this new, duplicate company in a new area, we created a second website that has its own unique design, etc., but is ultimately a take-off on the original site, using the same content with just minor differences to allow for the new name, geographic area, etc. The original site is optimized and of course contains the original content. The duplicate site is not optimized. In no way is our intention to 'dupe' anything, but will this cause problems with the engines? We are not trying to get mileage off of the content by duplicating it… this is simply a second company that offers the same services, just in a different state. Each site is on its own domain and has its own URL. Your input? Your suggestions for a different solution? Thanks for your time."

Answer: Simply put, if you are truly not looking to get any mileage out of the content then you do not want rankings for it, which implies that the site should be blocked from the search engines. I would recommend using your robots.txt file to block the spiders from that website entirely so you do not negatively affect the rankings of the original site. The fact is that duplicate content, good intentions or not, is frowned upon by the search engines, and you are gambling by leaving the site available for spidering.
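
For reference, blocking every spider from the entire duplicate site is as simple as placing a robots.txt file in that site's root directory containing these two lines:

User-agent: *
Disallow: /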

If, however, you do want the search engines to spider the content then you must rewrite it to avoid duplicate content penalties.

As a final note I would like to pose a question: did your client absolutely have to create a secondary website? In many cases I find that a client need not have created a second site; they just needed to add a new section to their existing site to serve the new target marketplace. The other option, which may have been appropriate in this case, is to add a subdomain, giving you the benefit of a secondary home page on a pre-branded domain and a fresh, marketable URL; nearly the same benefits as having a secondary domain without the headache of marketing an entirely new website. Even in this case, however, you would not be able to use duplicate content, so you would face the same issues: either rewrite the content or block it from the spiders.

Question 3) "Ross, in an attempt to improve my ranking in the search engines, I have been writing articles for article directories in the hope of receiving quality backlinks. Recently I came across a site, mypagerank.net, which I used to check my link popularity. The result indicated that I only had 18 backlinks. What can I do to increase the number of backlinks I am getting? I would have expected more, as I have written many more articles and submitted my URL to many directories. Thanks, I enjoy reading your articles" – Peter

Answer: First, good work making the effort to write articles; they are an excellent medium for promotion and I commend you for dedicating the time to writing. I certainly understand the significant commitment of time and research required to write usable content. Fortunately, there are a few techniques that may help you squeeze some extra benefit from your hard work:

  1. Be Clear – Request Credit
    Are you being very clear with those republishing your content that you expect a linked credit for the copy? Simply stating that you allow syndication but require credit, laid out in a particular linked format, will do wonders. At StepForth we clearly request credit and we occasionally trawl Google looking for those who have republished our content without it; it is usually a simple matter to have the content removed or the appropriate link added. Hence, if you see this article somewhere without credit and a link to StepForth.com, please drop me a note: ross@stepforth.com :-)
  2. Pick a Powerful, Timely Topic
    Have you noticed a topic coming up regularly in forums? Perhaps a question that appears to be asked regularly? This is usually a good indication that an article discussing the topic would do well. Remember that many of the syndication networks online are looking for topics that will get readers and ultimately provide impressions for their advertisers. As a result, picking a hot topic will make a world of difference in how widely your article will get picked up.
  3. Optimize the Title
    The title of your article needs to clearly relay the topic and should engage readers and editors alike. The title can make or break a story if it is too vague or boring.
  4. Refer to Your Own Content
    It helps to provide inline links from your article to relevant previous articles or pages on your website. With practice, and once you have built up a healthy reservoir of linkable articles, it will become second nature to work in links that are integral to the article, so that editors can see the backlinks are relevant and play a legitimate role in its purpose. Legitimacy is crucial for making the 'cut', because editors are more likely to remove a link than keep it if it appears merely promotional.
  5. Give Praise Where it is Due
    Within an article don’t be afraid to link to other sources where you have noted particularly good information, particularly other small business blogs. The fact is that some bloggers take a real shine to those who syndicate or give credit to their content and may just link back to you in thanks.
  6. Put RSS On Your Side
    If you have not already done so, ensure that your articles are also syndicated on your website in RSS format. This can be accomplished easily by using a blog to publish your articles, because most blog systems include automated feed creation. Many of the article syndication networks use articles solely through RSS, so as soon as you have an RSS feed you should go out and tell the world about it. Try searching Google for "submit feed" or "add blog" and you are certain to find some great sites to submit your syndication feed to.
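
For illustration only, here is the bare skeleton of an RSS 2.0 feed; every URL and title below is a placeholder, and in practice a blog system will generate the feed for you automatically:

<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Example Article Feed</title>
    <link>http://www.example.com/articles/</link>
    <description>The latest articles from example.com</description>
    <item>
      <title>An Article Title</title>
      <link>http://www.example.com/articles/an-article-title.html</link>
      <description>A one or two sentence summary of the article.</description>
    </item>
  </channel>
</rss>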

There is a lot more information on this topic so I will provide some links that should help you further:

After all this work is done, keep in mind that not every proper (credited) use of your article will deliver the benefit of a backlink. The search engines are frankly too smart to give credit for every article link, because of the obvious duplication and the unfortunate proliferation of article scraping sites (sites that republish articles to try and make themselves appear authoritative). That said, the links acquired from truly authoritative sites will pay off as backlinks and, hopefully, with the more important benefit: direct traffic.

As a final note, remember that articles are meant for human consumption, so be sure to proofread your work and ensure that the topic is timely or original enough to be useful. I am not saying this applies to you; however, it is important that the quality of the content be high enough to merit widespread syndication.

by Ross Dunn – CEO, StepForth Search Engine Placement Inc.
Permalink to this Article: SEO Questions #9

QUESTION: I have just started my own design company and although I am very well trained in both design and programming, having earned two associate degrees in this field, not one professor ever said anything about making websites search engine friendly. I recently designed a website for my sister and I cannot even get her site to show up in any search engine. I have several keywords at the top, including a description as well. One problem may be that the index page is sort of a splash page, except it is just a handler that detects whether or not the user has Flash installed and whether or not they have the bandwidth to view the Flash page accurately. It then redirects them to a new page based on the feedback. Therefore, there is no real content on the index page. Another possible problem is that she is mentioned on hundreds of other websites. Do you have any suggestions for me? Any advice would help. — Laura P.

QUESTION: We're a very small company with an 11 year website history, and web development resources somewhere between quite minuscule and non-existent. Nonetheless, SEO has been a keen focus of awareness since before it was called that, and up until that infamous "Florida" update 3 or so years ago, we did very well in the SERPs. Over the years a number of people have worked on the code comprising our site, and while there is nothing egregiously or obviously wrong with our content, no one knows whether we are now being penalized for something 'lurking' in our code that may be left over from yesteryear and was never found and rooted out. The biggest worry and source of disagreement seems to involve "duplicate content".

Two excellent questions are answered in this article:

A.) Which steps should I follow when optimizing my dynamic website?

B.) Can you maintain the integrity of my design while implementing SEO?

QUESTION A) Which steps should I follow when optimizing my dynamic website? – Chriz R.