Tuesday, February 6th, 2007

A Few Questions and Answers

 

Online forums are a great place to find bits and pieces of information. Below are variations of questions I have found posted in some popular SEO forums, along with my answers.

1 – Q. The last Google update saw a number of my site pages find their way into the supplemental index. Most of these pages have a near-duplicate version for “print this page” options. How do I fix this?

A. The answer is relatively straightforward, but it will most likely take some time to see any of these pages removed from the supplemental index. The best bet is to first block these printer friendly versions from the search engines. You really don’t want these pages ranking well in the first place, as they will likely have much if not all of your main site navigation removed, so having a visitor land on one of them would not be of much use anyway. There are a number of ways to block search engine spiders from viewing and indexing a page. Here are two of the most commonly used:


Robots Meta tag
Using the robots meta tag is very simple. To prevent search engine spiders from indexing a given page, add the following meta tag inside your <head> section:

<meta name="robots" content="noindex, nofollow">
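For context, here is where the tag sits in a page. This is a sketch of a hypothetical printer friendly page; the title and file contents are placeholders:

```html
<!DOCTYPE html>
<html>
<head>
  <title>Widget Guide - Printer Friendly</title>
  <!-- Tells compliant spiders not to index this page or follow its links -->
  <meta name="robots" content="noindex, nofollow">
</head>
<body>
  <!-- stripped-down, print-ready version of the page content -->
</body>
</html>
```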

Robots.txt file

You can also use the robots.txt file to block spiders. If your printer friendly pages are in a specific folder, you can use this code in your robots.txt file to block googlebot.

User-agent: Googlebot
Disallow: /printer-folder/
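If you want these pages hidden from all compliant crawlers rather than just Googlebot, a wildcard user agent covers Yahoo and MSN as well. This sketch assumes the same hypothetical /printer-folder/ path:

```text
User-agent: *
Disallow: /printer-folder/
```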

I also recommend adding the rel="nofollow" attribute to all links that point to the printer friendly versions. This tells the spiders to ignore the link, which not only helps prevent the printer friendly page from being indexed, it also slightly reduces PageRank leak. Even if you do use this method, I still highly recommend using one of the two blocking methods above to ensure that these pages do not become indexed.
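As a sketch, a "print this page" link with the attribute added would look like this (the URL is a hypothetical example):

```html
<!-- The nofollow hint asks engines not to crawl or count this link -->
<a href="/printer-folder/widget-guide.html" rel="nofollow">Print this page</a>
```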

Ultimately, assuming that the original HTML version of these pages has substantial original content, you will hopefully start to see the supplemental status stripped away. Blocking the spiders should also help prevent future new printer friendly pages from causing you more grief.

Taking these steps may help, but nothing is guaranteed. Pages essentially become supplemental because you have other pages on your site that are better suited for the related rankings. If you have two pages about a specific topic, and page A is highly targeted while page B is only loosely targeted, then you stand the chance of page B becoming supplemental. Add original content, and work on increasing links to the page, to help with the supplemental issue.

2 – Q. Does Google use WHOIS to help eliminate spam from those webmasters with dozens, or even hundreds, of sites?

A. Google certainly has the ability to read through WHOIS records and flag multiple sites with the same owner. While it has not been proven that Google uses WHOIS data to connect spam websites, it is certainly within the realm of possibility, and if they do not use it today, they likely will in the future.

It is also known that a site's age can help in terms of rankings. Where does Google get this age? It could be either the day the site was first indexed or the registration date in the WHOIS data. The longer a site has been online, the better its chances of ranking successfully, at least assuming a number of other factors such as links, relevancy, etc., all hold true.

We have seen examples where registering a new domain for two years or more can (sometimes) help reduce the time spent in the “sandbox”, as it signals to Google that the site is less likely to be spam. Keep in mind, of course, that a two-year-plus registration is not enough on its own merit.

3 – Q. How do I get my site indexed by Google, Yahoo, and MSN? Should I regularly submit my URL?

A. While the answer to this question is fairly simple, it is surprising how many do not have a clear answer. I see this and similar questions in the forums quite often and thought it was pertinent to mention it here.

First things first: do NOT regularly submit your site to the engines. When it comes down to it, this is something you will never need to do (nor should you ever pay anyone to do it for you). There is only one instance where a submission to the major engines is okay, and that is after the official launch of a brand new site on a brand new domain.

Before you submit your site, check to make sure you are not already indexed. You may be surprised how quickly the major engines can find you. If you are not indexed, then one free site submission is fine. After you have made this submission, forget the option even exists, as you will never need it again.

Submission will typically get the site indexed in time, but the best way to have your site not only indexed but also ranked is to work on your incoming links, and also consider creating and submitting an XML sitemap. Google, Yahoo, and MSN are good at finding sites, and they will index you on their own even if you only have a few inbound links.
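A minimal XML sitemap is short. The sketch below follows the sitemaps.org protocol and lists two hypothetical URLs; the domain, paths, and date are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2007-02-06</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/widget-guide.html</loc>
  </url>
</urlset>
```

Save this as sitemap.xml in your site root and submit it through Google's webmaster tools; Yahoo and MSN have also announced support for the sitemaps.org format.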


