Tuesday, February 6th, 2007

A Few Questions and Answers

Online forums are always a great place to find bits and pieces of information. Below are variations of questions I have found posted in some popular SEO forums, along with my answers.

1 – Q. The last Google update saw a number of my site pages find their way into the supplemental index. Most of these pages have a near duplicate version for “print this page” options. How do I fix this?

A. The answer is relatively straightforward, but it will most likely take some time to see any of these pages removed from the supplemental index. The best bet is to first block these printer-friendly versions from the search engines. You really don’t want these pages ranking well in the first place, as they will likely have much if not all of your main site navigation removed, so having a visitor land on one of them would not be of much use anyway. There are a number of ways to block search engine spiders from viewing and indexing a page. Here are two of the most commonly used:


Robots meta tag
Using the robots meta tag is very simple. To prevent search engine spiders from indexing a given page, add the following meta tag within your <head> section:

<meta name="robots" content="noindex, nofollow">

Robots.txt file

You can also use the robots.txt file to block spiders. If your printer-friendly pages are in a specific folder, you can use this code in your robots.txt file to block Googlebot:

User-agent: Googlebot
Disallow: /printer-folder/
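
If you want to keep these pages out of all compliant engines rather than just Google, the wildcard user agent does the same job (the folder name here simply mirrors the example above):

User-agent: *
Disallow: /printer-folder/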

I also recommend adding the rel="nofollow" attribute to all links that point to the printer-friendly versions. This tells the spiders to ignore the link, which will not only help prevent the printer-friendly page from being indexed, it will also slightly reduce the PageRank leak. Even if you do use this method, I still highly recommend using one of the two methods above to ensure that these pages do not become indexed.
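
For example, a “print this page” link with the attribute in place might look like this (the URL is purely illustrative):

<a href="/printer-folder/article-print.html" rel="nofollow">Print this page</a>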

Ultimately, assuming that the original HTML version of these pages has substantial original content, you should start to see the supplemental status stripped away. Blocking the spiders should also help prevent future printer-friendly pages from causing you more grief.

Taking these steps may help, but nothing is guaranteed. Pages often become supplemental because you have other pages on your site that are better suited for related rankings. If you have two pages about a specific topic, and page A is highly targeted while page B is only loosely targeted, then you stand the chance of page B becoming supplemental. Add original content, and work on increasing links to the affected page, to help out with the supplemental issue.

2 – Q. Does Google use WHOIS data to help eliminate spam from those webmasters with dozens, or even hundreds, of sites?

A. Google certainly has the ability to read through WHOIS records and flag multiple sites with the same owner. While it has yet to be proven that Google uses WHOIS data to connect spam websites, this is certainly within the realm of possibility, and if they do not use it today, they will likely use it in the future.

It is also known that a site’s age can help in terms of rankings. Where does Google get this age? It could be either the day the site was first indexed or the registration date in the WHOIS data. The longer a site has been online, the better its chances of ranking successfully, at least assuming a number of other factors such as links, relevancy, etc., hold true.

We have seen examples where registering a new domain for at least two years can (sometimes) help reduce the time spent in the “sandbox”, as it signals to Google that the site is less likely to be spam. Keep in mind, of course, that a two-year-plus registration is not enough on its own merit.

3 – Q. How do I get my site indexed by Google, Yahoo, and MSN? Should I regularly submit my URL?

A. While the answer to this question is fairly simple, it is surprising how many people do not have a clear answer. I see this and similar questions in the forums quite often and thought it was pertinent to address it here.

First things first: do NOT regularly submit your site to the engines. When it comes down to it, this is something you will never need to do (nor should you ever pay anyone to do it for you). There is only one instance where a submission to the major engines is okay, and that is after the official launch of a brand new site on a brand new domain.

Before you submit your site, check to make sure you are not already indexed. You may be surprised how quickly the major engines can find you. If you are not indexed, then one free site submission is fine. After you have made this submission, forget the option even exists, as you will never need to do this again.
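
A quick way to check is the site: search operator, which each of the major engines supports; substitute your own domain for the placeholder below:

site:www.example.com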

A submission will typically get the site indexed in time, but the best way to have your site not only indexed but also ranked is to work on your incoming links, and to consider creating and submitting an XML sitemap. Google, Yahoo and MSN are good at finding sites, and they will index you on their own even if you only have a few inbound links.
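
For reference, here is a minimal sitemap sketch following the sitemaps.org protocol, which Google, Yahoo and Microsoft jointly announced support for in late 2006; the URL and date are placeholders for your own pages:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2007-01-01</lastmod>
  </url>
</urlset>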

It is my pleasure to introduce to you Scott Smith, partner at CopyWriting.Net and copywriting guru. Before you read any further, however, I want to give you a heads-up that this is not a normal article from StepForth. This is an unedited interview written in a very personal style. In other words, this is not the usual condensed knowledge that my staff and I try to put out every week. That said, I really wanted to introduce you to Scott because I feel his copywriting skills are top of the line, and either his services or his tips may help you shore up your bottom line. If you want to get in touch with Scott Smith, he is best contacted by email. Read more…

Hello all, I wanted to clear up a significant issue with my recent article “The Most Common Reason for Dropped Rankings: Duplication”. It was edited closely, but apparently not quite closely enough. Please note the following change to a question within the article: Read more…

Wednesday, January 24th, 2007

Wikipedia Links Useless for SEO

As reported in Search Engine Journal, in an attempt to eliminate spamming of Wikipedia, effective immediately all outbound links from the internet giant will carry the “nofollow” attribute. The “nofollow” attribute was introduced a while back to let webmasters tell the major search engines to ignore a specific link. When Google sees this attribute, the outbound link is passed by as if it were regular text.

What does this mean for site owners? If you have links pointing in from Wikipedia, their value will be lost, at least in terms of helping with your SEO campaigns. Links come and go all the time, but losing a Wikipedia link is a big deal: it is a highly regarded site in the eyes of the search engines, and its credibility with Google means such a link carries significant ranking value. For small sites with few links and good rankings, the loss of a Wikipedia link could have a significant impact.

Internet marketing consultant and blogger Andy Beal is not taking this sitting down and has launched a campaign in an attempt to reduce Wikipedia’s PageRank to zero. He suggests that, to dispute the decision, all webmasters who have links directed at Wikipedia append the “nofollow” attribute themselves, giving Wikipedia a taste of its own medicine. Beal does go on to mention that his site does not have any incoming links from Wikipedia and that this campaign is based entirely on principle.

Wikipedia was made popular by the vast number of incoming links it has gained over the years, and if enough linking webmasters added the “nofollow” attribute, its popularity would certainly drop. Currently Wikipedia’s English home page has more than 1.5 million incoming links noted by Google. It would take an incredible feat to have that popularity decline as a result of “nofollow” attributes, but it is still within the realm of possibility.

QUESTION: My client originally promoted a single .co.uk domain that he owned. Recently he purchased a .com and pointed that domain to his current website. Since this change we have noticed his “pages from the UK” content has been dropped from Google UK but the .com is performing well on Google.com under the client’s target keywords. What is going on? – G.S.V.

ANSWER: I see no sure answer as to why this has happened without more information. First things first, the .com website will get attention from Google.com simply because non-regional TLDs are favoured at Google.com. Also, the fact that your client’s site got excellent rankings is a testament to the quality optimization of the site (even if you do not want these rankings); so kudos to you if you were the one who optimized it.

Understanding why the .co.uk dropped in the UK regional rankings is the tougher question. Here are some things to check:

  1. Was the .com 301 redirected to the .co.uk? A 301 redirect effectively tells the search engines to pay attention to the destination domain (.co.uk) rather than the domain the spider originally entered at (.com). If you were to enable a 301 redirect now, you might save yourself a lot of confusion and potential pain in the future, since this technique explicitly states which domain represents the flagship website and will limit duplicate content penalties (see the sample redirect after this list).
  2. Did the .com have a prior history? Perhaps, before it was bought, the .com had a significant number of backlinks or a history that outweighed the .co.uk domain. I expect that when Google is presented with two domains pointing at the same content, it will choose to rank the domain with the most positive history; that is, of course, if no other directives (i.e. 301s) have been stated. One way to see if the domain had a history before it was bought is to use the Wayback Machine and check whether a prior site existed. Next, do a backlink check on the domain to see if any links came along with the ‘new’ domain.
  3. Is the website hosted in the USA or the UK? If the .co.uk and the .com are both hosted on an American server, then achieving a ranking on Google.com will be significantly easier than on Google.co.uk, and vice versa. In other words, host in the UK and use a .co.uk domain if you want to be sure of regional UK rankings.
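
Regarding point 1, assuming the site runs on Apache with mod_rewrite available, a sketch of a 301 redirect from the .com to the .co.uk in an .htaccess file might look like this (the domain names are placeholders for the client’s actual domains):

RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.co.uk/$1 [R=301,L]

The [R=301] flag issues the permanent redirect, and [NC] makes the host match case-insensitive.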

At the moment these are the most prominent possibilities that come to mind, but there are likely more. If all else fails and everything appears normal, I find that issues like oddly missing rankings tend to fix themselves over time. I hope your outcome is extremely positive, and I do hope you keep me up to date.

If anyone else has experienced this issue or has some educated feedback please post a comment within this posting on The SEO Blog.

P.S. Here is a great forum thread at Search Engine Watch discussing Google.com vs. Google.co.uk rankings.

QUESTION: How do I edit my website description on Google? Please direct me to the correct place. – Barb C.

ANSWER: There are three ways your website description might have been created by Google, and fortunately each method has a solution, which I have outlined below: Read more…

As an SEO I am asked a number of questions covering a broad range of SEO-related topics, and one question in particular comes up quite often. It is a question whose answers, when ignored, could see a once well-ranked website spiral into the depths of the search engine rankings forever.

“I am in the process of redesigning my site, what should I look out for in order to maintain the SEO (and rankings)?” Read more…

Thursday, January 4th, 2007

StepForth’s Predictions for 2007

Another New Year has come and gone, and over the past few weeks search industry professionals have been releasing their search market predictions for 2007. I have steered clear of reading them because it is time for me to write down StepForth’s predictions, and the last thing I want to worry about is duplication. Without further ado, here are the predictions my staff and I put together for 2007. Read more…

Wednesday, December 13th, 2006

Stake Your Claim on the Mobile Web

With the Internet growing so rapidly, do you ever wonder if you are missing a new trend or technology that could boost your bottom line? Well, there just happens to be a piece of the Internet that I bet you haven’t made the leap to yet, and it is going to be BIG. This new space is mobile search and mobile Internet surfing. Read more…

QUESTION: I have just started my own design company and although very well trained in both designing and programming, earning two associate degrees in this field, not one professor ever said anything about making your websites search engine friendly. I recently designed a website for my sister and I cannot even get her site to show up in any search engine. I have several keywords at the top, including a description as well. One problem may be that the index page is sort of a splash page, except it is just a handler that detects whether or not the user has Flash installed and whether or not they have the bandwidth to view the Flash page accurately. It then redirects them to a new page based on the feedback; therefore, there is no real content on the index page. Another possible problem is that she is mentioned on hundreds of other websites. Do you have any suggestions for me? Any advice would help. — Laura P. Read more…