
StepForth’s Blogs 101 is a resource to provide our clients and readers with a clear concept of what a blog is, why a blog might be a positive addition to their website or marketing campaign, and how to implement, optimize and promote a blog. In Part 1 of this series I will discuss the basics of a blog and some of the necessary steps to take before starting one. Read more…

Wednesday, May 9th, 2007

Google and the "www" Issue

It appears that Google no longer treats the www and non-www versions of a website as separate pages, at least where PageRank is concerned.

We know that the PR visible in the toolbar is only a loose indication of a site’s actual PageRank, but it can still be frustrating to see a low PR for the non-www version and a higher PR for the www version. Recently we have noticed that the displayed PR is now the same for both versions.

I have checked a few sites where I know the non-www version was displaying a lower PR, and now all are equal. This is certainly a good thing, and likely a step toward a state where having both versions accessible will not create duplicate content issues.

Until duplicate content is no longer a potential hazard, it is still highly recommended to use a mod_rewrite rule to permanently 301 redirect traffic to one version of your site. This eliminates the duplicate content concerns surrounding the www issue. If you use a Google Webmaster Tools account, we also recommend selecting a preferred domain so the site is displayed the way you want. For more details on how to play it safe and consolidate www and non-www links, here is a tutorial published exclusively at WilsonWeb.com and written by Ross Dunn.
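
For illustration, a minimal sketch of such a rule in an Apache .htaccess file might look like the following; “example.com” is a placeholder and the sketch assumes mod_rewrite is enabled on your server:

    # Hypothetical sketch: permanently (301) redirect non-www requests to the www version.
    # "example.com" is a placeholder; requires Apache with mod_rewrite enabled.
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

The same approach works in reverse if you prefer the non-www version; the key is that every request ends up at a single canonical hostname.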

In virtually all cases, sites that are accessible at both the www and non-www versions are not attempting anything fishy, so it only makes sense for Google to treat both versions as the same URL.

This morning Search Engine Land released details of a “robots-nocontent” tag that has just been adopted by Yahoo. The tag will allow site owners to block portions of a page from searches. This means that blocked content will still be indexed by Yahoo!’s search engine spider but it will not be among the searchable content at Yahoo.

What would be the purpose of such a robots-nocontent tag? Perhaps you have a few paragraphs of generic content duplicated across several pages of your website and you suspect the duplicate content is hampering rankings. In this case you can now block that specific content from searches and test your assumption. It remains to be seen how well this tag will work, but it is always favorable to have more tools in the optimization tool chest.

Implementation
The new tag must be included as a class attribute with the exact value “robots-nocontent” and can be applied to elements anywhere in the page. If wide areas need to be blocked, simply wrap the content in a DIV tag carrying that class.
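
For example, a blocked section might look like the following sketch (the surrounding markup and wording are illustrative only):

    <!-- Hypothetical sketch: the class value must be exactly "robots-nocontent". -->
    <div class="robots-nocontent">
      <p>Generic boilerplate text that should not be counted as searchable content at Yahoo!.</p>
    </div>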

Additional notes:

  • At this time it is not known if any other search engines will follow suit.
  • Yahoo has provided official word of this launch on the Yahoo Blog.

I found this interesting post from “Dr. Pete” at SEOmoz.org discussing his experience rescuing a client’s website from the vastness of Google’s supplemental index. Pete provided a great deal of detail on how he succeeded with this particular client, whose site was definitely in a bad state beforehand: even the most basic SEO strategies were not in place. The most basic fixes he implemented consisted of creating unique title and meta description tags, which in my opinion would definitely help reduce supplemental results. Read more…
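
As a simple illustration of that kind of fix, each page’s head section would carry its own distinct title and description rather than boilerplate shared across the site; the wording below is hypothetical:

    <!-- Hypothetical sketch: a unique title and meta description for one specific page. -->
    <head>
      <title>Blue Widgets: Pricing and Specifications | Example Store</title>
      <meta name="description" content="Compare pricing and specifications for our full line of blue widgets.">
    </head>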

Written by Ross Dunn, CEO, StepForth SEO Services

Credits: (Continued from Part 1)
The following is Part 2 of the coverage of the Search Engine Strategies (SES) New York presentation called “Mobile Search Optimization” by Cindy Krum of Blue Moon Works, Gregory Markel, President of Infuse Creative LLC and Rachel Pasqua, Director of Mobile Marketing at iCrossing. Read more…

Getting Google and the other major search engines to spider your XML Sitemap just got a little easier. Submitting through a search engine’s own interface, such as Google Webmaster Tools, can offer additional valuable information, but if you simply cannot be bothered there is now an easier way.

Vanessa Fox posted in the Webmaster Central Blog a few recent announcements regarding sitemaps.org, including a point on XML sitemaps. Read more…
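
Assuming the easier method referenced is the newly announced robots.txt autodiscovery, it amounts to adding a single line to your robots.txt file pointing at the sitemap; the URL below is a placeholder:

    # Hypothetical robots.txt sketch: the Sitemap line tells supporting engines
    # where to find your XML Sitemap without any manual submission.
    Sitemap: http://www.example.com/sitemap.xml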

The following is coverage of the Search Engine Strategies (SES) New York presentation called “Mobile Search Optimization” by Cindy Krum of Blue Moon Works, Gregory Markel, President of Infuse Creative LLC and Rachel Pasqua, Director of Mobile Marketing at iCrossing.

This presentation provided a fascinating glimpse into the young realm of mobile site creation, compliance and optimization. I have a lot of information to work with here, so to make this article more digestible I have broken it into two parts: the first covers site creation and the second covers site optimization. Read more…

Okay, so I said it all in the title. Perhaps you need not even read this article as you may start off not believing it.

Today I read an article written by someone unknown in the SEO industry (at least unknown to me). It made a number of points about improving search rankings and offered a bunch of tips for improving a site’s overall standing. Sounds like a useful article, and for many who read it, I am sure it was, or at least seemed that way. Read more…

You have probably heard a ton about Social Bookmarking, Social Media Networking, and so on; as with anything on the Internet, there are a myriad of terms for this phenomenon. What it comes down to, though, are votes. Much like a backlink to a website, a social bookmark is a way for the average Joe to share a great online find with the rest of the world. As others share their favorite finds, the bookmarked content has a greater chance of generating more and more interest, and the content that gets the most interest earns prominent visibility that can attract hundreds or even thousands of free backlinks. As a result, making it easy for users to bookmark your content is definitely in your best interest. In my experience the most popular bookmarking websites are Digg, Reddit, and del.icio.us. Read more…
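
One common way to make bookmarking easy is to place “bookmark this” links on each page. A rough sketch follows; the submission URL formats are assumptions and should be checked against each service’s own documentation:

    <!-- Hypothetical sketch: bookmark-this-page links; URL formats are assumptions. -->
    <a href="http://digg.com/submit?url=http://www.example.com/article">Digg this</a>
    <a href="http://reddit.com/submit?url=http://www.example.com/article">Reddit</a>
    <a href="http://del.icio.us/post?url=http://www.example.com/article">del.icio.us</a>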

Transferring traffic and popularity to a new domain is a painstaking process that no one on the web appears to be immune to, or so Topix.net has realized. Topix.net is a leading news aggregation resource that has been in the news lately because they are planning to move their site from Topix.net to Topix.com after purchasing the .com for a cool million from a Canadian animation company.

The Wall Street Journal wrote this article explaining how damaging the seemingly simple switch from .net to .com could be for Topix LLC. The author goes on to explain that such a switch is usually fraught with ranking drops while the major search engines notice and respond to the changeover. That switching addresses causes problems is not news in the SEO world; however, I thought Topix.net’s situation was a great opportunity to review what one might expect when switching domains. Read more…
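
For reference, the search-engine-friendly way to hand off an entire domain is a site-wide permanent redirect. A minimal Apache sketch for the old domain’s configuration, assuming mod_alias is available (Topix’s actual setup is not known to me):

    # Hypothetical sketch: send every request on the old .net domain to the
    # same path on the new .com domain with a permanent (301) redirect.
    Redirect 301 / http://www.topix.com/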