Today marks my first day back from a short road trip through Oregon and from Danny Sullivan’s inaugural SMX Advanced Expo in Seattle. Considering it was the first event in Danny’s new conference series, I would say it was a blazing success. Danny brought together an impressive gaggle of leading names from the search engines and the search marketing industry, including Matt Cutts, Tim Meyers, Vanessa Fox, Amit Kumar, Todd Friesen, Bruce Clay, Neil Patel, Greg Boser, Christine Churchill, and Jennifer Slegg, just to name a few. Now let me get to the meat of the matter: what can I and what can’t I share with you? Read more…
Blogs 101 is a resource to provide our clients and readers with a clear concept of what a blog is, why a blog might be a positive addition to their website or marketing campaign, and how to implement, optimize and promote a blog. In Part 1 of this series I discussed the basics of a blog and some of the necessary steps to take before starting one. In Part 2, I explained blog feeds and how to optimize a blog. Now in Part 3, I will explain social media marketing and outline a selection of strategies for socially marketing your blog. Read more…
Blogs 101 is a resource to provide our clients and readers with a clear concept of what a blog is, why a blog might be a positive addition to their website or marketing campaign, and how to implement, optimize and promote a blog. In Part 1 of this series I discussed the basics of a blog and some of the necessary steps to take before starting one. In Part 2, I will explain blog feeds and how to optimize a blog. Read more…
StepForth’s Blogs 101 is a resource to provide our clients and readers with a clear concept of what a blog is, why a blog might be a positive addition to their website or marketing campaign, and how to implement, optimize and promote a blog. In Part 1 of this series I will discuss the basics of a blog and some of the necessary steps to take before starting one. Read more…
This morning Search Engine Land released details of a “robots-nocontent” tag that has just been adopted by Yahoo. The tag will allow site owners to block portions of a page from searches. This means that blocked content will still be indexed by Yahoo!’s search engine spider but it will not be among the searchable content at Yahoo.
What would be the purpose of such a robots-nocontent tag? Perhaps you have a few paragraphs of generic content duplicated across several pages of your website and you suspect the duplicate content is hampering rankings. In this case you can now block that specific content from searches and test your assumptions. It remains to be seen how well this tag will work, but it is always good to have more tools in the optimization tool chest.
The new tag is applied as a class attribute with the exact value “robots-nocontent” and can be added to any element on the page. If wider areas need to be blocked, simply wrap the content in a DIV tag carrying the class.
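Based on Yahoo’s description, usage would look something like the following sketch (the page content itself is invented for illustration):

```
<!-- Normal content: indexed and searchable at Yahoo as usual -->
<p>Our Victoria office offers full-service search engine optimization.</p>

<!-- A single element excluded from Yahoo's searchable content -->
<p class="robots-nocontent">Generic legal disclaimer repeated on every page.</p>

<!-- A wider area: wrap several elements in a DIV carrying the class -->
<div class="robots-nocontent">
  <p>Boilerplate shipping policy duplicated site-wide.</p>
  <p>More generic footer copy.</p>
</div>
```

Remember that this only affects Yahoo’s searchable content; the page itself is still crawled and indexed.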
- At this time it is not known if any other search engines will follow suit.
- Yahoo has provided official word of this launch on the Yahoo Blog.
In reference to my article “Mobile Search Site Creation and Optimization – Part 1,” Vance Hedderal, Director of Public Relations at .mobi, explains why he thinks the .mobi extension should be used instead of a mobile subdomain (i.e. yoursite.mobi vs. mobile.yoursite.com). Read more…
Here is an excerpt from Liz’s synopsis:
“If you’re planning to use AJAX on your site, or if your web site already contains AJAX, you’ll need to take some extra steps to protect your natural rankings in major search engines. As long as you follow a few guidelines, you can make AJAX work without any impact on your SEO. But if you don’t follow these guidelines, your search rank can suffer.”
I highly doubt Flash and AJAX will always pose such a barrier to search engines, but for now and the near future you will need to use these technologies carefully to ensure search engines can access the content on your site.
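One commonly recommended guideline of this kind is progressive enhancement (sometimes called “Hijax”): make every piece of AJAX-loaded content reachable through a plain crawlable link, and only intercept that link with JavaScript. A minimal sketch, with the URL and element IDs invented for illustration:

```
<!-- Crawlable fallback: spiders simply follow the href to a real page -->
<a id="reviews-link" href="/reviews.html">Read customer reviews</a>
<div id="reviews"></div>

<script>
// Script-capable browsers load the same content in place instead
document.getElementById('reviews-link').onclick = function () {
  var xhr = new XMLHttpRequest();
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      document.getElementById('reviews').innerHTML = xhr.responseText;
    }
  };
  xhr.open('GET', '/reviews.html', true);
  xhr.send();
  return false; // suppress the normal navigation for script users
};
</script>
```

Because the href points at a real indexable page, search engines see the content even though visitors with JavaScript never leave the page.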
I found this interesting post from “Dr. Pete” at SEOmoz.org discussing his experience rescuing a client’s website from the vastness of Google’s supplemental index. Pete provided a great deal of detail on how he succeeded with this particular client. In this case the client was definitely in a bad state beforehand, with even the most basic SEO strategies not in place. The most basic fixes he implemented consisted of creating unique title and meta description tags, which in my opinion would definitely reduce supplemental results. Read more…
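For readers unfamiliar with that basic fix, it amounts to giving every page its own head elements instead of one value copied from a template. A hypothetical before/after sketch (the site and products are made up):

```
<!-- Before: the same head block duplicated on every page -->
<title>Acme Widgets</title>
<meta name="description" content="Welcome to our website.">

<!-- After: each page describes its own content -->
<title>Blue Widgets - Pricing and Specifications | Acme Widgets</title>
<meta name="description" content="Compare prices and specifications for
Acme's line of blue widgets, including bulk ordering options.">
```

Pages whose titles and descriptions are indistinguishable give Google little reason to treat them as distinct, which is one common route into the supplemental index.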
At the 2007 SES New York, Google’s Shuman Ghosemajumder responds to the (abbreviated) question: “How is it possible for Google to identify click fraud when an aggressor utilizes rotating proxies?” He answers by discussing the Clickbot.A botnet case and how Google deciphered the click fraud in that situation. This video was taken during the “Auditing Paid Listings and Click Fraud Issues” seminar that took place on April 12, 2007.
This video is courtesy of the StepForth SEO Blog. Video taken by Ross Dunn, CEO of StepForth SEO Services. Special thanks to Matt McGowan of Incisive Media for allowing StepForth to record this footage.