Thursday, June 7th, 2007

Summarized SEO Tips from SMX Advanced – Part 1

 

Today marks my first day back from a short road trip through Oregon and from Danny Sullivan’s inaugural SMX Advanced Expo in Seattle. Considering the conference was the first in Danny’s new series, I would say it was a blazing success. Danny brought together an impressive gaggle of leading names from the search engines and the search marketing industry, including Matt Cutts, Tim Meyers, Vanessa Fox, Amit Kumar, Todd Friesen, Bruce Clay, Neil Patel, Greg Boser, Christine Churchill, and Jennifer Slegg, just to name a few. Now let me get to the meat of the matter… what can I and what can’t I share with you?

What I Can Share
There were two ‘tracks’ within this seminar series: paid and organic. I stuck to the organic track, but even this covered a whole host of topics, so please bear with me as I provide the first of a couple of rundowns on the most interesting facts from the range of seminars and discussions I witnessed:

Feedback from Google, Yahoo, Ask and Microsoft on Duplicate Content

  • Eytan Seidman, a lead program manager at Microsoft Live Search, noted that site-wide ‘penalties’ for duplication are extremely rare; mostly they are limited to obvious scraper sites and other abusers. In fact, Peter Linsley, Sr. Product Manager at Ask.com, compared the downside of duplicate content to being “similar to not being crawled”. In other words, duplicate content is merely invisible to search engines or, in Google’s case, relegated to the supplemental index (which is much the same thing). Just the same, don’t interpret these casual observations as an excuse to create duplicate content; after all, the rules are constantly changing. What is okay today may not be tomorrow.

  • Amit Kumar, an Engineering Manager at Yahoo, noted that Yahoo sees no problem with the same content duplicated in varying formats such as PDF, Word, text, etc.
  • It should come as no surprise to StepForth readers that the overall consensus among the search representatives was that 301s and 302s are the best methods to redirect traffic and prevent duplicate content issues. For example, redirecting http://yourdomain.com to http://www.yourdomain.com is a smart move, and a 301 redirect is the best method. I wrote a technical redirect tutorial on how to implement this and why. As an added note, I can say with relative certainty that employing this redirect has increased the stability and position of rankings for StepForth clientele.

  • An audience member brought up an interesting situation where his resellers/affiliates had, with his approval, taken all of his content and reproduced it on their own sites. He now regretted his decision to allow the republication and wondered how he could reclaim the rights to his original content while still having his site benefit from it, without rewriting it all. Unfortunately, the panel of search engine representatives seemed unanimous that rewriting the content and strictly copyrighting it (so that resellers couldn’t use it) was the best option. My advice for an emerging business that plans to have an affiliate program is to forbid resellers/affiliates from using your live content and instead provide them with a secondary site or set of content to use. This will dramatically minimize future duplicate content issues.
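For reference, the non-www to www redirect described above is usually handled in an Apache .htaccess file. Here is a minimal sketch, assuming mod_rewrite is enabled on your server and using example.com as a stand-in for your own domain:

```apache
# Permanently (301) redirect any non-www request to the www hostname
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [L,R=301]
```

Once in place, a request for http://example.com/page should answer with a 301 pointing at http://www.example.com/page, consolidating any links and rankings onto a single hostname.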

Blogs and Supplemental Results: I spoke with Vanessa Fox, Product Manager at Google, and asked her how tolerant Google is of the inherent duplication within blog platforms (i.e., duplicates in archives, labels/categories, etc.) and whether users really need to block specific sections of their blogs to prevent duplicate content issues. Vanessa simply stated that Google does a very good job of picking the best copy of content within a blog, and that having pages within the supplemental index is nothing to worry about. To quote her approximately: “Google will find the best version of your content, don’t worry about it.” In other words, blocking content within a blog is not necessary because Google will choose the best version to deliver to users.

From my perspective this does not mean that everyone should open the flood gates and unblock sections of their blog willy-nilly. As Neil Patel noted in part 1 of my Blogs 101 article, there is definitely a lot of benefit to providing search engines with only one complete copy of your article to index.
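For those who do want to limit search engines to one copy of each article, the usual tool is a robots.txt file that blocks archive and category views. A minimal sketch, using hypothetical paths (your blog platform’s URL structure will differ, so adjust accordingly):

```
# Illustrative robots.txt: keep only canonical post URLs crawlable
# (the paths below are examples, not universal blog paths)
User-agent: *
Disallow: /2007/      # date-based archive pages
Disallow: /labels/    # category/label pages
Disallow: /search     # internal search results
```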

Other Points of Interest

  • Flash Optimizers, Beware the NoEmbed Tag: I am awaiting further confirmation from Matt Cutts, but it appears that the <noembed> tag (one of the perceived SEO techniques for Flash users) is not a reliable means to attain rankings since it can be so easily manipulated by spammers. This is not unexpected, and I have never recommended relying on the noembed tag; however, I know of some who do use it and have succeeded in rankings – for those people this is a good warning to change tactics sooner rather than later. I will let you know when I receive more details from Matt.
  • Concerned About Personalization in Your Results? If you think you might be seeing results influenced by Google’s personalization algorithm, first ensure you are logged out of Google (see the top right of your Google page for login status). If you are still uncertain of your results, Matt Cutts revealed a parameter to add to any results URL to ensure personalization is ‘turned off’: just append &pws=0 to the end of the URL and you are certain to get a non-personalized result.
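If you check rankings often, appending the parameter by hand gets tedious. Here is a small sketch in Python that tacks pws=0 onto any results URL (the depersonalize function name is my own, not anything Google provides):

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def depersonalize(url):
    """Append pws=0 to a Google results URL to turn personalization off."""
    parts = urlparse(url)
    query = parse_qsl(parts.query)
    query.append(("pws", "0"))  # works whether or not a query string already exists
    return urlunparse(parts._replace(query=urlencode(query)))

print(depersonalize("http://www.google.com/search?q=seo"))
# http://www.google.com/search?q=seo&pws=0
```

Paste the resulting URL into your browser and the rankings you see should match what a logged-out searcher sees.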

What I Can’t Share With You
Seems an odd heading, doesn’t it? Let me explain. There was a segment of the show called “Give It Up!” where a huge panel of search engine marketers revealed some of their favorite search engine optimization and marketing secrets for everyone in the room to use at their discretion. There was one condition to learning these secrets: no one was allowed to blog or print them for 30 days from the date of the meeting. A whole host of great tips and ideas were given out, and unfortunately I won’t be publishing that information until July 6th. Rest assured I won’t miss anything of importance when I do – stay tuned!

There is Much More to Come!
I have much more to share with you. I will publish my next report early next week, hopefully along with some video clips from the show.


by Ross Dunn, CEO, StepForth SEO Services
Celebrating 10 Years of SEO Excellence


2 Responses to “Summarized SEO Tips from SMX Advanced – Part 1”

  1. Ashish Pawaskar

    Hi,

    As per your technical tutorial, I have this in my .htaccess:

    RewriteCond %{HTTP_HOST} !^www\.greatoffers4u\.com [NC]
    RewriteCond %{HTTP_HOST} !^$
    RewriteRule (.*) http://www.greatoffers4u.com/$1 [L,R=301]

    However, here is what happens when visiting a password-protected page: instead of going to http://www.greatoffers4u.com/update it redirects to the 401 error page! Can you advise?

    [offers@web www]$ lynx -dump -head http://greatoffers4u.com/update
    HTTP/1.1 301 Moved Permanently
    Date: Thu, 21 Jun 2007 10:01:22 GMT
    Server: Apache/1.3.37 (Unix) PHP/5.2.1 mod_auth_passthrough/1.8 mod_log_bytes/1.2 mod_bwlimited/1.4 mod_ssl/2.8.28 OpenSSL/0.9.8a
    WWW-Authenticate: Basic realm="Restricted Area"
    Location: http://www.greatoffers4u.com/401.shtml
    Connection: close
    Content-Type: text/html; charset=iso-8859-1

  2. Ross Dunn

    Hi Ashish,

    It looks as though you have done some extensive rewriting of the code I provided at http://www.stepforth.com/faq/non-www-redirect.htm

    What you should be doing is ONLY replacing “example.com” with your domain in the sample from the tutorial. The spacing, the carriage returns… everything else should be the same.

    If you have further problems let me know. That said, the code in the tutorial has been used extensively without error.

    Good luck!
    Ross

