Google Panda 4.1 Launches; to avoid liability, Google limits German news content to headlines. Plus, Ross and John offer commentary on how Keywords Are Dead, and a Moz story discusses Google Answer Boxes Now Showing Up 60 Percent More Often.

…read more


This September 4th SEO 101 podcast is available now: listen or download (right click and save). You can also find all past shows on iTunes. Tune in to SEO 101 LIVE every Thursday at 1pm PST / 4pm EST on www.WebmasterRadio.FM

According to several studies from Kissmetrics and LinkedIn, the average person will abandon your website if it takes more than 3 seconds to load on their mobile device. The same studies note that Google recommends load times of one second or less on mobile devices.
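As a rough sanity check against that 3-second threshold, here is a minimal Python sketch (assuming the third-party requests library; the URL and user-agent string are placeholders) that times a page's HTML fetch. Keep in mind this measures only the raw HTML transfer; a full page load in a real mobile browser, with scripts and images, will take longer.

```python
import time
import requests  # pip install requests

# Placeholder mobile user-agent string, for illustration only.
MOBILE_UA = "Mozilla/5.0 (Linux; Android 4.4.2; Nexus 5) AppleWebKit/537.36"

def measure_html_fetch(url: str) -> float:
    """Time a full GET of the page HTML (network plus transfer only)."""
    start = time.time()
    response = requests.get(url, headers={"User-Agent": MOBILE_UA}, timeout=10)
    response.raise_for_status()
    return time.time() - start

elapsed = measure_html_fetch("https://www.example.com/")  # placeholder URL
if elapsed > 3.0:
    print(f"Warning: {elapsed:.2f}s exceeds the 3-second abandonment threshold")
else:
    print(f"HTML fetched in {elapsed:.2f}s; rendering will add more time")
```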

Why might your website load more slowly than other websites?

Why might a small site with less information load more slowly than a larger site with more?

To answer these questions, you must understand the critical rendering path. Google defines the critical rendering path as the “code and resources required to render the initial view of a web page”. A web page is rendered when it can be seen by a user. The rendering path of your content can be optimized, and if you haven’t done so yet, it is something you needed to do yesterday. Read more…
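To make the idea concrete, here is a minimal sketch (assuming the requests and BeautifulSoup libraries, with a placeholder URL) that flags the classic critical-rendering-path blockers: synchronous external scripts and stylesheets in the <head>, which the browser must fetch before it can paint anything.

```python
import requests  # pip install requests
from bs4 import BeautifulSoup  # pip install beautifulsoup4

def find_render_blockers(url: str) -> list[str]:
    """List <head> resources that delay first paint: external scripts
    without async/defer, and stylesheets (which block by default)."""
    html = requests.get(url, timeout=10).text
    head = BeautifulSoup(html, "html.parser").head
    blockers = []
    if head is None:
        return blockers
    for script in head.find_all("script", src=True):
        if not (script.has_attr("async") or script.has_attr("defer")):
            blockers.append(f"blocking script: {script['src']}")
    for link in head.find_all("link", rel="stylesheet"):
        # Stylesheets for all/screen media block rendering unconditionally.
        if not link.has_attr("media") or link["media"] in ("all", "screen"):
            blockers.append(f"blocking stylesheet: {link.get('href')}")
    return blockers

for item in find_render_blockers("https://www.example.com/"):  # placeholder URL
    print(item)
```

Moving such scripts to async or defer, and inlining the small amount of CSS needed for the first paint, are the usual first steps in shortening the path.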

Preface: only follow this advice if you are in 1999. After all, I suppose it is possible things could turn out entirely differently by 2014; who knows… InfoSeek could come back and rule once more (after Disney screws it up).

The year is 1999. The New Year (and potential Armageddon) is quickly approaching. You just built your first website and are hoping to get some sweet traffic from AltaVista. This new guy on the block, Google, has some potential, but I doubt he’ll make it – just another basement start-up with limited potential. AltaVista will always be king. You want your share of traffic from those 250 million internet users out there – that number is sure to double in the next 20 years, and you need to capitalize on it. In this article I am going to teach you how to get your website to dominate the search rankings in the new millennium. Read more…

  • Ross and John discuss the Final Word on the Death of Google Authorship; SearchMetrics 2013 Ranking Factors – Correlation via Data; John Mueller Says Penguin 3.0 Still to Come, Most Likely in 2014; Google Webmaster Tools API Updated.

…read more


This September 18th SEO 101 podcast is available now: listen or download (right click and save). You can also find all past shows on iTunes. Tune in to SEO 101 LIVE every Thursday at 1pm PST / 4pm EST on www.WebmasterRadio.FM

  • Ross explains how Google Authorship is dead but Author Rank is not. Plus, some research on PNG and JPG compression, the difference between responsive and adaptive website design, negative SEO extortion, and more.

…read more

This September 4th SEO 101 podcast is available now: listen or download (right click and save). You can also find all past shows on iTunes. Tune in to SEO 101 LIVE every Thursday at 1pm PST / 4pm EST on www.WebmasterRadio.FM

When StepForth performs a website audit, we look at many different aspects of a website’s health: onsite factors such as suitable content and site architecture, offsite factors like social media and inbound links, and performance metrics like bounce rate and crawl errors.

The primary goal of an audit is to uncover any underlying issues that may prevent a site from achieving maximum visibility. A website does not need to be performing poorly in order to benefit from an audit; even strong sites are often missing pieces that, once corrected, could improve performance.

Sometimes interesting findings occur accidentally when unique problems overlap on a website. While completing a recent audit, a combination of high bounce rates and soft 404s, viewed alongside proper ecommerce tracking, provided clues that the site might be leaking revenue in a way that would be relatively easy to patch. Read more…
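For readers who want to check their own site, a soft 404 is easy to detect: request a URL that cannot possibly exist and see whether the server honestly returns a 404 status. A minimal sketch, assuming the requests library and a placeholder domain:

```python
import uuid
import requests  # pip install requests

def is_soft_404(base_url: str) -> bool:
    """Request a page that cannot exist. A properly configured server
    returns 404; a soft 404 returns 200 with an error page body."""
    bogus = f"{base_url.rstrip('/')}/{uuid.uuid4().hex}"
    response = requests.get(bogus, timeout=10, allow_redirects=True)
    return response.status_code == 200

if is_soft_404("https://www.example.com"):  # placeholder domain
    print("Possible soft 404: nonexistent pages return HTTP 200")
```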

Last week I wrote an article for The SEM Post on the hazards of zombie sub-domains and how your very own site could be affected without you being any the wiser. I used Houzz.com as an example: it is the 177th most-trafficked website in the USA, yet it suffers from zombie sub-domains; it can truly happen to anyone.

What follows is the introduction and a link to the full article:

Google dislikes spending precious resources indexing content on your site that is of no consequence to it or its users. Inconsequential content can take the form of duplicated pages, or of irrelevant, repetitive, or thin text. Worse, it wastes Google’s crawl budget for your site and delays the indexing of legitimate content. In this case, however, the offense is greater than usual, because we are dealing with unlimited duplicate versions of a website triggered by improperly installed DNS wildcards; I none too fondly call them zombie sub-domains.

DNS wildcards are used in a variety of ways, but in this case I am focusing on the desirable redirect from a non-existent or mistyped sub-domain to the appropriate URL (e.g. the root domain).
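Testing for a wildcard is straightforward: resolve a sub-domain that no one would ever have configured, and see whether DNS answers anyway. A minimal sketch using only Python’s standard library (example.com is a placeholder):

```python
import socket
import uuid

def has_wildcard_dns(domain: str) -> bool:
    """Resolve a random sub-domain that cannot have been set up manually.
    If it resolves, a DNS wildcard (*.domain) is almost certainly in place."""
    random_host = f"{uuid.uuid4().hex}.{domain}"
    try:
        socket.gethostbyname(random_host)
        return True  # resolved: wildcard present
    except socket.gaierror:
        return False  # NXDOMAIN: no wildcard

if has_wildcard_dns("example.com"):  # placeholder domain
    print("Wildcard DNS detected; check how stray hosts are handled")
```

If a wildcard is present, the next thing to verify is that each stray host 301-redirects to the canonical URL rather than serving up yet another duplicate copy of the site.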

Please continue reading my article “Why Google Dislikes Zombie Sub-Domains” at its official home on The SEM Post.

Google Webmaster Tools Notifications for Faulty Redirects and the New Color-Coded Fetch as Google; Google Penalties Reappearing in the Manual Actions Section of Webmaster Tools; Google Responds to the Lack of Unannounced Penguin Updates; Low-Quality Guest Blogging Is Considered Little or No Original Content by Google.

…read more


This August 14th SEO 101 podcast is available now: listen or download (right click and save). You can also find all past shows on iTunes. Tune in to SEO 101 LIVE every Thursday at 1pm PST / 4pm EST on www.WebmasterRadio.FM