Everyone loves Top10 lists. In the SEO industry, where search engine results form the ultimate Top10 lists for clients and practitioners, the sheer number of ways a website, document or other spiderable object can be designed makes it very difficult to produce a general Top10 list of best practices. There are, however, a number of basic mistakes made by webmasters, site designers and new online entrepreneurs that inadvertently create obstacles to search placement success.

In other words, while it is difficult to say exactly what one should do in any given circumstance, it is fairly simple to say what one shouldn’t do. In the spirit of the Friday before Halloween, here is a list of the Top13 worst things we’ve seen designers doing this year.

1) ISP Hosted Shopping Cart CMS Template Generated Websites

The worst, and I mean absolute worst, commercial websites to work on are built from sub-standard shopping cart CMS generated templates. Often used by micro-businesses new to the Internet, these sites tend to look as if they were created several years ago. In some cases, the templates they use were written in the last century.

These CMS systems rarely produce a quality product to represent your products. The appeal for new Internet businesses is the “business in a box” solution these schemes offer: website design bundled with a shopping cart and credit-card processing. In the end, the business owner doesn’t even get what he or she paid for, as the monthly fees often cost more over time than developing and hosting a professional website would.

A quick caveat… There are some very well-designed shopping cart CMS systems out there whose developers have worked hard to take SEO concerns into consideration. These systems tend to be newer, having been developed and rolled out within the last two years. ApplePie from RoseRock Design is one example.

2) Continued use of duplicate-template CMS systems by competitors in the same industry (especially prevalent in the real estate industry)

Following on the points above, the use of duplicate content templates makes achieving search placements very difficult. While some content, such as names, locations and region-specific information, might vary, the layout is almost identical, as are the all-too-common shared information includes. Facilitating the creation of multiple instances of duplicate content, even if everyone else in the local sector is doing it, is not a particularly good search engine placement technique.

3) Duplicate Content on Multiple Domains

Speaking of duplicate content, there is a second, more damaging type of “dupe-content” that is still practiced out there. Some webmasters purchase multiple domain names, often to protect the brand name on their original one but sometimes in an attempt to manipulate search rankings. The problem is, they populate those domains with duplicate content. Like most of us, search engines rarely enjoy reading the same stuff twice, and they are very unlikely to give good rankings to instances of duplicate content. Nevertheless, dupe-content persists, often due to neglect more than intent.

A few years ago, someone (most likely working in the domain registrar industry) said it would be a good idea to purchase the URL for every possible variation of your corporate name. The logic is sound, as it prevents confusion if another company with a similar name opens somewhere in the world. Lots of companies followed this logic and, needing something to put up on those domains, simply replicated the content found at their original domain. In the past year, we have seen more than one instance where a global-scale corporation replicated duplicate content across dozens of nation-specific TLDs (.ca, .ie, .co.uk, etc.). The correct way to use a number of TLDs is either to produce content that is unique and relevant to visitors from the region each TLD represents, or to use a 301 (permanent) or 302 (temporary) redirect to point search spiders and site visitors to the site containing the original content.
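As an illustration, here is a minimal sketch of the redirect approach, assuming a hypothetical secondary domain whose every request should be sent permanently to a canonical site at www.example.com. In practice this would normally be handled in the web server or hosting control panel rather than in application code.

# Minimal sketch, assuming a hypothetical secondary domain that should send
# every request permanently to a canonical site at www.example.com.
from http.server import BaseHTTPRequestHandler, HTTPServer

CANONICAL = "http://www.example.com"  # hypothetical canonical domain

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # 301 tells spiders and visitors the content has moved permanently;
        # a 302 would signal a temporary move instead.
        self.send_response(301)
        self.send_header("Location", CANONICAL + self.path)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8080), RedirectHandler).serve_forever()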

In previous years, the words used in a domain name had much more influence over organic search engine placements than they do today. That led to the purchase and proliferation of multi-word URLs networked together to blitz the Google algos. Mega-network promotions populated with URLs such as homes-in-walla-walla-washington.com, realestate-agents-walla-walla-washington.com and walla-walla-real-estate-homes.com were spawned and littered the web with duplicate content. Some of that duplicate content remains in use. In one extreme case, we saw what had been a duplicate-content site sold to a new and obviously cyber-naive real estate agent.

4) Leader Pages – Doorway Pages – Customized Ranking Pages

Every search engine uses a slightly different algorithm. Google’s is heavily influenced by incoming links but considers an array of on-site/page elements. Yahoo’s is also influenced by incoming links but weighs a wider array of on-site/page factors. MSN’s is heavily influenced by on-site/page factors. The other search engines have their own unique ranking tendencies. The thinking behind these pages goes as follows: a version of each important page in a site should be designed to achieve high rankings on each search engine, tailored to the unique ranking method that engine uses. On the surface, that thinking makes sense. With a dozen or so mini-sites, a link-density network could be crafted to please Google’s link-dependent algorithm. The method became a primary tool of the early SEO industry, when there were eight different search engines to think about.

Search engines implemented filters to remove leader or doorway pages, a task made easier after the dot-com crash of 2000, when Google rose to become the only major algorithmic search engine. When Yahoo and MSN introduced their own proprietary search engines in 2004, a number of less-than-ethical SEO firms began using leader or doorway pages again, sometimes with disastrous results, as seen last year when two of the largest SEO spam-shops in the United States got entire client lists banned from Google’s database.

5) Link-Network Schemes

The client-list bans mentioned under the previous heading happened for a couple of reasons. The first was duplicate content spread across a number of leader or doorway pages. The second was that these doorway pages had been connected together to form a massive link-network designed to game Google’s PageRank-dependent algorithm.

Link-networks as ranking schemes have sprung up throughout the ten-year history of the SEO sector. The premise and sales pitch are relatively simple. Practically everyone familiar with Google’s ranking formulas understands the importance of incoming links. What most people don’t fully understand is the sophistication that goes into the way Google and the other search engines judge the value and relevance of links. The major search engines need very stringent link-evaluation tools as pieces of their ranking algorithms. Every search engine that uses links to recommend new sites for inclusion in its database stresses the importance of topical relevance between linking documents. In other words, documents that are linked together should have a direct, or at least an indirect, topical relationship.

Both Google and Yahoo consider the content found on documents that link to each other before assigning a value to each link. They also consider the age, intent and context of each linking document. To prevent bogus link-networks from being established, both claim to (and attempt to) consider the entire link-tree surrounding a URL before placing a value on the links found within it or directed to it.
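To illustrate the general idea of topical relevance between linked documents, and only the general idea, here is a toy sketch that scores a link by how much vocabulary the two pages share. This is not how Google or Yahoo actually evaluate links; the page texts and scoring are purely hypothetical.

# Toy sketch of "topical relevance" between linked documents. This is NOT
# how any search engine actually values links; it only illustrates the idea
# that a link from an on-topic page should count for more than one from an
# unrelated page. The page texts below are hypothetical.
from collections import Counter
import math

def topical_similarity(text_a: str, text_b: str) -> float:
    """Cosine similarity between simple word-frequency vectors."""
    a, b = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    dot = sum(a[word] * b[word] for word in a)
    norm = math.sqrt(sum(n * n for n in a.values())) * math.sqrt(sum(n * n for n in b.values()))
    return dot / norm if norm else 0.0

linked_page = "homes for sale in walla walla washington real estate listings"
on_topic_link = "walla walla real estate agents and local homes for sale"
off_topic_link = "discount pharmacy pills and cheap prescriptions online"

print(round(topical_similarity(linked_page, on_topic_link), 2))   # relatively high
print(round(topical_similarity(linked_page, off_topic_link), 2))  # near zero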

6) Hidden Text

Hidden text describes a technique as simple as the name implies. By placing keyword-loaded text where search engine spiders will see it but live visitors won’t, ill-informed webmasters and SEOs try to achieve high placements through higher keyword densities. Hidden text often takes the form of poorly camouflaged place names and service descriptions tagged onto the bottom of documents.

Sometimes hidden text takes the form of white text on a white background. Other times it can be found in comment tags included in the source code. A more sophisticated way to hide text is to place it behind a [div] layer. However one tries to hide it, search engines see through the trick and tend to apply penalties against sites that use it.
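As a purely illustrative sketch, and not a description of any search engine’s actual filters, the snippet below flags one crude variant of the trick: an inline style that declares the same colour for text and background. The HTML fragment and the pattern matching are hypothetical and deliberately simplistic.

# Purely illustrative: flag one crude hidden-text pattern, where an inline
# style declares the same colour for text and background (e.g. white on
# white). Real search-engine filters are far more sophisticated; this only
# catches a colour declared before a matching background in a double-quoted
# style attribute, and does not normalise equivalents like "#fff" vs "white".
import re

SAME_COLOUR = re.compile(
    r'style="[^"]*color:\s*(#?\w+)[^"]*background(?:-color)?:\s*(#?\w+)',
    re.IGNORECASE,
)

def has_same_colour_text(html: str) -> bool:
    """Return True if any inline style sets text and background to the same value."""
    return any(colour.lower() == background.lower()
               for colour, background in SAME_COLOUR.findall(html))

# Hypothetical example of keyword-stuffed hidden text:
snippet = '<p style="color: white; background-color: white">walla walla homes listings</p>'
print(has_same_colour_text(snippet))  # True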

Part Two of “SEO FrightSites :: Top Thirteen Worst Website / Search Issues seen in 2K5” will be published on Monday.