One of the questions readers and clients most frequently email StepForth Placement’s SEO staff revolves around how websites can best be optimized to meet the algorithmic needs of each of the four major search engines: Google, Yahoo, MSN and Ask.

The more things change, the more they stay the same. Though there have been sweeping changes in the organic search engine landscape over the past six months, the fundamental ways search engines operate remain the same.

This question, or variants on it, reflects a notion shared among some webmasters that SEO-driven placements at one search engine might come at the expense of high rankings across the other search engines. As the thinking goes, the techniques used to make a well-optimized website rank well at Google might somehow prevent that same site from achieving high rankings at Yahoo, MSN and/or Ask. Alternately, webmasters and advertisers who already have great placements at Google but not at the others appear wary of sacrificing their Google rankings in pursuit of higher placements on Yahoo, MSN or Ask.

The differences between how each engine works appear to be causing a bit of confusion among webmasters and search marketers, especially regarding how to optimize well for all four at the same time.

Techniques that work on one engine might not work as well on another. In some extreme cases, techniques that work brilliantly with old-school engines like MSN and Ask, and even with the invigorated Yahoo, are akin to a kiss of death on Google.

There is one search engine friendly site design and optimization philosophy that works almost every time: good content, smart networking, and persistence over time. A well-constructed website, or one that has been treated by a good search engine optimizer, should be able to rank well on all major search engines, provided that site has useful, relevant information to express.

Questions about ranking well on all four engines bring up some of the basic differences between the major search engines and, in light of so much change in the sector over the past few months, a look at what search engines examine, and how they do it, seems in order.

There are a lot of differences between the major search engines but, by and large, they all gather information the same way. Each major search engine uses its own spider agent, known as Googlebot (Google), Slurp (Yahoo/Inktomi), Teoma (Ask.com) and MSNbot (an updated list is maintained at Wikipedia), which finds information by following links from document to document across the web. Spiders are designed to revisit sites on a semi-regular basis as well, though they often hit the index (or home) page more often than other pages. Spiders do tend to dig deeper, looking for changes to internal documents, based on changes to the index (or home) page. This allows the engines to maintain rapidly updating versions of the web, or parts of the web, in separate proprietary databases.
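For illustration only, here is a minimal sketch of that link-following behaviour, written in Python. It is a toy under stated assumptions, not a picture of how Googlebot, Slurp, Teoma or MSNbot actually operate: real spiders add robots.txt handling, politeness delays and revisit scheduling, among much else.

```python
# A toy link-following spider: fetch a page, store it, queue the
# links it contains, repeat. Real search spiders are far more complex.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collects href values from anchor tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed_url, max_pages=10):
    """Breadth-first crawl from seed_url into a tiny local 'database'."""
    seen, queue, database = {seed_url}, deque([seed_url]), {}
    while queue and len(database) < max_pages:
        url = queue.popleft()
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "ignore")
        except Exception:
            continue  # skip documents the spider cannot reach
        database[url] = html  # the engine's stored copy of the document
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)  # resolve relative links
            if absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return database
```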

Each search database has its own characteristics and, most importantly, each engine has its own algorithms for sorting and ranking web documents.

Getting information into those databases is the first stage of SEO. The site needs to be constructed (or reconstructed) in such a way as to allow search spiders to easily read and absorb the information and content it contains.

Assuming realistic expectations and goal setting are already part of the equation, the success or failure of any multi-engine optimization campaign depends as much on the type of site being marketed as on the methods and techniques used to market it. If the ultimate goal is strong search engine placements across all major search engines, a few compromises in style might be a temporary necessity in order to expose the great content and reap the rewards of multiple rankings.

Before building or rebuilding a site, having a working knowledge of the major on-site and off-site elements each search engine looks at when examining and evaluating a site and its contents is a key starting point.

There are two overarching areas all search engines examine when ranking a web document or site, known as “on-page” and “off-page”. As their names indicate, search engines examine factors and elements that occur on the document or site in question, as well as factors and elements occurring on other documents and sites related by links or by topical theme.

While the search algorithms of each engine might differ in the number of factors found on or off the page and in the overall importance of those factors, they all examine generally similar sets of data when deciding which documents should rank where in relation to whatever search queries are entered.

For example, Google loves links, as do Yahoo, MSN and, to a lesser degree, Ask. MSN and Ask are considered to be old-school search engines, allowing simpler SEO techniques to work quite well, as they still do with Yahoo.

On-page factors are generally found in one of four areas: titles, tags, text and structure. Off-page elements tend to involve links, locality, search-user behaviours and the performance of competing sites.

Here is a thumbnail breakdown of the most important factors each search engine considers, roughly laid out in order of importance.

Google: Incoming Links, On-page SEO, Site Design and Spiderability, User Analytics, Outgoing Links, Inclusion in other Google indexes, Document Histories

Yahoo: On-page SEO, Links and Link Patterns, Site Design, User analytics, Inclusion in other Yahoo indexes, Document Footprints

MSN: On-page SEO, Site Design and Structure, and Spiderability

Ask: On-page SEO, Site Design, Site Structure and Spiderability

Because Google drives approximately 50% of all organic search traffic, SEOs, webmasters, and search advertisers tend to be most concerned with Google placements. When planning a search optimization campaign, whether for a new site or in the redevelopment of an existing site, building around Google’s needs is obviously the most logical path. It is also a smart way to find your way into the other search engines. Though each of the rival engines wants to present the best possible results, Google’s algorithms account for quality scoring to a deeper degree than the others do. In other words, if your site meets Google’s various tests, it will likely meet those of the other engines.

Google puts an enormous weight on its evaluation of the network of links leading to and out from every web document in its index. Most, if not all, documents found in Google’s index got there because Google’s spider, Googlebot, found them by following an inbound link. Because its ranking algorithm is so heavily link dependent, Google is frequently forced to tinker with how it evaluates links, a process that generates a score known as PageRank. The basic wisdom on links says that incoming links from topically relevant sites are beneficial while those placed solely in order to get a better ranking at Google are not. Google also examines links on a document or site that are directed towards other sites, in order to gauge whether a webmaster is trying to game it by participating in link-networking schemes. To one degree or another, the three other major search engines do this as well, though MSN and Ask are not known for using link analysis as a weighty measure of site or document relevancy. Yahoo most certainly does. Link analysis is used to determine the seriousness and credibility of a web document by comparing it with other documents it is associated with.
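For readers curious about the mechanics, the PageRank calculation described in Google’s original published paper can be sketched in a few lines. The version below is a simplified illustration of that public formula only; the link analysis Google runs in production is undisclosed and far more involved.

```python
# A simplified sketch of the published PageRank formula:
#   PR(p) = (1 - d)/N + d * sum(PR(q) / outlinks(q)) for each q linking to p
# where d is a damping factor and N the number of pages.
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links out to."""
    pages = list(links)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}
    for _ in range(iterations):
        new_rank = {page: (1.0 - damping) / n for page in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue  # dangling pages pass on no score in this sketch
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                if target in new_rank:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Page "a" receives two inbound links, so it earns the highest score.
print(pagerank({"a": ["b", "c"], "b": ["a"], "c": ["a"]}))
```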

Once a document exists in a search engine database, several on-page factors are examined. The engines tend to examine several elements of any particular document and the sites they are associated with including title, meta tags (in some cases), body text and other content, and internal site structure.

The key to providing search spiders with a strong on-page experience lies in presenting them with a well-designed, topically focused site. Again, remembering the four basic on-page areas (titles, tags, text and site structure), creating documents that are friendly to all four search engines is not terribly difficult.
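As a rough illustration of what titles, tags and text look like to a spider, the sketch below parses those elements out of a document using Python’s standard library. The sample markup is our own invention for demonstration purposes; no engine publishes a checklist like this.

```python
# Pull the title, meta description and headings out of a document,
# i.e. the on-page elements discussed above. Illustrative only.
from html.parser import HTMLParser

class OnPageAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self.headings = []
        self._in_title = self._in_heading = False
    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag in ("h1", "h2", "h3"):
            self._in_heading = True
            self.headings.append("")
        elif tag == "meta":
            attributes = dict(attrs)
            if (attributes.get("name") or "").lower() == "description":
                self.description = attributes.get("content") or ""
    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False
        elif tag in ("h1", "h2", "h3"):
            self._in_heading = False
    def handle_data(self, data):
        if self._in_title:
            self.title += data
        elif self._in_heading:
            self.headings[-1] += data

audit = OnPageAudit()
audit.feed("<html><head><title>Blue Widgets | Acme Co.</title>"
           "<meta name='description' content='Hand-made blue widgets.'>"
           "</head><body><h1>Blue Widgets</h1></body></html>")
print(audit.title, "|", audit.description, "|", audit.headings)
```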

There are a few easy tips that should be kept in mind, though. New websites should always introduce themselves to the search engines with very focused content expressed on a very basic site structure. Adding content over time is a much better way to feed search spiders than giving them a site that is already full of information. Search engines, especially Yahoo and Google, appreciate fresh content and can be “invited” back to a site again and again when new material is added.

Webmasters with pre-existing websites enjoying great rankings in one place but seeing sub-standard rankings in others should take a step back and re-evaluate the overall theme presented by the documents that make up their sites. In a technically perfect world, the most relevant and topical documents would reach the top of the rankings. As the search engines really are striving for a measure of technical perfection, ensuring your documents are tightly and topically focused is essential.

For those who have lost position at Google but not at the other search engines recently, check your link networks for undesirable connections. Good placement at MSN, Ask and Yahoo but sub-standard placement at Google is almost always a signal that some links going to or coming from your site have raised questions at Google. You should also check the content your site carries to be sure it is, as much as possible, original and not simply a copy of content found on other sites.
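One quick, home-grown way to gauge originality is to compare word “shingles” (short runs of consecutive words) between two documents. The sketch below is a rough approximation for your own auditing; the shingle size is an arbitrary choice, and the duplicate filters the engines actually run are unpublished and far more sophisticated.

```python
# Estimate how much of one document duplicates another by comparing
# their sets of four-word shingles. A high score suggests copied text.
def shingles(text, size=4):
    words = text.lower().split()
    return {" ".join(words[i:i + size]) for i in range(len(words) - size + 1)}

def similarity(doc_a, doc_b):
    """Jaccard similarity of the two documents' shingle sets (0 to 1)."""
    a, b = shingles(doc_a), shingles(doc_b)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

original = "we write weekly news and views about the search engine industry"
suspect = "we write weekly news and views covering the search engine business"
print(round(similarity(original, suspect), 2))  # near 1.0 means a near-copy
```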

In the end, the best practices tend to win with the major search engines. A good website or document should be able to place well across all four engines at the same time, provided the webmaster or SEO specialist takes time to follow SEO best practices.