The search engine environment continues to evolve rapidly, easily outpacing the ability of consumers and SEO practitioners to adapt to the new landscape. With Inktomi ascending to a level of importance that until recently was held solely by Google, SEO practitioners need to rethink several strategies and tactics, and perhaps even the ethics of their techniques. Assuming this debate will unfold over the coming months, how does an “ethical SEO firm” optimize websites for two remarkably different search engines without falling back on old-fashioned spammy tactics such as leader pages or portal sites? Recently, another SEO unrelated to StepForth told me he was starting to re-optimize his websites to meet what he thought were Inktomi’s standards, as a way of beating his competition to what looks to be the new main driver of search traffic. That shouldn’t be necessary if you are careful and follow the “best practices” developed over the years.

The answer to our puzzle is less than obvious, but it lies in the typical behaviors of the two search tools. While there are a number of similarities between the two engines, most notably in the behavior of their spiders, there are also significant differences in the way each engine treats websites. For the most part, Google and Inktomi place the greatest weight on radically different site elements when determining eventual site placement. For Google, strong and relevant link popularity is still one of the most important factors in achieving strong placements. For Inktomi, titles, meta tags and text are the most important factors in getting good rankings. Both engines consider the number and arrangement of keywords, incoming links, and the anchor text used in links (though Google puts far more weight on anchor text than Inktomi tends to). That seems to be where the similarities end, and it is the point where SEO tactics need revision. Once Inktomi is adopted as Yahoo’s main listing provider, Google and Inktomi will drive relatively similar levels of search engine traffic. Each will be as important as the other, with the caveat that Inktomi powers two of the big three while Google will only power itself.

2004 – The Year of the Spider-Monkey

The first important factor to consider is how each spider works.

Entry to Inktomi does not mean full-indexing

Getting your site spidered by Inktomi’s bot “Slurp” is essential. Like “Google-bot”, “Slurp” will follow every link it comes across, reading and recording all information. A major difference between Google and Inktomi is that, when Google spiders a new site, there is a good chance of getting placements for an internal page without paying for that specific page to appear in the index. As far as we can tell, that inexpensive rule of thumb does not apply to Inktomi. While it is entirely possible to get entire sites indexed by Inktomi, we have yet to determine whether Inktomi will allow all pages within a site to achieve placements without paying for those pages to appear in the search engine results pages (SERPs).

Remember, Inktomi is a paid-inclusion service which charges webmasters an admission fee based on the number of pages in a site they wish to have spidered. From the information we have gathered, Slurp will follow each link in a site and, if provided a clear path, will spider every page in the site, but pages that are paid for during submission will be spidered far more frequently and will appear in the indexes months before non-paid pages. We noted this when examining how many pages Inktomi lists from newer clients versus older clients: the older the site, the more pages appear in Inktomi’s database and on SERPs of search engines using the Inktomi database. (This assumes the webmaster only paid for inclusion of their index page.) Based on Inktomi’s pricing, an average-sized site of 50 pages could cost up to $1,289 per year to have each page added to the paid-inclusion database, so it is safe to assume that most small-business webmasters won’t want to pay that much.
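For readers who want to run the numbers on their own sites, here is a quick sketch of that arithmetic. The fee schedule below (a flat base fee of $39 plus roughly $25 per URL per year) is simply our illustrative reading of how the quoted figure was reached, not Inktomi’s official rate card, so confirm current pricing before budgeting.

```python
# Rough paid-inclusion budget estimator.
# ASSUMPTION: a $39 base fee plus about $25 per URL per year. These
# figures are illustrative only, not an official Inktomi quote.

def inktomi_inclusion_cost(pages, base_fee=39.0, per_url_fee=25.0):
    """Estimate the annual paid-inclusion cost for a site of `pages` URLs."""
    return base_fee + per_url_fee * pages

if __name__ == "__main__":
    for site_size in (1, 10, 50, 100):
        print(f"{site_size:>4} pages: ~${inktomi_inclusion_cost(site_size):,.0f} per year")
```

Under those assumptions a 50-page site works out to the $1,289 per year quoted above, which is real money for a small business.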

Google’s Gonna Get You

Google-bot is like the Borg in Star Trek. If you exist on the web and have a link coming to your site from another site in Google’s index, Google-bot will find you and assimilate all your information. The best-known and most prolific spiders on the web, Google-bot and its cousin Fresh-bot visit sites extremely frequently. This means that most websites with effective links will get into Google’s database without the site needing to be submitted manually. As Google currently does not have a paid-inclusion model, every page in a site can be expected to appear somewhere on Google-produced SERPs. By providing a way for the spider to find each page in the site (effective internal links), website designers should see their sites appear in Google’s database within two months of publishing.
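One way to check that a spider really can reach every page from the home page is to walk the site’s internal links the same way a crawler would. The short script below is a minimal sketch of that idea; the starting URL is a placeholder, and it ignores robots.txt, redirects and JavaScript-generated links, so treat it as a diagnostic toy rather than a production crawler.

```python
# Minimal internal-link walker: lists every page reachable from the
# home page by following plain <a href> links, roughly the way a
# spider discovers a site. "https://www.example.com/" is a placeholder.

from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, max_pages=200):
    domain = urlparse(start_url).netloc
    seen, queue = set(), [start_url]
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", "ignore")
        except OSError:
            continue  # unreachable page: a spider would give up here too
        parser = LinkCollector()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href).split("#")[0]
            if urlparse(absolute).netloc == domain:
                queue.append(absolute)
    return seen

if __name__ == "__main__":
    for page in sorted(crawl("https://www.example.com/")):
        print(page)
```

Any page that never shows up in the output has no crawlable path from the home page and is a strong candidate for a link from a text-based sitemap.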

We Now Serve Two Masters: Google and Inktomi

OK, that said, how do we optimize for both without risking placements at one over the other? The basic answer is to give each of them what they want. For almost a year, much of the SEO industry focused on linking strategies in order to please Google’s PageRank. Such heavy reliance on linking is likely one of the reasons Google re-ordered its algorithm in November. Relevant incoming links are still extremely important but can no longer be considered the “clincher” strategy for our clients. Getting back to the basics of site optimization and remembering the lessons learned over the past 12 months should produce Top 10 placements. SEOs and webmasters should spend a lot of time thinking about titles, tags and text, as well as about linking strategies (both internal and external). Keyword arrangement and densities are back on the table and need to be examined by SEOs and their clients as the new backbone of effective site optimization.

While the addition of a text-based sitemap has always been considered an SEO best practice, it should now be considered an essential practice. The same goes for unique titles and tags on each page of a site. Another essential practice SEOs will have to start harping on is to only work with sites that have unique, original content. I am willing to bet that within 12 months Inktomi introduces a rule against duplicate content as a means of controlling both the SEO industry and the affiliate marketing industry. Sites with duplicate content are either mirrors, portals or affiliates, none of which should be necessary for the hard-working SEO. While there are exceptional circumstances where duplicate content is needed, more often than not dupe-content is a waste of bandwidth and will impede an SEO campaign more than it will help.
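As a practical check on the “titles, tags and text” basics, the sketch below scans a folder of saved HTML files, flags duplicate titles and meta descriptions, and reports a rough keyword density for a chosen phrase. The folder name and target phrase are placeholders, and any density threshold you apply is a matter of judgment rather than a published engine rule.

```python
# Quick on-page audit: duplicate titles/descriptions and rough keyword
# density. The folder "site_html/" and the phrase used in main are
# placeholders for illustration.

import pathlib
import re
from collections import defaultdict

TAG_RE = re.compile(r"<[^>]+>")
TITLE_RE = re.compile(r"<title>(.*?)</title>", re.I | re.S)
DESC_RE = re.compile(
    r'<meta[^>]+name=["\']description["\'][^>]+content=["\'](.*?)["\']', re.I | re.S
)

def keyword_density(text, phrase):
    """Very rough percentage of body words accounted for by the phrase."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = len(re.findall(re.escape(phrase.lower()), " ".join(words)))
    return 100.0 * hits * len(phrase.split()) / len(words)

def audit(folder, phrase):
    titles, descriptions = defaultdict(list), defaultdict(list)
    for path in pathlib.Path(folder).glob("*.html"):
        html = path.read_text(encoding="utf-8", errors="ignore")
        t = TITLE_RE.search(html)
        d = DESC_RE.search(html)
        title = t.group(1).strip() if t else "(missing title)"
        desc = d.group(1).strip() if d else "(missing description)"
        titles[title].append(path.name)
        descriptions[desc].append(path.name)
        body = TAG_RE.sub(" ", html)
        print(f"{path.name}: density of '{phrase}' ~ {keyword_density(body, phrase):.1f}%")
    for label, table in (("title", titles), ("description", descriptions)):
        for value, pages in table.items():
            if len(pages) > 1:
                print(f"Duplicate {label} {value!r} on: {', '.join(pages)}")

if __name__ == "__main__":
    audit("site_html", "search engine optimization")
```

Running a check like this before a campaign launches makes it much easier to prove to a client that every page carries its own title, description and focus phrase.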

The last tip for this article is: don’t be afraid to pass higher costs on to clients, because if your client wants those placements soon, paid inclusion of internal pages will be expected. When one really examines the costs of paid inclusion, they are not terribly different from other advertising costs, with one major exception: most paid advertising is regionally based (or is prohibitively expensive for smaller businesses). Search engine advertising is, by nature, international exposure, and that is worth paying for.

Wednesday, January 21st, 2004

Google Email – Google Going Portal

Google Going Portal? That’s how it looks from the outside, as Google announced it is considering a Google Email feature along the lines of Yahoo, MSN and Lycos. Google has taken the concept a step further, however, and will likely use the new offering as a delivery vehicle for paid advertisements. This move might disappoint long-term users who have become accustomed to the clean interface that, for many, defines Google. Given the battle between Google, MSN and Yahoo, however, it should come as no surprise that Google is looking to ensure brand loyalty from its users.

Wednesday, January 21st, 2004

MSN/Yahoo Shift to Inktomi

For a short time last week, both MSN and Yahoo were displaying results drawn from Inktomi. MSN continues to use results directly from the Inktomi database, but it appears that Yahoo has reverted to results from Google for the time being. Last Thursday (Jan. 15), MSN dropped results from LookSmart and went pure Inktomi. Yahoo, on the other hand, has announced that by the end of March it too will have switched completely from Google to Inktomi-generated results. In the meantime, Yahoo seems to be experimenting with results from Inktomi by bleeding them in at different times and in different locations.

Wednesday, January 21st, 2004

Smaller (2nd-Tier) PPC Search Tools

Everyone knows about PPC advertising on Google and Overture. But what about the smaller second-tier engines like FindWhat, Kanoodle, GoClick, or any of the other dozens out there? Is it worth your time and money to bother with any of these small fish?

It is obvious the holidays are long over. While it is only the beginning of the third week of 2004, the ongoing battle between Google and Yahoo has heated up and is the most interesting subject in the search industry. Rumours about Google and Yahoo abound in the tech sections of newspapers, in IT newsletters (the better ones anyway), in daily articles and in discussion forums. Behind those rumours stand literally tens of millions of hard-working people desperate to know which direction the industry will go over the next twelve months. Nobody wants another Christmas surprise like the one delivered by Google in 2003 and, given the sudden perception of volatility in the industry and the overall economy, nobody wants to make poor bets with their limited marketing budgets. Since gathering as much information as possible allows advertisers, consumers and small businesses to make relatively informed decisions, it is in everyone’s best interest to share what we know. The search engine world went through monumental changes last year and looks as if it will go through even more this year. With 2004 being labeled the “year of search”, a quick look at some of the anticipated changes is in order.

Thursday, January 15th, 2004

Improving Your Image on Google

Yesterday’s Toronto Star ran a story about the perils of being “Googled” before a romantic date. If there is any embarrassing information about you online and your real name is attached to it, chances are it can be found fairly easily in Google. As most people don’t realize, information you put on the Internet, or that is put on the Internet about you, can stick around for years after it is posted. You will likely have forgotten about it, but it is still there waiting for someone to find it… The same phenomenon can affect job interviews, media relationships and your child’s perception of you as a responsible role model. We at StepForth sympathize with people in this position, as our names are all over the Internet and each of us gets “Googled” fairly frequently in our interactions with potential clients and contacts.

The duplicated article was pulled from SEOinc.com; however, Google remembers:

The Google Cached Copy – Live Version

Now, since this cache is not likely to stick around either, here is a downloadable version of the Google cache in PDF format – Archive Version

2003 was a watershed year in the search engine industry. Not only was it the year of mergers and acquisitions, it was also the year that the media, business and financial sectors really took notice. It was a very busy year for the SEO sector as well, perhaps marking the maturity of the optimization industry. With so much action and so many changes, one almost required a scorecard to keep up. By the end of 2003 the search world looked remarkably different than it did at the beginning. Some firms were big winners while others were huge losers. Here’s our list to kick off the new year.

Link building is about to become a lot more important, and a lot more difficult, if the current reigning theory about Google’s new algorithm is correct. Earlier this week, the CEO of India-based SEORANK, Atul Gupta, published a brilliant analysis of Google’s new algorithm. We are experimenting with some of his findings, but for the most part we think Gupta’s analysis is very accurate. To make a long analysis short, the article states that Google is now using two distinct algorithms to measure the relevancy of incoming links to a website. Gupta theorizes that Google is just now introducing an algorithm known as Hilltop, which appears to be blended with the Florida algorithm. The new algorithm measures incoming links in a very different way than the older PageRank algorithm did.
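To make the idea of “two algorithms measuring one signal” concrete, here is a toy sketch in the spirit of Gupta’s reading: one component rewards raw link popularity (PageRank-style) and another rewards links from pages that are themselves on-topic for the query (Hilltop-style “expert” links), with the final link score a blend of the two. Every name, weight and number below is invented for illustration; this is not Google’s actual formula.

```python
# Toy illustration of blending two link signals: raw popularity of the
# linking page and its topical relevance to the query. All weights and
# scoring here are assumptions made for illustration only.

def link_score(inbound_links, query_terms, popularity_weight=0.4, topic_weight=0.6):
    """Each inbound link is a dict with keys: "page_rank" (0..1),
    "linking_page_topics" (set of terms), and "anchor_text" (str)."""
    score = 0.0
    for link in inbound_links:
        popularity = link["page_rank"]
        # Hilltop-style idea: a link counts for more when the linking
        # page covers the same topic as the query.
        overlap = len(query_terms & link["linking_page_topics"])
        topical = overlap / max(len(query_terms), 1)
        # Anchor text containing a query term adds a small bonus.
        anchor_bonus = 0.1 if any(t in link["anchor_text"].lower() for t in query_terms) else 0.0
        score += popularity_weight * popularity + topic_weight * topical + anchor_bonus
    return score

if __name__ == "__main__":
    query = {"victoria", "florist"}
    links = [
        {"page_rank": 0.9, "linking_page_topics": {"web", "directory"}, "anchor_text": "click here"},
        {"page_rank": 0.3, "linking_page_topics": {"victoria", "florist", "flowers"}, "anchor_text": "Victoria florist"},
    ]
    print(f"Blended link score: {link_score(links, query):.2f}")
```

The point of the toy is simply that a low-PageRank link from a topically relevant page can now outweigh a high-PageRank link from an off-topic one, which is exactly why indiscriminate link building gets harder under this theory.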

Wednesday, January 7th, 2004

End of Free Listings?

With Yahoo!’s pending switch from Google (free listings) to Inktomi (paid inclusion), website owners with high Google rankings will see the number of visits to their sites drop dramatically, as Yahoo! drives about 30–35% of all search traffic. Inktomi is also the primary supplier of results for MSN, the third largest search tool. We are predicting that the bulk of search traffic will come from the Inktomi database starting sometime around March or April. This likely spells the demise of free inclusion, as Inktomi’s popularity will increase and Google will need to plug a sudden and likely massive revenue hole. While Google has traditionally spurned paid inclusion, Yahoo!’s adoption of Inktomi results and the pressures stemming from going public through the anticipated IPO might move Google’s management towards a paid-inclusion model.
