So you have just opened the doors to your new online business. Your website is still too new to generate any traffic from the search engines. So while you are waiting for your site to be spidered and indexed, what can you do to start driving customers through the doors? Read more…

The search engine marketing environment is undergoing enormous changes as the industry and the Internet mature. The changes we’ve seen over the past six months are only the tip of a much larger iceberg as new technologies and innovations on established ideas radically alter how we look for information and how that information is delivered to us by search engines. For marketers and advertisers in the traditional world of print and broadcast media, change happens slowly, generally with enough warning to allow for long-term decisions and campaign planning. Read more…

With all these changes surrounding Yahoo and the new Overture Site Match system, where can you get the best bang for your buck as a small business?

First, a quick glance at the new Site Match system. It is based on a combination of cost-per-click and paid inclusion. For $49 you can submit the first page to Site Match; the cost is reduced for subsequent pages. Read more…

In the battle for search engine placement across all organic listings, that golden spot of number one is top priority, and for good reason. That may not hold true in the realm of pay per click advertising. What many people may not know is that the number one spot, although it typically grabs the highest click-throughs, may not always be the most cost-effective. Here’s why. Read more…

Thursday, February 19th, 2004

Changes in submission rules at Yahoo

We received this email this morning. Still looking into it to see what costs will be…

Dear Priority Submit customer.

This is an important notification in relation to the following services:

1. Inktomi Search Submit

2. AltaVista Express Submit

3. Fast PartnerSite

4. Summary

1. Inktomi Search Submit

Yahoo! Search has transitioned to its own search technology and is preparing to launch a new inclusion program. Read more…

The balance of organic and CPC advertising is entirely subjective, depending on the needs and goals of the advertiser. The following discusses the two most common scenarios for determining a balance of online marketing. Read more…

Thursday, February 19th, 2004

Yahoo meta tag and title rules

Yahoo has introduced its own search engine! Here are a few rules to follow to keep Yahoo happy… Read more…

Knowing where your domain name has been is very important when constructing a search engine placement campaign. Most people are pretty sure they know what they have done with their domain name, but what most don’t consider is that they may not be the first owners of that name. Often webmasters find themselves locked out of major search engines for no apparent reason. Read more…

Wednesday, February 4th, 2004

Searchers using more complex terms

Search engine users are starting to use more complex search terms, often adding two or three extra words to the traditional 2-keyword phrase. There are multiple reasons for the increasing complexity of search terms, including the continued rapid growth of the Internet and difficulties finding relevant information on the first page of many search engine results pages. Another reason could be that recent changes at Google have frustrated Google users, thus forcing them to be more descriptive in their queries. Webmasters building sites and SEOs working on sites should consider targeting 3- to 5-keyword phrases as well as 2-keyword phrases. When preparing the copy for your website, think about the various phrases searchers might use to find your site and find a way to integrate these phrases into the body text of the site’s index page and important internal pages. Chances are, your site will see more visitors once it starts catering to searchers who use more than 2 words at a time.
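To make that concrete, here is a minimal Python sketch that checks whether the longer phrases you are targeting actually appear in a page’s visible copy. The phrase list and the file name are placeholders for illustration only; substitute the terms your own searchers actually use.

    # Check whether a page's body copy contains the longer keyword phrases
    # you want to target. The phrases and the page path are placeholders
    # for illustration; substitute your own.
    import re

    TARGET_PHRASES = [
        "victoria bed and breakfast ocean view",   # hypothetical 5-word phrase
        "cheap whale watching tours victoria",     # hypothetical 4-word phrase
    ]

    def phrase_counts(html_path, phrases):
        html = open(html_path, encoding="utf-8", errors="ignore").read()
        text = re.sub(r"<[^>]+>", " ", html).lower()   # strip tags, keep visible text
        return {phrase: text.count(phrase.lower()) for phrase in phrases}

    if __name__ == "__main__":
        for phrase, count in phrase_counts("index.html", TARGET_PHRASES).items():
            print(f"{count:2d} x {phrase}")

A count of zero for a phrase you care about is a sign that the copy on that page needs another pass.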

The search engine environment continues to evolve rapidly, easily outpacing the ability of consumers and SEO practitioners to quickly adapt to the new landscape. With the ascension of Inktomi to the level of importance that until recently was held solely by Google, SEO practitioners need to rethink several strategies, tactics, and perhaps even the ethics of technique. Assuming this debate will unfold over the coming months, how does an “ethical SEO firm” work to optimize websites for two remarkably different search engines without falling back on old-fashioned spammy tactics such as leader pages or portal sites? Recently, another SEO unrelated to StepForth told me that he was starting to re-optimize his websites to meet what he thought were Inktomi’s standards as a way of beating his competition to what looks to be the new main driver. That shouldn’t be necessary if you are careful and follow all the “best practices” developed over the years.

The answer to our puzzle is less than obvious, but it lies in the typical behaviors of the two search tools. While there are a number of similarities between the two engines, most notably in the behaviors of their spiders, there are also significant differences in the way each engine treats websites. For the most part, Google and Inktomi place the greatest weight on radically different site elements when determining eventual site placement. For Google, strong and relevant link popularity is still one of the most important factors in achieving strong placements. For Inktomi, titles, meta tags and text are the most important factors in getting good rankings. Both engines consider the number and arrangement of keywords, incoming links, and the anchor text used in links (though Google puts far more weight on anchor text than Inktomi tends to). That seems to be where the similarities end, and the point where SEO tactics need revision. Once Inktomi is adopted as Yahoo’s main listing provider, both Google and Inktomi will drive relatively similar levels of search engine traffic. Each will be as important as the other, with the caveat that Inktomi powers two of the big three while Google will only power itself.

2004 – The Year of the Spider-Monkey

The first important factor to think about is how each spider works.

Entry to Inktomi does not mean full-indexing

Getting your site spidered by Inktomi’s bot “Slurp” is essential. Like “Google-bot”, “Slurp” will follow every link it comes across, reading and recording all information. A major difference between Google and Inktomi is that when Google spiders a new site, there is a good chance of getting placements for an internal page without paying for that specific page to appear in the index. As far as we can tell, that inexpensive rule of thumb does not apply to Inktomi. While it is entirely possible to get entire sites indexed by Inktomi, we have yet to determine whether Inktomi will allow all pages within a site to achieve placements without paying for those pages to appear in the search engine results pages (SERPs). Remember, Inktomi is a paid-inclusion service which charges webmasters an admission fee based on the number of pages in a site they wish to have spidered. From the information we have gathered, Slurp will follow each link in a site and, if provided a clear path, will spider every page in the site, but pages that are paid for during the submission will be spidered far more frequently and will appear in the indexes months before non-paid pages. We noted this when examining how many pages Inktomi lists from newer clients versus older clients: the older the site, the more pages appear in Inktomi’s database and on SERPs of search engines using the Inktomi database (assuming the webmaster only paid for inclusion of their index page). Based on Inktomi’s pricing, an average-sized site of 50 pages could cost up to $1289 per year to have each page added to the paid-inclusion database, so it is safe to assume that most small-business webmasters won’t want to pay that much.
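As a rough illustration of how those inclusion fees add up, here is a small Python sketch that estimates the annual cost for a site of a given size. The per-page rates are assumptions made for the example (a higher fee for the first URL, a lower fee for each additional URL), not Inktomi’s published price list, though they land in the same ballpark as the $1289 figure above.

    # Rough estimate of annual paid-inclusion cost for an N-page site.
    # The rates below are assumptions for illustration only, not Inktomi's actual fees.
    FIRST_PAGE_FEE = 39.00       # assumed annual fee for the first URL submitted
    ADDITIONAL_PAGE_FEE = 25.00  # assumed annual fee for each additional URL

    def annual_inclusion_cost(page_count: int) -> float:
        """Return the estimated yearly cost of paid inclusion for a site."""
        if page_count <= 0:
            return 0.0
        return FIRST_PAGE_FEE + (page_count - 1) * ADDITIONAL_PAGE_FEE

    if __name__ == "__main__":
        for pages in (1, 10, 50):
            print(f"{pages:3d} pages: ${annual_inclusion_cost(pages):,.2f} per year")

Whatever the exact rates turn out to be, the cost scales with page count, which is why paying to include only the index page is the common small-business choice.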

Google’s Gonna Get You

Google-bot is like the Borg in Star Trek. If you exist on the web and have a link coming to your site from another site in Google’s index, Google-bot will find you and assimilate all your information. As the best known and most prolific spider on the web, Google-bot and its cousin Fresh-bot visit sites extremely frequently. This means that most websites with effective links will get into Google’s database without needing to manually submit the site. As Google currently does not have a paid-inclusion model, every page in a site can be expected to appear somewhere on Google-produced SERPs. By providing a way of finding each page in the site (effective internal links), website designers should see their sites appearing in Google’s database within two months of publishing.
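If you want to confirm that a spider can actually reach every page through your internal links, a quick crawl of your own site will show you which pages are discoverable. The following Python sketch follows same-domain links from the home page; the start URL is a placeholder, and a real spider’s behavior will of course differ.

    # Minimal internal-link crawler: starting from the home page, follow links
    # within the same domain and report which pages are reachable. Pages that
    # never show up here are pages a spider like Google-bot may never find.
    # The start URL is a placeholder; substitute your own site.
    from html.parser import HTMLParser
    from urllib.parse import urljoin, urlparse
    from urllib.request import urlopen

    class LinkExtractor(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    def crawl(start_url, max_pages=200):
        domain = urlparse(start_url).netloc
        seen, queue = set(), [start_url]
        while queue and len(seen) < max_pages:
            url = queue.pop(0)
            if url in seen:
                continue
            seen.add(url)
            try:
                html = urlopen(url, timeout=10).read().decode("utf-8", errors="ignore")
            except Exception:
                continue
            parser = LinkExtractor()
            parser.feed(html)
            for href in parser.links:
                absolute = urljoin(url, href).split("#")[0]
                if urlparse(absolute).netloc == domain:
                    queue.append(absolute)
        return seen

    if __name__ == "__main__":
        for page in sorted(crawl("http://www.example.com/")):
            print(page)

Any page that exists on the server but never appears in this list is only reachable by guessing its address, and a spider will not guess.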

We Now Serve Two Masters: Google and Inktomi

OK, that said, how do we optimize for both without risking placements at one over the other? The basic answer is to give each of them what they want. For almost a year, much of the SEO industry focused on linking strategies in order to please Google’s PageRank. Such heavy reliance on linking is likely one of the reasons Google re-ordered its algorithm in November. Relevant incoming links are still extremely important but can no longer be considered the “clincher” strategy for our clients. Getting back to the basics of site optimization and remembering the lessons learned over the past 12 months should produce Top 10 placements. SEOs and webmasters should spend a lot of time thinking about titles, tags and text as well as thinking about linking strategies (both internal and external). Keyword arrangement and densities are back on the table and need to be examined by SEOs and their clients as the new backbone of effective site optimization. While the addition of a text-based sitemap has always been considered an SEO best practice, it should now be considered an essential practice. The same goes for unique titles and tags on each page of a site. Another essential practice SEOs will have to start harping on is to only work with sites that have unique, original content. I am willing to bet that within 12 months, Inktomi will introduce a rule against duplicate content as a means of controlling both the SEO industry and the affiliate marketing industry. Sites with duplicate content are either mirrors, portals or affiliates, none of which should be necessary for the hard-working SEO. While there are exceptional circumstances where duplicate content is needed, more often than not dupe-content is a waste of bandwidth and will impede an SEO campaign more than it helps.
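As a quick way to audit the “unique titles and tags on each page” rule, the sketch below compares titles and meta descriptions across a set of local HTML files and flags duplicates. The directory pattern is a placeholder, and the simple regular expressions are for illustration only, not a robust HTML parser.

    # Flag pages that share a <title> or meta description with another page.
    # Assumes the site's pages are available as local .html files; the
    # directory pattern is a placeholder for illustration.
    import glob
    import re
    from collections import defaultdict

    TITLE_RE = re.compile(r"<title>(.*?)</title>", re.IGNORECASE | re.DOTALL)
    DESC_RE = re.compile(
        r'<meta\s+name=["\']description["\']\s+content=["\'](.*?)["\']',
        re.IGNORECASE | re.DOTALL,
    )

    def check_uniqueness(pattern="site/**/*.html"):
        titles, descriptions = defaultdict(list), defaultdict(list)
        for path in glob.glob(pattern, recursive=True):
            html = open(path, encoding="utf-8", errors="ignore").read()
            title = TITLE_RE.search(html)
            desc = DESC_RE.search(html)
            if title:
                titles[title.group(1).strip()].append(path)
            if desc:
                descriptions[desc.group(1).strip()].append(path)
        for label, index in (("title", titles), ("description", descriptions)):
            for text, pages in index.items():
                if len(pages) > 1:
                    print(f"Duplicate {label} on {len(pages)} pages: {text!r}")
                    for page in pages:
                        print(f"  {page}")

    if __name__ == "__main__":
        check_uniqueness()

Running something like this before submission is a cheap way to catch boilerplate titles and tags that would otherwise look identical to both engines.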

The last tip for this article: don’t be afraid to pass higher costs on to clients, because if your client wants those placements soon, paid inclusion of internal pages will be expected. When one really examines the costs of paid inclusion, it is not terribly different from other advertising costs, with one major exception. Most paid advertising is regionally based (or is prohibitively expensive for smaller businesses). Search engine advertising, by its nature, offers international exposure, and that is worth paying for.