It has been just over ten years since the public release of the first modern search engine, AltaVista. Infoseek, Lycos, HotBot, Northern Lights, and dozens of others quickly followed. Search marketing was much simpler in those early years. Since then we’ve seen the rise and fall of several search firms, and almost as many ideas on how the business of search should be conducted have come and gone in the same period. While search has grown far more sophisticated in its first decade, the free (or organic) listings have been a constant common factor for every search engine. Read more…
Much has changed since last year in the world of search engine marketing. These changes have widened the knowledge gaps between the SEM sector, our clients and the general public. A knowledge gap separating professional experience and general interest is natural in any industry, as a quick peek under the hood of any newer-model car will demonstrate.
In a field as user-dependent and rapidly evolving as the search industry, knowledge gaps can lead to expensive chaos for consumers, advertisers and webmasters. Many common assumptions about search engine marketing have been made obsolete or require a different way of thinking. Many erroneous assumptions continue to proliferate in hundreds of forum posts, emails and marketing articles. Read more…
Over the past year, blogs have been used to manipulate search engine rankings in a very big way. Couple the immense link-distribution power inherent in the Blogosphere with Google’s practice of ranking websites by the number and relevance of incoming links, and add a number of SEOs with overactive imaginations. The result is a spamming machine of mythic proportions.
Remember last year’s SEO competition built around the nonsense phrase “Nigritude Ultramarine”? If you don’t, suffice it to say it was a contest to see who could win and keep the #1 placement for a phrase that was totally fresh at the time, as it wasn’t a real phrase to begin with. The results proved the power of blogs and link-densities. Now Google, Yahoo, MSN and others have joined together to support a new link attribute that attempts to remove the temptation to create artificial links through blogs and comments.
The new attribute is called “nofollow” and is designed to be placed within a hyperlink anchor.
What is the nofollow attribute and what does it do?
For instance, the link <a href="http://www.isedb.com/">Search Industry News</a> will allow a spider to pass PageRank from your site to theirs; this is the standard way any link without a nofollow is treated.
A similar link, <a href="http://www.isedb.com/" rel="nofollow">Search Industry News</a>, tells the spider you do not vouch for the website you are linking to, so you do not want to pass along any of your valuable PageRank. The rel="nofollow" attribute can be placed anywhere inside the anchor tag, before or after the href.
Google says it will not count links with the nofollow attribute in PageRank scores and will not count the anchor text toward the relevancy of the page linked to. This should effectively remove the benefits of link-spamming in forums and blogs. Even so, the overactive imaginations found under the darker hats in the sector are already working on work-arounds. It will be interesting to see how this new attribute works out.
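Forum and blog software can apply the attribute automatically to every link a visitor submits, which is how the spam-deterrent is meant to work in practice. A minimal sketch of that idea in Python follows; the function name and the sample comment are illustrative assumptions, and the regex assumes simple, well-formed anchor tags rather than being a full HTML parser.

```python
import re

def add_nofollow(html: str) -> str:
    """Add rel="nofollow" to every anchor tag that lacks a rel attribute.
    Minimal sketch: assumes simple, well-formed <a ...> tags."""
    def fix(match):
        tag = match.group(0)
        if 'rel=' in tag:
            return tag  # leave anchors that already declare rel alone
        # insert the attribute just before the closing '>'
        return tag[:-1] + ' rel="nofollow">'
    return re.sub(r'<a\s[^>]*>', fix, html)

# A hypothetical user-submitted blog comment:
comment = '<p>Nice post! <a href="http://spam.example.com/">cheap pills</a></p>'
print(add_nofollow(comment))
```

Run against the sample comment, every submitted link comes back carrying rel="nofollow", so a spider that honors the attribute passes no PageRank through it.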
What would you do if you were tasked with designing a new search engine?
You have all the resources the world can offer and the certain knowledge that your project is so important to your employer that mountains, molehills, companies, code and really comfy office chairs will be moved, built or acquired to meet your needs, no questions asked. Your boss demands a product that is better than best and, having failed to notice how overwhelmingly essential search would become back when he came to dominate everything else, appears ready to back your project with missionary zeal and Machiavellian maneuvering. The cold hard truth is, the future of one of the largest corporations in the world, owned incidentally by the world’s wealthiest man, may well rest on your shoulders. In this scenario, there are no obstacles, only the challenge of beating Google at Google’s best game. Whoa…. Read more…
How to write for Slurp the spider
As the world’s second most popular search tool, Yahoo moves a tremendous amount of traffic and is a very credible alternative to Google. Yahoo receives over 2.76 billion page views per day from hundreds of millions of unique users. It boasts over 157 million registered users enjoying mail, shopping, discussion groups and increasingly personalized search and news services. For the past two years, Yahoo, Google and MSN have been embroiled in a hard-fought battle for the loyalty of search engine users, forcing all three firms into the hyper-evolution we are witnessing today. Over the next three Wednesdays we are going to examine how the Big-3 spiders work, what they look for and how to best prepare your sites for multiple visits from the bots that rank them. Today, we are starting with Yahoo’s bot, SLURP. Read more…
Commercial websites are getting larger. Driven by the rapid evolution of content management systems, shopping carts and e-biz facilitation, and by the increasing sophistication of Internet retailers, “small” business sites averaging 500+ pages have become common.
Some large sites are very well focused and present relatively few problems for SEOs. Most larger sites however list a wide array of products, services and information. The optimization of large retail sites presents multiple issues for SEOs to work through. Achieving product-specific placements for sites featuring numerous products is much more difficult than achieving placements for smaller, more focused sites. Fortunately, good SEOs are good problem solvers and almost every technical problem has a solution. Read more…
“You don’t drown by falling in the water; you drown by staying there.” (Edwin Louis Cole)
What does the timing of Columbus Day (US), Thanksgiving Weekend (Canada), baseball playoffs, and corn roasts typically signify? For many of us it is the beginning of a process to focus and formulate our company’s priorities for the New Year. Specifically, businesses that operate on a calendar fiscal year start about this time to prepare their new budget. Ideally we take extra time to analyze the current expenses and revenues. We dust off our microscope and try to objectively evaluate how each account has performed. How do they fit in with our short and long-term goals? What would you do differently? Read more…
There has been a great deal of speculation about a shift coming in Google’s ranking algorithms over the past few weeks. Several recent pieces of evidence point to a coming shift; however, it is very difficult to predict what, if anything, will happen. It’s pretty much a given that there will be some form of update to the way Google reads and ranks sites. Google actually makes minor adjustments on a regular basis. There are rare occasions, however, when major changes are introduced. I suspect this is one of those times. Read more…
Once upon a time in a search engine long ago, content was king and little else mattered. Then along came links, which attempted to overthrow the king, and now everything is a mess.
Back links are very important for placing well in the SERPs, especially for highly competitive keywords; however, content is still king, and without it your sites are as good as lost! So what is a site owner to do when you’ve said all there is to say and are left with only a 5-page website? Read more…
Q. Where can I find the ranking of search engines?
A. Some of the latest stats on search engine user frequencies come from Nielsen NetRatings. In his July 14 article “Nielsen NetRatings Search Engine Ratings,” Danny Sullivan, editor of SearchEngineWatch, discusses the June 2004 results. To read the report, please click here.
Q. Is it possible to “infect” a search engine?
A. As Jim Hedger, StepForth Senior SEO, reported in a July 28 news article, Google, Yahoo, Lycos and AltaVista evidently all went strange at once. “They were all temporarily offline across much of the globe on Monday following a massive direct assault from the MyDoom.O worm virus. Effectively creating a denial of service (DoS) attack, MyDoom.O prevented search results from being displayed across most of Canada, the United States, UK, Europe and Asia.”
Q. I am trying to find the Alexa address that will show how often a particular search term is used.
A. Owned by Amazon.com, Alexa uses the Google index for its searches. Alexa offers information on site traffic and links: you can find data on such topics as traffic ranking, links to related sites and back-link statistics. It also allows for keyword searching. Check out their free toolbar, which you can find at www.alexa.com.
Q. If I were to use pay-per-click advertising, how much would these terms cost me per month and approximately how many hits could I expect? Is the number of times these phrases are searched factored into the pay-per-click price?
A. The answers you seek are too lengthy for the scope of this column. Scott Van Achte, StepForth SEO, wrote two articles last December and January called “PPC for Dummies Part 1” and “PPC for Dummies Part 2”. I highly recommend them. Here are the links:
· http://news.stepforth.com/2003-news/ppc-for-dummies-part1.shtml
· http://news.stepforth.com/2004-news/ppc-for-dummies-part2.shtml
Q. My understanding is that keyword suggestion tools provide only a listing of the number of times someone has come to that particular address and searched for that term. Do they indicate if there is any connection to the number of times a phrase is searched on all search engines?
A. Keyword suggestion tools vary in how and where they collect information. They can report how often keywords are used as search terms. It is important to note these tools may draw on a particular search engine; as can be expected, several programs use either Google or Yahoo. The end result is called ‘search term popularity’ or a ‘keyword effectiveness index’.
When researching keywords there are three main factors that must be considered:
· the number of searches for each phrase,
· the targeted nature of a specific phrase, and
· the competition for that phrase online.
The number of searches will indicate the amount of traffic you will get from a top placement. Generally speaking, any phrase with more than 100 to 150 searches per day is considered relatively highly searched. That said, one must also consider how targeted a phrase is. An untargeted or general phrase with 200 searches per day may be less valuable than a targeted phrase with only 30 searches per day. Armed with this information we must then look at the competition. If a phrase with 150 searches per day has a very high competition level but a phrase with only 10 searches per day has a low competition level, the less competitive phrase may produce a better return on investment.
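The trade-off above can be made concrete with a little arithmetic. The sketch below scores each phrase by expected daily conversions discounted by how crowded the phrase is; the conversion rates, competing-page counts, and the scoring formula itself are all illustrative assumptions, not figures from any keyword tool.

```python
phrases = [
    # (phrase, searches/day, est. conversion rate, competing pages)
    # Hypothetical figures echoing the comparison in the text:
    ("popular, very competitive phrase", 150, 0.005, 500_000),
    ("niche, low-competition phrase",     10, 0.050,   2_000),
]

def effectiveness(searches, conversion, competition):
    """Crude keyword-effectiveness score: expected daily conversions
    weighted against how crowded the phrase is. Illustrative only."""
    return (searches * conversion) / (competition ** 0.5)

for name, searches, conversion, competition in phrases:
    score = effectiveness(searches, conversion, competition)
    print(f"{name}: {score:.5f}")
```

With these assumed numbers the niche phrase outscores the popular one, matching the point in the text: a lightly searched, lightly contested phrase can deliver the better return on investment.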
Q. My current site is written in MS FrontPage, has keywords repeating too often on each page, and the pages are not as relevant as they could be. Now I have an entirely new site with the pages designed in Dreamweaver. I can’t afford to spend too much money, but need to get this done quickly. What would be the best way to take the text from my current web pages and rewrite each page using correct SEO techniques: not repeating keywords too often and keeping the pages strictly on topic?
A. As you seem to be familiar with web design, hire an SEO company on a consultancy basis to assist you in optimizing your website. Find an SEO company that has a solid track record, is respected within the industry, has longevity, and has a published code of ethics. (To view StepForth’s SEO Code of Ethics please click on http://www.stepforth.com/company/ethics.html)