Time and again my sales and consulting staff find themselves explaining that using duplicate content can and will negatively affect search engine rankings, and it is heartbreaking to see clients having to rebuild rankings lost to such a simple mistake. As a result, I felt it was time to write this article and, hopefully, dispel the misconceptions that have misled so many website owners.

Why write an entire article on something as simple as duplicate content? Probably because it is not as simple as it sounds, and many website owners find themselves in the grey area of duplication, where they don’t know for sure whether they are risking their rankings or not. Read more…

As an SEO I am asked questions covering a broad range of SEO-related topics, and one question in particular is asked quite often. Its answer, when ignored, can see a once well-ranked website spiral into the depths of the search engine rankings forever.

“I am in the process of redesigning my site, what should I look out for in order to maintain the SEO (and rankings)?” Read more…

When I sit down with new clients to discuss the status of their new or existing site, they are often shocked when I am forced to inform them that their site is not search engine friendly. Met with a blank but slightly shaken look, I then explain that this means their site has a particular problem that is hindering its search engine rankings. Often the culprit is an inflexible design, overuse of advanced web technologies, or simply a weak navigation scheme. As a result, if they were to continue with the site as it stands, they would be unlikely to attain competitive search engine rankings. Read more…

Thursday, August 10th, 2006

Google XML Sitemaps – The Basics

Google XML Sitemaps have been around for a while now and many webmasters are starting to become familiar with them. They can help you achieve up-to-date indexing in Google and, in a roundabout way, play a small role in assisting with rankings. Sitemaps are not needed by everyone, but they can be of significant use for many websites. This article will touch on the basics of what they are, who can use them, and how to implement them.

What is a Google XML Sitemap?
In short, a Google XML Sitemap allows webmasters to submit a master list of all their site’s pages to Google for indexing. This information is stored in an XML file, along with any other relevant details the webmaster chooses to specify. It can be as simple as a list of the URLs belonging to the site, or it can also include a last-modified date, update frequency, and priority for each URL. The purpose of this Sitemap is to have the most recent version of your URLs indexed in Google at all times.

Who needs a Google XML Sitemap?
XML sitemaps can generally help any site needing to be indexed by Google; however, small sites may not see the need for them. For example, if you have a small 10-page website that seldom sees any of its pages updated, and your entire site is already in Google’s index, the XML Sitemap is not necessarily going to help much. It is best used when trying to keep the latest versions of your pages current in Google. Large sites with an extensive list of URLs will also benefit, especially if not all of their pages are appearing in the index. So, as a general rule of thumb: if you have either a dynamic or a large site, Google XML Sitemaps just may benefit you.

Will using XML Sitemaps improve my Google Ranking?
In most cases an XML Sitemap will not directly improve your rankings, but it can help. Having the most current version of your site in Google’s index can speed up your movement in the results pages: if you make an update to a page for optimization purposes, Google’s index will reflect that page more quickly than it would without the XML sitemap. What this essentially means is that, with more frequent spidering, you can help influence which version of your site is in the index and, ultimately, help with rankings by shortening the delay between a change and its appearance in the index.

How do you create the XML Sitemap?
If you have a very small site, or a lot of time on your hands, you can create your XML sitemap manually, but for the vast majority of webmasters, automated tools are an absolute must. There are a number of available solutions for this. One of the simplest is VIGOS GSitemap, a free, easy-to-use tool that will help you create your XML sitemaps with ease. There are also a number of downloadable and online tools listed on Google’s site which cater to both beginners and seasoned professionals alike.
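For those comfortable with a little scripting, a sitemap can also be generated programmatically. The short Python sketch below is only an illustration of the idea, not one of the tools mentioned above; the example.com URLs, change frequencies, and priorities are placeholder values you would replace with your own.

# A minimal sitemap-generation sketch. The URLs, change frequencies,
# and priorities below are hypothetical placeholders.
from datetime import datetime, timezone
from xml.sax.saxutils import escape

pages = [
    ("http://www.example.com/", "weekly", "1.0"),
    ("http://www.example.com/contact.html", "never", "0.5"),
]

# Use the current time as a stand-in for each page's real
# last-modified date.
lastmod = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S+00:00")

entries = "".join(
    "<url>\n"
    f"<loc>{escape(url)}</loc>\n"
    f"<lastmod>{lastmod}</lastmod>\n"
    f"<changefreq>{freq}</changefreq>\n"
    f"<priority>{priority}</priority>\n"
    "</url>\n"
    for url, freq, priority in pages
)

sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">\n'
    + entries +
    "</urlset>\n"
)

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(sitemap)

The resulting sitemap.xml file is what you upload to your server in the next step.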

Submitting your XML Sitemap to Google is relatively straightforward. After the file has been created, the first thing you want to do is upload it to your server, preferably at the root level. Then log into the Sitemaps console using your Google account. From there you can add a site to your account: simply enter your top-level domain where it says “Add Site” (see fig 1.0). This will add the domain to your account and allow you to then submit the XML sitemap.

(Figure 1.0)

After this is done it will take you to a screen with the summary for this site. You will see a text link that says “Submit a Sitemap”.

Clicking here will take you to a screen where you enter the online location of the XML sitemap (see fig 1.1). Click “Add Web Sitemap” and you are on your way.

(Figure 1.1)

Once this is complete you have the option of verifying your Sitemap. This can be done by placing a specific meta tag on your home page, or by uploading a blank HTML file with a file name provided by Google. Verification will allow you to access crawl stats and other valuable information regarding your Google listing.
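If you choose the blank-file method, the tiny Python sketch below shows the idea; note that Google supplies the exact file name in the console, so the file name and document root used here are made-up placeholders.

# Sketch of the blank-file verification method. The file name and web
# root are placeholders; use the exact name Google assigns to you.
from pathlib import Path

web_root = Path("/var/www/html")  # assumed document root
(web_root / "google0123456789abcdef.html").touch()  # create the empty file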

Below is a basic example of an XML Sitemap.

<?xml version="1.0" encoding="UTF-8"?>

<urlset xmlns="http://www.google.com/schemas/sitemap/0.84"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.google.com/schemas/sitemap/0.84
http://www.google.com/schemas/sitemap/0.84/sitemap.xsd">

<url>
<loc>http://www.stepforth.com/</loc>
<lastmod>2006-08-09T04:46:26+00:00</lastmod>
<changefreq>weekly</changefreq>
<priority>1.0</priority>
</url>

<url>
<loc>http://www.stepforth.com/company/contact.html</loc>
<lastmod>2006-08-08T04:46:26+00:00</lastmod>
<changefreq>never</changefreq>
<priority>0.5</priority>
</url>

</urlset>

Implementing an XML Sitemap is generally straightforward and well worth the effort, as there is no downside to this tool provided by Google. Every little thing adds up when it comes to obtaining site rankings, and frequent spidering by Google is certainly one of those things.

We have all been there: “how the heck do they always get #1?” It is a constant frustration for many a client and, well, even for myself occasionally. The fact is that much of the time there are a few solid reasons behind the search engine success of any website, and it is important to learn what those reasons are before trying to compete. How is this done? Therein lies the subject of this article: how do you determine what your competitor has done to win the search engine war?

Demystifying your competitor’s success requires you to put on your detective garb, because you are going to have to investigate all aspects of their website, even its deepest, darkest corners. In the following instructional I will lead you through a hypothetical investigation of a competitor who is ranking for the phrase “voip services”. At each step I will present the most common finding from the similar competitor analyses I perform professionally. So please take note: the sample reflects only the most common results; occasionally there are truly baffling cases of competitor success that have required heftier investigation and led to different conclusions about what the competitor did to succeed. Read more…

A few weeks ago, StepForth’s sales manager, Bill Stroll, took a well-deserved holiday. That gave me the opportunity to sit in his chair for a few days, monitoring emails from clients and queries from potential clients. My primary focus was to answer client questions and respond to information requests that simply couldn’t wait until Bill was back at his desk.

Sitting in front of his computer gave me a chance to take another look at a random sampling of websites interested in SEO. From time to time, I tabbed over to see some of the site-review questionnaire responses our system had recently handled.

Search engine optimization is obviously becoming more popular; we’re handling a lot more review requests. Many of the sites processed by our review system were already well-designed and ready for optimization. Many others, however, were simply not up to the standard of design or topical clarity at which our SEO services would help. It’s a hard thing to tell someone, but someone has to do it: the website needs to be redesigned.

Online competition has increased dramatically year after year. Today there are more websites doing business in every commercial sector than there were yesterday. Though the search engines are better able to sort information and rank relevant sites against keyword queries, achieving search engine placements for smaller sites has gotten more difficult as the environment evolves.

Recent changes to the ranking algorithms at Google and Yahoo place increased importance on site user experience, making people-friendly web design an important component of SEO. Because the search engines want to gauge the usefulness of any given document or link, they track the movement of visitors from one web document to another. When larger numbers of visitors travel to a site and spend more time gathering information and following its links, the search engines tend to think favorably of that site. Similarly, when visitors click in and quickly click out, their leaving is noted and the action is somewhat scored against the site. It’s nothing personal, it’s just technology judging technology.

When a website is somehow unprepared to meet the standards we believe search spiders or human visitors are looking for, we call it not-ready-for-primetime. It’s a much gentler term than others we’ve tossed about. Not-ready-for-primetime sites come in all shapes and sizes and represent all sorts of businesses. The one thing they have in common is that, in their current condition, their chances of achieving strong search rankings are dim. They are often constructed as if they were afterthoughts, as brochures by people focused squarely on doing business in the real world.

When we come across sites that are not quite ready for primetime, we tend to recommend a site redesign. The problem with recommending redesign as a prerequisite of SEO work is that it needs to be factored into a preset marketing budget. Often, site owners are unable or unwilling to invest in a redesign and either go seeking help or affirmation from other search marketing shops, or give up altogether.

The easiest way to avoid presenting an unfriendly face to search engine spiders is to start from the basics and work your way up. Here are a few quick tips on spotting elements of your website that might not be as search engine friendly as they could or should be.

Every website, good or bad, begins with a site structure, and some structures are better for search spiders than others. There are two areas to consider when thinking about site structure, regardless of the eventual size of the site. The first is how the overall site file and directory tree will look. The second is how the first few levels of that tree will be laid out.

The overall site should be structured to allow for long-term growth. As more files or documents are added to the site, the designer will want to ensure that search spiders will find those files without too much trouble. That means limiting the number of directory levels as much as practically possible.
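To illustrate with hypothetical example.com paths, a spider is far more likely to find and regularly revisit a document at the first URL than one buried at the second:

http://www.example.com/services/voip.html
http://www.example.com/en/ca/bc/services/telephony/residential/voip.html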

The first few levels of a site are extremely important for strong search rankings. Documents or files found on the first few pages of any site tend to contain the most unique, descriptive content. These documents are the most likely to receive incoming links from other sites and the most likely to be entrance points to specific products or services offered on pages found deeper in the site. Establishing easily followed paths for search spiders and for live visitors is important.

The next thing that makes a site not-ready-for-primetime is topical focus and clarity of message. In a competitive search engine environment, choosing a theme and sticking to it is generally good advice.

We often see sites that try to sell hundreds of unrelated consumer items or travel services. These types of sites pose two problems. First, there is no overall theme to draw on when determining keywords to target. Second, much of the content on such sites is lifted from other online sources and likely already exists in Google’s index.

If these businesses were to segment their offerings into niches or facets of the industries they represent, and build a number of sites dedicated to those facets, chances are those sites would perform much better on the search engines.

Another series of elements that can make a site not-ready-for-primetime is found in previous attempts at search engine or network marketing. A reality of web-based business is that a little information can be extremely dangerous if applied incorrectly. We often come across sites that have joined web-rings or link-exchanges, or have remnants of spammy SEO techniques left over from a previous run-in with less ethical SEOs. We tend to see these sites just after they have been delisted or have seen their rankings degrade over time.

A site redesign is a serious commitment. Once it is undertaken, a whole range of planning, copywriting and meetings is in order. This process is often good for an online business, as it forces the business to focus on how it conducts business online and how to make that business better.

Perhaps the truest measure of the need to redesign a website comes not from the needs of website marketers but from the experience of the website owners themselves. Is the site producing revenues or attracting business of some form or another? Is it capable of returning some if not all of the money invested in it? If not, the best search engine placements on the Internet are not going to be much use.

The need for search-friendly design is obvious; the demand is real.

One of the questions readers and clients most frequently email to StepForth Placement’s SEO staff revolves around how websites can best be optimized to meet the algorithmic needs of each of the four major search engines: Google, Yahoo, MSN and Ask.

The more things change, the more they stay the same. Though there have been wide-sweeping changes in the organic search engine landscape over the past six months, the fundamental ways search engines operate remain the same. Read more…

Wednesday, May 10th, 2006

Yahoo Search Marketing Handbook

Mona Elesseily has nearly five years’ experience crafting Yahoo Search Marketing (YSM) campaigns for her clients, but over the last 16 months she has been absolutely immersed. As the Yahoo! (Overture) PPC expert at Canadian paid search advertising firm Page Zero Media, Mona has enjoyed a unique opportunity to learn, explore and understand the inner workings of Yahoo’s premier PPC program. In the autumn of 2004, she was “challenged” by her employer, Andrew Goodman, to write a book explaining the intricacies of establishing and managing a successful YSM PPC campaign.

The result of her work is the Yahoo Search Marketing Handbook, a 102-page manual that starts with the basics, outlines potential pitfalls and gradually guides the reader towards establishing and maintaining winning YSM PPC campaigns. Read more…

Tuesday, January 17th, 2006

15 Shades of SEO Spam

Spam, in almost any form, is somehow bad for your health. The vast majority of web users would agree with that statement, and nobody would even think of the finely processed luncheon meat-product made by Hormel. Even the word itself is infectious in all the worst ways, being used to describe the dark and often deceptive side of everything from email marketing to abusive forum behaviour. In the search engine optimization field, Spam is used to describe techniques and tactics thought to be banned by the search engines or to be unethical business practices. Read more…

Wednesday, January 11th, 2006

SEO Tips In a Sea of Change

Advanced SEO 2006

Waves of change have cascaded over the search marketing sector in the past year, prompting changes in the methods, business and practice of search engine optimization. Though many things have been altered, expanded or otherwise modified, the general search engine market share has not. Google remains the most popular search engine and continues to drive more traffic than the other search engines combined. Another thing that has not changed is the greater volume of site traffic generated by organic search placement over any other form of online advertising. Read more…
