Thursday, August 10th, 2006

Google XML Sitemaps – The Basics

Google XML Sitemaps have been around for a while now and many webmasters are becoming familiar with them. They can help you achieve up-to-date indexing in Google and, in a roundabout way, play a small role in assisting with rankings. Sitemaps are not needed by everyone, but they can be of significant use for many websites. This article will touch on the basics of what they are, who can use them, and how to implement them.

What is a Google XML Sitemap?
In short, a Google XML Sitemap allows webmasters to submit a master list of all their site’s pages to Google for indexing. This information is stored in an XML file, along with any other relevant information specified by the webmaster. It can be as simple as a list of URLs belonging to the site, or it can also include last-modified date, update frequency, and priority. The purpose of the Sitemap is to have the most recent version of your URLs indexed in Google at all times.

Who needs a Google XML Sitemap?
XML Sitemaps can generally help any site needing to be indexed by Google; however, small sites may not see the need for this. For example, if you have a small 10-page website that seldom sees any of its pages updated and your entire site is already in Google’s index, the XML Sitemap is not necessarily going to help much. It is best used when trying to keep the latest versions of your pages current in Google. Large sites with an extensive list of URLs will also benefit, especially if not all of their pages are appearing in the index. So, as a general rule of thumb: if you have either a dynamic or a large site, Google XML Sitemaps just may benefit you.

Will using XML Sitemaps improve my Google Ranking?
In most cases this will not directly improve your rankings, but it can help. By keeping the most current version of your site in Google’s index, you can speed up your movement in the results pages: if you update a page for optimization purposes, Google’s index will reflect that change more quickly than it would without the XML Sitemap. In essence, more frequent spidering lets you influence which version of your site is in the index and, ultimately, helps with rankings by decreasing response time.

How do you create the XML Sitemap?
If you have a very small site, or a lot of time on your hands, you can create your XML Sitemap manually, but for the vast majority of webmasters, automated tools are an absolute must. There are a number of available solutions for this. One of the simplest is VIGOS GSitemap, a free, easy-to-use tool that will help you create your XML Sitemaps with ease. There are also a number of downloadable and online tools listed on Google’s site which cater to beginners and seasoned professionals alike.
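For those comfortable with a little scripting, generating the file yourself is also straightforward. Here is a minimal sketch in Python; the URLs and metadata are hypothetical placeholders, and a real generator would walk your site or database to collect them:

```python
# Minimal sketch of building a Google XML Sitemap from a list of pages.
# The example URLs, dates, and frequencies below are hypothetical.
from xml.sax.saxutils import escape

def build_sitemap(entries):
    """Build a Sitemap XML string from (url, lastmod, changefreq, priority) tuples."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">']
    for url, lastmod, changefreq, priority in entries:
        lines.append('  <url>')
        lines.append('    <loc>%s</loc>' % escape(url))          # escape &, <, > in URLs
        lines.append('    <lastmod>%s</lastmod>' % lastmod)
        lines.append('    <changefreq>%s</changefreq>' % changefreq)
        lines.append('    <priority>%.1f</priority>' % priority)
        lines.append('  </url>')
    lines.append('</urlset>')
    return '\n'.join(lines)

sitemap = build_sitemap([
    ("http://www.example.com/", "2006-08-09", "weekly", 1.0),
    ("http://www.example.com/contact.html", "2006-08-08", "never", 0.5),
])
print(sitemap)
```

Save the output as sitemap.xml and upload it to your server, preferably at the root level.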

Submitting your XML Sitemap to Google is relatively straightforward. After the file has been created, the first thing you want to do is upload it to your server, preferably at the root level. Log into the Sitemap console using your Google account login. From here you can add a site to your account. Simply enter your top-level domain where it says “Add Site” (see fig 1.0). This will add the domain to your account and allow you to then submit the XML Sitemap.

(Figure 1.0)

After this is done it will take you to a screen with the summary for this site. You will see a text link that says “Submit a Sitemap”.

Clicking here will take you to a screen to enter the online location of the XML sitemap. (see fig 1.1). Click “Add Web Sitemap” and you are on your way.

(Figure 1.1)

Once this is complete you have the option of verifying your Sitemap. This can be done by placing a specific meta tag on your home page, or by uploading a blank HTML file with a file name provided by Google. Verification will allow you to access crawl stats and other valuable information regarding your Google listing.
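As a rough illustration of the meta-tag option: the exact tag name and token are supplied by Google in the Sitemap console, so the content value below is only a placeholder, not a real verification code:

```
<!-- Hypothetical example; Google generates the actual content value for your account -->
<meta name="verify-v1" content="YOUR-VERIFICATION-TOKEN-HERE" />
```

The tag goes in the head section of your home page, and Google checks for it when you click the verify link in the console.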

Below is a basic example of an XML Sitemap.

<?xml version="1.0" encoding="UTF-8"?>

<urlset xmlns="http://www.google.com/schemas/sitemap/0.84"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.google.com/schemas/sitemap/0.84
http://www.google.com/schemas/sitemap/0.84/sitemap.xsd">

<url>
<loc>http://www.stepforth.com/</loc>
<lastmod>2006-08-09T04:46:26+00:00</lastmod>
<changefreq>weekly</changefreq>
<priority>1.0</priority>
</url>


<url>
<loc>http://www.stepforth.com/company/contact.html</loc>
<lastmod>2006-08-08T04:46:26+00:00</lastmod>
<changefreq>never</changefreq>
<priority>0.5</priority>
</url>

</urlset>

Implementing an XML Sitemap is generally straightforward and well worth the effort, as there is no downside to this tool provided by Google. Every little thing adds up when it comes to obtaining site rankings, and frequent spidering by Google is certainly one of them.

Thank you to those that sent in their SEO questions from the last few search engine newsletters. I would also like to again request that you take a moment and submit your own questions; it is paramount that I answer the questions that are important to you. Just email me your question, it is that simple! Read more…

This is a short posting but it has the potential to have a wondrous impact on your website.

Back in June, Google implemented a robots.txt validator that allows you to check for errors. The robots.txt file can easily ruin your chances of ranking if it is not carefully created, and if you did make a mistake, a simple fix here might turn your life around! Read more…
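To illustrate the kind of mistake the validator can catch: a single misplaced slash is enough to block an entire site. The example below is hypothetical:

```
# Intended: block only a private directory from all crawlers
User-agent: *
Disallow: /private/

# A common, disastrous typo: "Disallow: /" on its own blocks the ENTIRE site
# User-agent: *
# Disallow: /
```

Running your live robots.txt through the validator takes seconds and can reveal exactly this sort of problem before it costs you rankings.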

It is our policy at StepForth to try and help out anyone who calls and is searching for a service that we either do not provide or we cannot offer at the level they are requesting. To this end I would like to put out a request for companies with proven multi-language content development services and foreign SEO experience.

The prospect that contacted us is looking for rankings within the major European markets as well as Japan. Native language content creation will be required for each country.

Reply by Creating a Comment for this Posting: If you represent a qualified company please create a comment for this posting: the prospect in question is going to be keeping tabs on this posting and will reply if there is interest. Please DO NOT contact StepForth directly.

Please provide at least:

  • Your company name,
  • Languages you can optimize for,
  • Your website address and a link to your proven results,
  • Finally please note your name and number so that the prospect can contact you directly.

—————————-
SEO RFP Request posted by StepForth Search Engine Placement Inc.

A few short months ago Microsoft quietly introduced adCenter, their Pay Per Click (PPC) advertising platform. My first impressions of adCenter are relatively positive. Based mostly on the setup process here are my thoughts on the newest player in the PPC industry.
Signing up for a new account is quite simple. There is a $5 sign-up charge though, so take note: you won’t get a look at the inner workings unless you are willing to spend a couple of bucks. Years ago when I first looked at Google AdWords, I loved the fact that you could create an account, go in, play with everything and look around. You didn’t pay a dime until you were ready to have your ads go live; that is when the setup fee of $5 US or $10 CDN (not sure where this exchange rate came from?) was charged.

Over the years we have answered many a phone call where a prospective client is frustrated with the poor performance of their website(s). Obviously the business profile for each website is unique, but common to many of these conversations is the following scenario, which we would like to share with you. For the purpose of this example we will use a site owner by the name of Sam.

The question from Sam (the caller):
“My site is 2 years old and well established online; it is well indexed on the major search engines and has a couple of rankings here and there. Unfortunately, after so many years of work on the site, the rankings and traffic for the website are still not where I need them. I am just not sure if I should commit my limited budget to reworking a website that has been so difficult to promote; it is becoming a bigger money hole than my boat. Should I continue to invest in my current website or try launching another site with slightly different content and a new domain?” Read more…

We have all been there, “how the heck do they always get #1?” It is a constant frustration for many a client and, well, even myself occasionally. The fact is that much of the time there are a few solid reasons behind the search engine success of any website and it is important to learn what these reasons are before trying to compete. How is this done? Therein lies the subject of this article; how do you determine what your competitor has done to win the search engine war?

Demystifying your competitor’s success requires you to put on your detective garb because you are going to have to investigate all aspects of their website; even the deepest darkest corners. In the following instructional I will lead you through a hypothetical investigation of a competitor who is ranking for the phrase “voip services”. In each step I will choose the more popular result that I find when I do similar competitor analyses professionally. So please take note, the sample is only the most popular result; occasionally there are truly baffling cases of competitor success which have required heftier investigation leading to differing conclusions of what they did to succeed. Read more…

A few weeks back MSN instituted a new Meta Tag that would allow website owners to force MSN to ignore their Open Directory Project (ODP) directory title and description when creating their search engine listing. For many this was a huge boon because their rankings were tied to the out of date, poorly edited or even false directory listings at the ODP. Read more…

Wednesday, July 5th, 2006

Google Adds to Your SEO Toolbox

Have you heard of Google?

Google shareholders must dance up a storm when they realize how ridiculous it has become to ask a question like this. Google has become an unavoidable fact of life for anyone with the intention of having their website found online. It is the ‘big daddy’ of search engines with very nearly 50% of the search market in its pocket (49.1% this May to be exact – Nielsen/Netratings). For this reason alone I get pumped up when I am given new opportunities to peek into Google’s treasure trove of stats. Today I would like to share with you a relatively new offering from Google’s lab; Google Trends.

Now, to many SEOs there is nothing new here because we are all fanatical about keeping up on Google’s offerings; in our fast-paced world this was released decades ago in SEO time (kind of like dog years). But to clients, this may very well be an interesting new tool in their marketing arsenal, and I bet that many of your competitors haven’t grasped the potential this tool has for increasing your bottom line. So take heed and you just might find that piece of marketing information you have always wanted. Read more…

A few weeks ago, StepForth’s sales manager, Bill Stroll, took a well deserved holiday. That gave me the opportunity to sit in his chair for a few days, monitoring emails from clients and queries from potential clients. My primary focus was to answer client questions and respond to information requests that simply couldn’t wait until Bill was back at his desk.

Sitting in front of his computer gave me a chance to take another look at a random sampling of websites interested in SEO. From time to time, I tabbed over to see some of the site-review questionnaire responses our system had recently handled.

Search engine optimization is obviously becoming more popular. We’re handling a lot more review requests. Many of the sites processed by our review system were already well-designed sites ready for optimization. Many others, however, were simply not up to a standard of design or topical clarity at which our SEO services would help. It’s a hard thing to tell someone, but someone has to do it: the website needs to be redesigned.

Online competition has increased dramatically year after year. Today there are more websites doing business in every commercial sector than there were yesterday. Though the search engines are better able to sort information and rank relevant sites against keyword queries, achieving search engine placements for smaller sites has gotten more difficult as the environment evolves.

Recent changes to the ranking algorithms at Google and Yahoo place increased importance on site user experience, making people-friendly web design an important component of SEO. Because the search engines want to gauge the usefulness of any given document or link, they track the movement of visitors from one web document to another. When larger numbers of visitors travel to a site and spend more time gathering information and following its links, the search engines tend to think favorably about that site. Similarly, when visitors click in and quickly click out, their leaving is noted and the action is somewhat scored against the site. It’s nothing personal, it’s just technology judging technology.

When a website is somehow unprepared to meet the standards we believe search spiders or human visitors are looking for, we call it not-ready-for-primetime. It’s a much gentler term to use than others we’ve tossed about. Not-ready-for-primetime sites come in all shapes, sizes and represent all sorts of businesses. The one thing they have in common is that, in their current condition, their chances of achieving strong search rankings are dim. They are often constructed as if they were afterthoughts, as brochures by people focused squarely on doing business in the real world.

When we come across sites that are not quite ready for primetime, we tend to recommend site redesign. The problem with recommending redesign as a pre-requisite of SEO work is that it needs to be factored into a preset marketing budget. Often, site owners are unable or unwilling to invest in site redesign and either go seeking help or affirmation from other search marketing shops, or give up altogether.

The easiest way to avoid presenting an unfriendly face to search engine spiders is to start from the basics and work your way up. Here are a few quick tips on spotting elements of your website that might not be as search engine friendly as they could or should be.

Every website, good or bad, begins with a site structure. Some structures are better for search spiders than others. There are two areas to consider when thinking about site structure, regardless of the eventual size of the site. The first is how the overall site file and directory tree will look. The second is how the first few levels of that tree will be laid out.

The overall site should be structured to allow for long-term growth. As more files or documents are added to the site, the designer will want to ensure that search spiders will find those files without too much trouble. That means limiting the number of directory levels as much as practically possible.
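As a rough, hypothetical illustration of what limiting directory depth means in practice, compare these two ways of filing the same pages:

```
Shallow (easier for spiders):    Deep (harder to crawl):
/products/widgets.html           /catalog/2006/hardware/consumer/items/widgets.html
/products/gadgets.html           /catalog/2006/hardware/consumer/items/gadgets.html
```

The paths above are invented examples; the point is simply that each extra directory level puts a document further from the root and makes it less likely a spider will reach it quickly.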

The first few levels of a site are extremely important for strong search rankings. Documents or files found on the first few pages of any site tend to contain the most unique, descriptive content. These documents are most likely to receive incoming links from other sites and are most likely to be entrance points to specific products or services offered on pages found deeper in the site. Establishing easily followed paths for search spiders and for live visitors is important.

The next thing that makes a site not-ready-for-primetime is topical focus and clarity of message. In a competitive search engine environment, choosing a theme and sticking to it is generally good advice.

We often see sites that try to sell hundreds of unrelated consumer items or travel services. These types of sites pose two problems. First, there is no overall theme to think about when determining keywords to target. Secondly, much of the content on sites like this is lifted from other online sources, likely already existing in Google’s index.

If these sites were to segment their sites into niches or facets of the industries they are trying to represent and build a number of sites dedicated to those facets, chances are their sites would perform much better on the search engines.

Another series of elements that can make a site not-ready-for-primetime is found in previous attempts at search engine or network marketing. A reality of web-based business is that a little information can be extremely dangerous if applied incorrectly. We often come across sites that have joined web-rings or link-exchanges, or have remnants of spammy SEO techniques left over from a previous run-in with less ethical SEOs. We tend to see these sites just after they have been delisted or have seen their rankings degrade over time.

A site redesign is a serious commitment. Once it is undertaken, a whole range of planning, copywriting and meetings are in order. This process is often good for an online business as it forces the business to focus on how it conducts business online, and how to make that business better.

Perhaps the truest measure of the need to redesign a website comes not from the needs of website marketers but from the experience of the website owners themselves. Is the site producing revenues or attracting business of some form or another? Is it capable of returning some if not all of the money invested in it? If not, the best search engine placements on the Internet are not going to be much use.

The need for search-friendly design is obvious; the demand is real.