When I sit down with new clients to discuss the status of their new or existing site, they are often shocked when I am forced to inform them that it is not search engine friendly. Met with a blank but slightly shaken look, I then explain that this means their site has a particular problem that is hindering search engine rankings. Often the culprit is an inflexible design, overuse of advanced web technologies, or simply a weak navigation scheme. As a result, if they were to continue with the site as it stands, they are unlikely to attain competitive search engine rankings. Read more…
Google XML Sitemaps have been around for a while now and many webmasters are starting to become familiar with them. They can help you achieve up-to-date indexing in Google and, in a roundabout way, play a small role in assisting with rankings. Sitemaps are not needed by everyone, but they can be of significant use for many websites. This article will touch on the basics of what they are, who can use them, and how to implement them.
What is a Google XML Sitemap?
In short, a Google XML Sitemap allows webmasters to submit a master list of all their site’s pages to Google for indexing. This information is stored in an XML file along with any other relevant details the webmaster chooses to specify. It can be as simple as a list of the URLs belonging to the site, or it can also include the last modified date, update frequency, and priority of each page. The purpose of the Sitemap is to have the most recent version of your URLs indexed in Google at all times.
Who needs a Google XML Sitemap?
XML Sitemaps can generally help any site needing to be indexed by Google; however, small sites may not see the need for them. For example, if you have a small 10-page website that seldom sees any of its pages updated and your entire site is already in Google’s index, the XML Sitemap is not necessarily going to help much. It is best used when trying to keep the latest versions of your pages current in Google. Large sites with an extensive list of URLs will also benefit, especially if not all of their pages are appearing in the index. So, as a general rule of thumb: if you have either a dynamic or a large site, Google XML Sitemaps may well benefit you.
Will using XML Sitemaps improve my Google Ranking?
In most cases a Sitemap will not directly improve your rankings, but it can help. By keeping the most current version of your site in Google’s index, it can speed up your movement in the results pages: if you update a page for optimization purposes, Google’s index will reflect that change more quickly than it would without the XML Sitemap. In essence, more frequent spidering lets you influence which version of your site is in the index and, ultimately, helps with rankings by decreasing response time.
How do you create the XML Sitemap?
If you have a very small site, or a lot of time on your hands, you can create your XML Sitemap manually, but for the vast majority of webmasters automated tools are an absolute must. There are a number of available solutions for this. One of the simplest is VIGOS GSitemap, a free, easy-to-use tool that will help you create your XML Sitemaps with ease. There are also a number of downloadable and online tools listed on Google’s site which cater to both beginners and seasoned professionals alike.
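For those comfortable with a little scripting, the same job the tools above perform can be sketched in a few lines. The URLs, dates, and values below are purely illustrative, and this assumes the standard Sitemap schema:

```python
# Minimal sketch of a Sitemap generator; page data here is hypothetical.
from xml.sax.saxutils import escape

def build_sitemap(pages):
    """Build Sitemap XML from (loc, lastmod, changefreq, priority) tuples."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for loc, lastmod, changefreq, priority in pages:
        lines.append('  <url>')
        lines.append('    <loc>%s</loc>' % escape(loc))
        lines.append('    <lastmod>%s</lastmod>' % lastmod)
        lines.append('    <changefreq>%s</changefreq>' % changefreq)
        lines.append('    <priority>%s</priority>' % priority)
        lines.append('  </url>')
    lines.append('</urlset>')
    return '\n'.join(lines)

pages = [
    ("http://www.example.com/", "2006-06-01", "weekly", "1.0"),
    ("http://www.example.com/about.html", "2006-05-15", "monthly", "0.5"),
]
print(build_sitemap(pages))
```

The resulting file would then be saved as sitemap.xml and uploaded to the root of the site, as described below.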
Submitting your XML Sitemap to Google is relatively straightforward. After the file has been created, the first thing you want to do is upload it to your server, preferably at the root level. Log into the Sitemap console using your Google account login. From here you can add a site to your account. Simply enter your top-level domain where it says “Add Site” (see fig 1.0). This will add the domain to your account and allow you to then submit the XML Sitemap.
After this is done it will take you to a screen with the summary for this site. You will see a text link that says “Submit a Sitemap”.
Clicking here will take you to a screen to enter the online location of the XML sitemap. (see fig 1.1). Click “Add Web Sitemap” and you are on your way.
Once this is complete you have the option of verifying your Sitemap. This can be done by placing a specific meta tag on your home page, or by uploading a blank HTML file with a file name provided by Google. Verification will allow you to access crawl stats and other valuable information regarding your Google listing.
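As a rough sketch of the meta-tag option, the tag goes in the head section of your home page. The exact name and key below are placeholders; Google generates the real values for your account in the console:

```html
<head>
  <!-- Illustrative only: Google supplies the actual name and content values -->
  <meta name="verify-v1" content="your-unique-verification-key" />
</head>
```

The blank-file option works the same way: Google gives you a unique file name, and you simply upload an empty HTML file with that name to your root directory.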
Below is a basic example of an XML Sitemap containing a single URL (the domain and values are placeholders):
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2006-06-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
Implementing an XML Sitemap is generally straightforward and well worth the effort, as there is no downside to this tool provided by Google. Every little thing adds up when it comes to obtaining site rankings, and frequent spidering by Google is certainly one of them.
No, it is not an STD, but to some they are worse. After all, stats, better known as statistics, conjure up bad memories for many who unhappily sat through a post-secondary statistics course. Hey, let’s face it: poring over statistics is not for everyone.
Oh Yeah, That Thing!
All too often my clients say ‘huh?’ when I ask them whether they have access to web statistics or traffic reports for their website(s). But the fact is that if you have a website you more than likely have access to traffic ‘stats’ that provide a glimpse into the minds of your website visitors. Simply put, I think there are some readers out there who could use a brief tutorial on just how useful those website statistics reports can be.
To Start: Get Access to Your Statistics
If you have statistics for your website open them up and check them out while or after you read this article; you just might learn something about your site that you never knew.
If you are unsure whether you have statistics, just give your web hosting company a call and ask. Chances are that you were provided access to this information when you signed up for your service and you just need to get the proper access information again. Don’t worry if you are concerned about complexity; normally getting access to your website statistics is a totally painless process.
If you don’t have access to a report right now, try using this sample I found online.
A Mini Analytics Glossary
Before you delve into the world of analytics I want to provide you with a brief outline of the more popular analytics terms that you will often encounter. Please note that these are all very basic definitions for the purposes of this article.
“Unique Visitors” or “Visitors”: the number of different physical people that visited your site. This statistic is far from perfect (curious? read here for why) but it is used often.
Visits: the number of sessions on your site; a new visit is counted each time someone arrives who has not been to your site within the last 60 minutes.
Page Views: generally considered the most accurate statistic; this tells you how many times your pages were viewed in total.
Hits: never use this term! Ages ago it was used interchangeably (read: confusingly) to describe visitors, when in fact it means something entirely different: a hit is registered every time a file, such as an image, loads. Here is an extreme example: a site with 1,000,000 hits per month may only have 1,000 visitors per month if the only page they visit has 1,000 images on it. As you can imagine, this statistic can be very misleading.
Here are some more terms defined.
Exactly What Are Website Statistics?
Website statistics, more appropriately known as website analytics, are defined as “the measurement of visitor behaviour on a website” (see full definition of “website analytics”). In other words, analytics are meant to provide you with insight into how visitors react within your website so that you can improve your site, ultimately improving their experience and your return on investment.
How Can Website Analytics Help Me?
Analytics can provide you with information about your website that you may never have even considered possible. I find this subject simplest to explain by example:
Identify Killer Pages
One of the most common statistics found in any analytics program is “Top Exit Pages”. If you have seen an analytics report before, then you have likely come across it. Now what? Exit pages are an extremely revealing statistic because they show, from the worst offender downwards, which pages are influencing your visitors to leave your website.
Consider this for a moment. A report that the majority of website owners have access to may actually provide solid clues as to why their website is not making them more money; yet many of them have never seen these reports.
How can you act on this information? Well, that depends on the complexity of your analytics software. With the simplest of analytics solutions you will not have much more to go on than the fact that a specific page is driving away potential business and must be fixed! In that scenario it is best to examine the page closely and search for anything that might be driving away business. The possibilities are limitless, but here are some common offenders:
> Low content relevancy: if your page is supposed to offer tips on how to look after a snowboard and you have only two tips, then this may be disappointing viewers who were expecting a better resource.
> Poor usability: can your viewers find what they are looking for? If they have to look hard, they will likely leave. Make it as simple as possible for them to navigate your site and ultimately purchase your product/service. Contact me for more information on website usability.
> The visitor is directionless: have you crafted your pages so that the next step in your visitor’s progress is clear? The content on your website is the key to your online success; if you leave your visitors without a clear path they may leave in search of a better site.
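To make the report itself less abstract, here is a minimal sketch of how a “Top Exit Pages” count is derived from visit data. The page names and visit records are hypothetical; the exit page is simply the last page viewed in each visit:

```python
# Hypothetical visit data: each visit is the ordered list of pages viewed.
from collections import Counter

def top_exit_pages(visits):
    """Count how often each page was the last one viewed in a visit."""
    exits = Counter(visit[-1] for visit in visits if visit)
    return exits.most_common()  # worst offender first

visits = [
    ["/", "/products.html", "/checkout.html"],
    ["/snowboard-care.html"],
    ["/", "/checkout.html"],
]
print(top_exit_pages(visits))  # [('/checkout.html', 2), ('/snowboard-care.html', 1)]
```

In this toy data, the checkout page is losing the most visitors, which is exactly the kind of clue the report is meant to surface.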
Sometimes Clients Arrive at the Back Door
More often than not there are a number of pages within a website where visitors enter; it isn’t only the home page. The “Top Entry Pages” report will outline which pages are being used as entry pages. These statistics can pay huge dividends for just about any website, but especially ecommerce sites. For example, if you know that your “snowboard care” page is an entry point, you may want to place your list of specials (usually reserved for the home page) on that page in order to entice viewers to check out your inventory.
Who, When, Where?
Who? = Who are your viewers? Most analytics programs will show you the geography of your visitors. This information will give you a better idea of which countries, and in some cases cities, seem to be providing the most visitors.
When? = On average what time of day are your visits the highest? This isn’t generally the most useful statistic for the basic user but it can be a huge benefit if you are in a pay-per-click campaign; dayparting allows you to specify the time of day you want your ad to be shown and this feature is available in some PPC programs.
Where? = By far one of the most useful statistics! Where did your visitors come from? The Referrer Report, available in virtually every analytics package, provides you with a valuable list of the websites that are sending you traffic. Just take a moment and imagine the possibilities this report can open up.
> Discover which search engines are providing you with the most traffic. You may be surprised when a lesser known search engine provides a decent amount of traffic. Knowing this information you can have your SEO review your rankings on that search engine and improve them where necessary.
> Discover which websites are providing you the most traffic. You might find a website that you are promoting on is providing you with more traffic than you had dreamed. In this case it would be a no-brainer to enhance your promotions on that website and potentially make a killing in additional sales!
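A referrer report boils down to tallying referring hostnames. As a rough sketch, with hypothetical log entries:

```python
# Hypothetical referrer URLs pulled from a server log.
from collections import Counter
from urllib.parse import urlsplit

def referrer_report(referrers):
    """Tally referring sites by hostname, busiest first."""
    hosts = Counter(urlsplit(r).netloc for r in referrers if r)
    return hosts.most_common()

log = [
    "http://www.google.com/search?q=snowboard+care",
    "http://www.google.com/search?q=voip+services",
    "http://forum.example.com/thread/42",
]
print(referrer_report(log))  # [('www.google.com', 2), ('forum.example.com', 1)]
```

Even this crude tally answers the two questions above: which engines and which sites are actually sending people your way.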
Which Keywords Deliver?
The keyword report will show you which keywords are driving the most traffic from the search engines. This is another great tool because it will give you an indication of how important a keyword is. For example, if you have a #1 search engine ranking on Google for “snowboard care” and you only receive 500 visitors per month from Google, then it is an indication that something is wrong. Either the title or description found on the search engine results page is not attracting clicks, or there simply isn’t enough traffic on this phrase to warrant your time.
Conversely, if you find that you are getting a huge amount of traffic from that phrase, you can tell your SEO to fortify that ranking because you don’t want to lose it. Your SEO will also be able to research other phrases along the same lines that you should target (i.e. “snowboard maintenance”).
If you haven’t run a broken-link check on your website for a while, this report might be a nasty awakening. The Error Report will show how many errors your website has logged over whatever time frame the report is set for; these are predominantly a result of visitors accessing pages that are no longer available via broken links on your site. This is sure to drive traffic away from your website, so fix these immediately!
In addition this report may identify pages that are no longer found under the same name but were bookmarked from other sites. In this case, be sure to provide a redirect from the old page to the new one; see this article for a quick tutorial.
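For sites running on Apache, a minimal sketch of such a redirect looks like this. It goes in the site’s .htaccess file, and the page names here are hypothetical:

```apache
# Permanently (301) redirect a renamed page to its new location,
# so bookmarks and links to the old URL still work.
Redirect 301 /old-snowboard-tips.html http://www.example.com/snowboard-care.html
```

A permanent (301) redirect also tells search engines to carry the old page’s listing over to the new address.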
Is There More?
There is much more that can be done using web analytics, even with the more basic reports that are generally provided free with hosting packages. The information that you can drag out of an analytics report is gold; you just need to know how to mine it. Here is a list of additional tutorials I was able to find online for some of the basic packages more frequently used by hosting providers.
But It Can Be Even Simpler
If you are reasonably intrigued by this whole prospect of understanding your visitors, I highly recommend you check out our site at Web Site Analytics. This stats site covers a more advanced but even simpler-to-use analytics tool that will provide the information you need in spades. I can’t say enough good things about ClickTracks, but here are some immediate bonuses you can expect from using this program:
- Track where your visitors go, step by step, throughout your website.
- Learn which keywords people used to find any page within your website.
- Separately track the behavior of visitors sent to your website through a pay-per-click campaign; you can even track your return on investment.
- Simply and effectively track the quality of traffic that each search engine sends to you.
- The list goes on and on.
I highly recommend checking out ClickTracks.com for more information. They have great tutorials and some short but highly informative video demos.
Time to Review Your Statistics!
In closing, whatever you do, please take some time to reflect on the analytics for your website; it is an absolute shame that such fantastic data is sitting untapped. If doing this yourself is not an option, ask your SEO or webmaster to do the work for you; you can even contact me for help. The fact of the matter is that spending a little time with your stats can mean a world of difference for your bottom line and your visitors’ experience.
Where better to get SEO tips than from Google engineer Matt Cutts? Matt’s blog is certainly no secret in the world of SEO, but he is starting to offer his sage advice using a method that is a refreshing break from reading blog postings: video. Last week Matt posted answers to some common SEO questions using Google Video. Check out Matt’s blog posting where he offers tips on:
The videos are Matt’s ‘beta’ launch of this form of update but I think he did a pretty decent job for the first time out. I hope you enjoy them. I urge you to leave a comment on his post – he is positively itching for feedback I am sure.
PS. I was practically pressing “POST” on this blog when I noticed that Matt had, just 30 minutes ago, posted another set of live SEO answers. Enjoy!
We have all been there: “how the heck do they always get #1?” It is a constant frustration for many a client and, well, even for me occasionally. The fact is that much of the time there are a few solid reasons behind the search engine success of any website, and it is important to learn what those reasons are before trying to compete. How is this done? Therein lies the subject of this article: how do you determine what your competitor has done to win the search engine war?
Demystifying your competitor’s success requires you to put on your detective garb, because you are going to have to investigate all aspects of their website, even the deepest, darkest corners. In the following instructional I will lead you through a hypothetical investigation of a competitor who is ranking for the phrase “voip services”. In each step I will choose the most popular result that I find when doing similar competitor analyses professionally. So please take note: the sample is only the most popular result; occasionally there are truly baffling cases of competitor success which have required heftier investigation, leading to differing conclusions about what they did to succeed. Read more…
A few weeks back MSN instituted a new meta tag that allows website owners to force MSN to ignore their Open Directory Project (ODP) directory title and description when creating their search engine listing. For many this was a huge boon, because their rankings were tied to out-of-date, poorly edited or even false directory listings at the ODP. Read more…
A few weeks ago, StepForth’s sales manager, Bill Stroll, took a well deserved holiday. That gave me the opportunity to sit in his chair for a few days, monitoring emails from clients and queries from potential clients. My primary focus was to answer client questions and respond to information requests that simply couldn’t wait until Bill was back at his desk.
Sitting in front of his computer gave me a chance to take another look at a random sampling of websites interested in SEO. From time to time, I tabbed over to see some of the site-review questionnaire responses our system had recently handled.
Search engine optimization is obviously becoming more popular; we’re handling a lot more review requests. Many of the sites processed by our review system were already well-designed and ready for optimization. Many others, however, simply did not meet the standard of design or topical clarity at which our SEO services would help. It’s a hard thing to tell someone, but someone has to do it: the website needs to be redesigned.
Online competition has increased dramatically year after year. Today there are more websites doing business in every commercial sector than there were yesterday. Though the search engines are better able to sort information and rank relevant sites against keyword queries, achieving search engine placements for smaller sites has gotten more difficult as the environment evolves.
Recent changes to the ranking algorithms at Google and Yahoo place increased importance on site user experience, making people-friendly web design an important component in SEO. Because the search engines want to gauge the usefulness of any given document or link, they track the movement of visitors from one web document to another. When larger numbers of visitors travel to a site and spend more time gathering information and following its links, the search engines tend to think favorably about that site. Similarly, when visitors click in and quickly click out, their leaving is noted and the action is somewhat scored against the site. It’s nothing personal; it’s just technology judging technology.
When a website is somehow unprepared to meet the standards we believe search spiders or human visitors are looking for, we call it not-ready-for-primetime. It’s a much gentler term to use than others we’ve tossed about. Not-ready-for-primetime sites come in all shapes, sizes and represent all sorts of businesses. The one thing they have in common is that, in their current condition, their chances of achieving strong search rankings are dim. They are often constructed as if they were afterthoughts, as brochures by people focused squarely on doing business in the real world.
When we come across sites that are not quite ready for primetime, we tend to recommend site redesign. The problem with recommending redesign as a pre-requisite of SEO work is that it needs to be factored into a preset marketing budget. Often, site owners are unable or unwilling to invest in site redesign and either go seeking help or affirmation from other search marketing shops, or give up altogether.
The easiest way to avoid presenting an unfriendly face to search engine spiders is to start from the basics and work your way up. Here are a few quick tips on spotting elements of your website that might not be as search engine friendly as they could or should be.
Every website, good or bad, begins with a site structure. Some structures are better for search spiders than others. There are two areas to consider when thinking about site structure, regardless of the eventual size of the site: the first is how the overall site file and directory tree will look; the second is how the first few levels of that tree will be laid out.
The overall site should be structured to allow for long-term growth. As more files or documents are added to the site, the designer will want to ensure that search spiders will find those files without too much trouble. That means limiting the number of directory levels as much as practically possible.
The first few levels of a site are extremely important for strong search rankings. Documents or files found on the first few pages of any site tend to contain the most unique, descriptive content. These documents are most likely to receive incoming links from other sites and are most likely to be entrance points to specific product or services offered on pages found deeper in the site. Establishing easily followed paths for search spiders and for live-visitors is important.
The next thing that makes a site not-ready-for-primetime is topical focus and clarity of message. In a competitive search engine environment, choosing a theme and sticking to it is generally good advice.
We often see sites that try to sell hundreds of unrelated consumer items or travel services. These types of sites pose two problems. First, there is no overall theme to think about when determining keywords to target. Secondly, much of the content on sites like this is lifted from other online sources, likely already existing in Google’s index.
If these sites were segmented into niches or facets of the industries they represent, with a number of sites dedicated to those facets, chances are they would perform much better on the search engines.
Another series of elements that can make a site not-ready-for-primetime is found in previous attempts at search engine or network marketing. A reality of web-based business is that a little information can be extremely dangerous if applied incorrectly. We often come across sites that have joined web-rings or link-exchanges, or have remnants of spammy SEO techniques left over from a previous run-in with less ethical SEOs. We tend to see these sites just after they have been delisted or have seen their rankings degrade over time.
A site redesign is a serious commitment. Once it is undertaken, a whole range of planning, copywriting and meetings are in order. This process is often good for an online business as it forces the business to focus on how it conducts business online, and how to make that business better.
Perhaps the truest measure of the need to redesign a website comes not from the needs of website marketers but from the experience of the website owners themselves. Is the site producing revenues or attracting business of some form or another? Is it capable of returning some if not all of the money invested in it? If not, the best search engine placements on the Internet are not going to be much use.
The need for search friendly design is obvious, the demand is real.
One of the most frequently asked questions readers and clients email to StepForth Placement’s SEO staff revolves around how websites can best be optimized to meet the algorithmic needs of each of the four major search engines: Google, Yahoo, MSN and Ask.
The more things change, the more they stay the same. Though there have been wide sweeping changes in the organic search engine landscape over the past six months, the fundamental ways search engines operate remains the same. Read more…
I am an SEO. As a search engine optimization specialist, I have spent the better part of the last decade studying search engines to better understand how they work, in order to act as a guiding consultant for paying clients. My clients, or more appropriately, my firm’s clients, are interested in having their web documents found on the first page of search results across all the major search engines. After spending years traveling trenches full of fiber, my colleagues and I have gotten very, very good at getting those first-page placements. If only SEO were really so simple. Read more…
A couple of days ago, I received a call from a west coast reader who works as a corporate recruiter. She had been asked to find an in-house SEO for one of her clients, a medium sized corporation. After recommending a number of SEO/SEM related forums and Ed Lewis’ SEO Consultants Directory, we started to talk about the cost/benefit of hiring an in-house SEO and outsourcing the work to a consultant. As our conversation moved from point to point, a number of issues surrounding hiring in-house SEO talent emerged.
Today the trend leans towards hiring in-house. A quick glance at employment websites such as Monster.com or Workopolis.ca shows a growing list of positions for SEOs who have two or more years of experience. The demand for experienced search marketers far outstrips the supply of really good practitioners, a situation evidenced by SEO salaries ranging from 30K at the low end to over 100K at the top. Read more…