
After publishing last week’s article on “Yahoo Panama – Pros and Cons” I have had a few readers contact me with notable “Cons” they have experienced with the recent upgrade process. While I knew I had not covered every possible negative of Panama in my article, these items are certainly worth adding to the list.

Keyword-Specific URLs

Michael wrote noting that you can’t easily manage unique URLs per keyword. This is absolutely correct. While the ability to specify a unique URL on a per-keyword basis does exist, it would be quite a hassle to apply it to any large campaign.

Google has a very straightforward system for unique URLs. You can bring up a list of all the keywords along with a nice simple field to enter each URL, hit submit, and they are all done. With Yahoo, you must do this one keyword at a time.

To set a unique URL per keyword in Yahoo, here are the steps you will need to take (starting from your “Dashboard”):

  1. Click on the Campaign Name
  2. Click on the AdGroup
  3. Click on the specific keyword
  4. This will bring you to a page with specific stats on that keyword, including a chart. At this step, click the edit link near the top of the page beside “Custom URL”.
  5. In the pop-up window, enter the URL for the keyword and click submit.

After these steps have been completed, you will have to wait for an editorial review before the new URL is put to use.

While the feature is available, the process is very cumbersome. This is certainly a ‘Con’ that needs to be added to the list. Perhaps the ‘con’ should be broadened to cover all the changes that require far more steps than are really necessary; after all, there are quite a few of them. This seems to be a common issue with much of Yahoo Panama, and I anticipate these types of things will be much more streamlined in the future.

Brand Awareness Just Got More Expensive

The following comment was posted to our blog by ‘Paul’:

“On the CON list, you overlooked the adverse affect to brand advertisers who want to promote something unrelated to the search to create some buzz, awareness, or association with their brand (i.e. Jeep bidding on terms like “beetle” or “bug” to coincide with their ad campaign). The quality score, while beneficial to most, means those advertisers have to spend more to appear prominently where their ad is not relevant.”

I have to say I both agree and disagree with this. First, yes, this is certainly a big ‘con’ for any advertiser that fits into this category, no question about it. This has also been a reality for AdWords advertisers for some time now, and it makes sense that Yahoo would follow suit. If you want to bid on seemingly irrelevant phrases and have your ad appear, things just got much, much more expensive for you. That said, this is also a ‘pro’ in terms of relevant, topical advertisements.

By making irrelevant ads cost more, the system has the reverse effect on relevant ads: driving irrelevant ads lower in the results creates less competition for relevant ads, making the top ranking spots less expensive. Under the new algorithm Volkswagen could secure the top rank for “beetle” at a lower cost than Jeep, and would have the potential to outrank Jeep regardless of bid, strictly due to relevance.
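Neither Yahoo nor Google publishes its exact ranking formula, but a simplified, purely hypothetical calculation illustrates the effect (the dollar figures and quality scores below are invented for illustration). Suppose rank were determined by bid multiplied by a quality score between 0 and 1: Jeep bids $1.50 on “beetle” but, being off-topic, earns a quality score of 0.3, for an effective value of 0.45; Volkswagen bids only $1.00 with a quality score of 0.8, for a value of 0.80. Volkswagen outranks Jeep despite the lower bid, and Jeep would need to raise its bid to roughly $2.70 just to draw even.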

From the perspective of advertisers trying to build awareness by bidding on popular, yet irrelevant terms, this is certainly a ‘con’, but from the perspective of the majority of advertisers, I would have to slide this over to the ‘pro’ category.

Note: I also want to say that for these purposes “irrelevant” refers to a key phrase not directly related to the destination URL. I do understand that a phrase which appears irrelevant on the surface may, when target demographics are considered, make considerable sense.

Yahoo Not Prepared for the Upgrading of Very Large Accounts

I had an interesting letter from a Yahoo advertiser who had very big problems with the new Yahoo Upgrade.

With an annual advertising budget on Yahoo of between $150,000 and $250,000, he found the upgrade to be a complete nightmare.

Under the old system he was running hundreds of ads using thousands of keywords. Many of the keywords were geographic in nature and very specific to the ad copy and categories being used. After the upgrade was complete, the account was a total disaster. “Thousands of keywords and ads had been jumbled into completely nonsensical categories, all created by Yahoo.”

Not only were the keywords moved into inappropriate groups, but much of his ad language had also been altered. He estimated that this colossal rat’s nest would take upwards of 80 hours to correct. As a result he did what any level-headed advertiser would do, and called Yahoo.

While Yahoo certainly felt sorry for him and could sympathize, they did nothing to help solve the problem. Yahoo simply suggested that he do all the footwork himself to bring things back in order. As an advertiser with a monthly spend in excess of $15,000, he felt that was plenty of money for Yahoo to assign someone to sort this out for him. I for one agree completely.

After he started the process of re-organizing from scratch, ads which had previously been approved were suddenly being disapproved. He called Yahoo again. After being passed from one agent to another, he finally reached someone who told him the ads were in fact running (which they were not), and that no one could tell him what the problem was other than it being a “computer glitch.”

Ultimately, to make a long story short, Yahoo has lost one good and high spending advertiser.

Panama Browser and Validation Issues

Susan wrote in with a few ‘cons’ that may cause smaller-scale problems for some advertisers, but problems nevertheless.

The first is browser compatibility. Now, I don’t use any of these myself so I will have to take her word for it, but apparently Panama has some functionality issues when accessed with Safari and some older browsers, and is not at all compatible with Mac Classic browsers.

She also noted that some of the problems with Panama and access via older browsers are due to the HTML itself. A W3C validation check of the new sign-in page shows items that are amiss, the most obvious being the lack of a doctype declaration. While this is an issue common to an incredible number of websites out there, I am surprised to see it from a company such as Yahoo.
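For reference, a doctype declaration for a page of that era might look like the following (HTML 4.01 Transitional is shown purely as an example; the right choice depends on the markup the page actually uses):

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">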
After signing on and noticing errors, Susan went on to check her browser error logs and found the following:

https://login22.marketingsolutions.yahoo.com/adui/
signin/loadSignin.do?signt=true
HTML error (5/16): The DOCTYPE declaration is missing.
HTML error (231/12): Illegal character "/" in tag.
HTML error (237/25): Illegal character "/" in tag.
HTML error (243/146): Illegal character "/" in tag.
HTML error (249/73): Illegal character "/" in tag.
HTML error (323/11): Illegal character "/" in tag.
HTML error (345/15): Illegal character "/" in tag.
HTML error (352/16): Illegal character "/" in tag.

https://login22.marketingsolutions.yahoo.com/adui/

CSS Error (49/135): Unknown CSS property "zoom".
CSS Error (79/77): Unknown CSS property "zoom".
CSS Error (105/9): Unknown CSS property "opacity".
CSS Error (114/9): Unknown CSS property "opacity".

The server’s certificate chain is incomplete and the signers are not registered

“The certificate on the server has expired”

A script on this page failed to execute. This may keep this page from functioning properly. Statement on line 11632: Expression did not evaluate to a function object: 0.addevevtlistner

Now I have to be honest: this goes beyond my level of expertise. However, it is still surprising to me for a site with such a large budget, a big name, and many months of testing behind it. I would personally expect issues such as these to have been dealt with prior to launch. Perhaps solving these items will allow a wider array of browsers to function properly with Panama.

I am very curious to hear more comments on the new Yahoo Panama. Not only would I love to hear specific ‘cons’, I would also like to hear the ‘pros’. While I do not argue that this new system is without its flaws, I see it overall as a positive step. What are your opinions? Please email them to me at scott@stepforth.com. I would love to write another article on the many positive experiences encountered by our readers; after all, there are two sides to every story.

Yahoo has revamped its paid inclusion program, and it appears to be at the forefront of a push to revitalize this archaic submission format. More on the history of paid inclusion and my opinion of it later; for now, let us review the changes.

What Does it Cost to Submit to Yahoo?
The price to submit a URL is $49, which guarantees (for sites that are accepted):

  • Addition to a database of sites “that powers algorithmic search results for Yahoo! and other major web portals such as AltaVista and AlltheWeb.”
  • URLs are refreshed (essentially re-indexed) on a 7 business day rotation.
  • Access to a new personalized reporting centre that provides:
    - top 10 click-through information, itemized by the keyword clicked
    - trend charting which provides a visual reference for ups and downs in traffic
    - best-practices SEO information

Answer: Excellent question. The fact is there are a few ways to ensure you are chosen as the primary content provider. The best option is to require that everyone who syndicates your content provide an inline text link directly to your original posting of the article. For example, they would include something like this at the end of the article, wrapped in a link: “SEO Answers #15 by Ross Dunn”. This way everyone who reposts your article is sourcing your content as the original.
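As a hypothetical illustration (the URL below is a placeholder, not the article’s actual address), the attribution link at the end of a syndicated copy might look like this:

<a href="http://www.example.com/seo-answers-15.html">SEO Answers #15 by Ross Dunn</a>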

If you do not source the original release in this manner, the website that publishes your article and gets the most attention for it (via links or publicity measurable by Google) will have a better chance of being chosen as the original content provider.

In conclusion, unless you strictly enforce your ownership of the article, you may not get the final credit when Google is forced to guess who published it first.

For more info, here is an article on article syndication and content duplication that I recently wrote.

by Ross Dunn, CEO, StepForth Search Engine Placement Inc.
Original source article and permalink: SEO Answers #15

This SEO answer is a follow-up to a common question I received in response to my recent article “The Most Common Reason for Dropped Rankings: Duplication”.

To be as clear as possible with my answer I am going to break it down into two sections: one for those who syndicate their own content and the other for those who publish syndicated content.

Please keep in mind, however, that duplicate content is not an exact science or anything close to it. As I wrote this article, I often imagined exceptions where a penalty would or would not take place. That said, the following answers are based on what my experience dictates to be the most common scenarios.

Tuesday, February 6th, 2007

A Few Questions and Answers

Online Forums are always a great place to find bits and pieces of information. Below are variations of questions I have found posted in some popular SEO forums, along with my answers.

1 – Q. The last Google update saw a number of my site’s pages find their way into the supplemental index. Most of these pages have a near-duplicate version for “print this page” options. How do I fix this?

A. The answer is relatively straightforward, but it will most likely take some time to see any of these pages removed from the supplemental index. The best bet is to first block these printer friendly versions from the search engines. You really don’t want these pages ranking well in the first place, as they will likely have much if not all of your main site navigation removed, so having a visitor land on one of them would not be of much use anyway. There are a number of ways to block search engine spiders from viewing and indexing a page. Here are two of the most commonly used:


Robots Meta tag
Using the robots meta tag is very simple. To prevent search engine spiders from indexing a given page, add the following meta tag to the start of your <head> section:

<meta name="robots" content="noindex, nofollow">
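In context, the head section of a printer friendly page might then look something like this (the page title is just an illustration):

<head>
<title>Widget Catalogue - Printer Friendly</title>
<!-- the meta line below is what keeps this page out of the index -->
<meta name="robots" content="noindex, nofollow">
</head>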

Robots.txt file

You can also use the robots.txt file to block spiders. If your printer friendly pages are in a specific folder, you can use this code in your robots.txt file to block Googlebot:

User-Agent: Googlebot
Disallow: /printer-folder/
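If you would rather keep all compliant crawlers out, not just Googlebot, the wildcard user agent can be used instead (the folder name is a placeholder for wherever your printer friendly pages actually live):

# Block all compliant crawlers from the printer friendly folder
User-Agent: *
Disallow: /printer-folder/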

I also recommend adding the rel="nofollow" attribute to all links that point to the printer friendly versions. This will tell the spiders to ignore the page and the link, which will not only help to prevent the printer friendly page from being indexed, but will also slightly reduce the Page Rank leak. Even if you do use this method, I still highly recommend using one of the other two methods of blocking the spiders to ensure that these pages do not become indexed.
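For illustration, a link to a hypothetical printer friendly page with the attribute added would look like this:

<a href="/printer-folder/print-page.html" rel="nofollow">Print this page</a>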

Ultimately, assuming that the original HTML version of these pages has substantial original content, you will hopefully start to see the supplemental status stripped away. Blocking the spiders should also help prevent future new printer friendly pages from causing you more grief.

Taking these steps may help, but nothing is guaranteed. The reason pages become supplemental is essentially that you have other pages on your site that are better suited for the related rankings. If you have two pages about a specific topic, and page A is highly targeted while page B is only loosely targeted, then you stand the chance of page B becoming supplemental. Add original content, and work on increasing links to the page, to help with the supplemental issue.

2 – Q. Does Google Use WHOIS to help eliminate spam from those webmasters with dozens, and even hundreds of sites?

A. Google certainly has the ability to read through WHOIS records and flag multiple sites with the same owner. While it has yet to be proven 100% that Google uses WHOIS data to connect spam websites, it is certainly within the realm of possibility, and if they do not use it today, they likely will in the future.

It is also known that a site’s age can help in terms of rankings. Where does Google get this age? It could be either the day the site was first indexed or the registration date in the WHOIS data. The longer a site has been online, the better its chances of ranking successfully, at least assuming a number of other factors such as links, relevancy, etc., all hold true.

We have seen examples where registering a new domain for no less than 2 years can (sometimes) help reduce the time spent in the “sandbox”, as it signals to Google that the site is less likely to be spam. Keep in mind, of course, that a 2-year-plus registration is not enough on its own.
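As a quick illustration (this assumes a Unix-style system with a standard whois client installed; the field names and layout vary from registry to registry), you can look up a domain’s registration dates yourself:

# filter the whois output down to the creation and expiration lines (placeholder domain)
whois example.com | grep -i -E "creat|expir"

The lines returned typically include the creation and expiration dates, which give a rough sense of the domain’s age and how far out it has been registered.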

3. Q. How do I get my site indexed by Google, Yahoo, and MSN? Should I regularly submit my URL?

A. While the answer to this question is fairly simple, it is surprising how many do not have a clear answer. I see this and similar questions in the forums quite often and thought it was pertinent to mention it here.

First things first: do NOT regularly submit your site to the engines. When it comes down to it, this is something you will never need to do (nor should you ever pay anyone to do for you). There is only one instance where a submission to the major engines is okay, and that is after the official launch of a brand new site on a brand new domain.

Before you submit your site, check to make sure you are not already indexed. You may be surprised how quickly the major engines can find you. If you are not indexed, then one free site submission is fine. After you have made this submission, forget the option even exists, as you will never need to do it again.

A single submission will typically get the site indexed in time, but the best way to have your site not only indexed but also ranked is to work on your incoming links and to consider creating and submitting an XML sitemap. Google, Yahoo and MSN are good at finding sites, and they will index you on their own even if you only have a few inbound links.
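For reference, a minimal XML sitemap following the sitemaps.org protocol looks something like the following (the URL and date are placeholders); once saved at the root of the site, it can be submitted to the engines that support the protocol:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one url entry per page; placeholder address and date shown -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2007-02-06</lastmod>
  </url>
</urlset>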

It is my pleasure to introduce to you Scott Smith, partner at CopyWriting.Net and copywriting guru. Before you read any further, however, I want to give you a heads-up that this is not a normal article from StepForth. This is an unedited interview written in a very personal style. In other words, this is not the normal condensed knowledge that my staff and I try to put out every week. That said, I really wanted to introduce you to Scott because I feel his copywriting skills are top of the line, and either his services or his tips may help you shore up your bottom line. If you want to get in touch with Scott Smith, he is best contacted by email.

Hello all, I wanted to clear up a significant issue with my recent article “The Most Common Reason for Dropped Rankings: Duplication”. It was edited closely but apparently not quite closely enough. Please note the following change to a question within the article:

Repeatedly, my sales and consulting staff find themselves explaining that using duplicate content can and will negatively affect search engine rankings, and it is heartbreaking to see clients having to rebuild rankings due to such a simple mistake. As a result, I felt it was time to write this article and hopefully set straight the many website owners who have been misled.

Why write an entire article on something as simple as duplicate content? Probably because it is not as simple as it sounds, and many website owners find themselves in the grey area of duplication, where they don’t know for sure whether they are risking their rankings or not.

Wednesday, January 24th, 2007

Wikipedia Links Useless for SEO

As reported in Search Engine Journal, in an attempt to eliminate spamming of Wikipedia, effective immediately all outbound links from the internet giant will have the “nofollow” tag appended. The “nofollow” tag was introduced a while back as a way for webmasters to tell the major search engines to ignore a specific link. When Google sees this tag, the outbound link is passed over as if it were regular text.

What does this mean for site owners? If you have links pointing in from Wikipedia, they will be lost, at least in terms of helping with your SEO campaigns. Links come and go all the time, but losing a Wikipedia link is a big deal: it is a highly regarded site in the eyes of the search engines, and its credibility with Google means such a link carries significant ranking value. For small sites with few links and good rankings, the loss of a Wikipedia link could have a significant impact on rankings.

Internet marketing consultant and blogger Andy Beal is not going to take this sitting down and has launched a campaign in an attempt to reduce Wikipedia’s Page Rank to zero. He suggests that, to dispute the decision, all webmasters who have links directed at Wikipedia append the “nofollow” tag themselves to give Wikipedia a taste of its own medicine. Beal does go on to mention that his site does not have any incoming links from Wikipedia and that this campaign is based entirely on principle.
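For anyone inclined to follow Beal’s suggestion, appending the attribute to an existing outbound link is a one-line change; for example (the link target here is just an illustration):

<a href="http://en.wikipedia.org/wiki/Main_Page" rel="nofollow">Wikipedia</a>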

Wikipedia was made popular by the vast number of incoming links it has gained over the years, and if enough linking webmasters added the “nofollow” tag, its Page Rank would ultimately drop. Currently Wikipedia’s English home page has more than 1.5 million incoming links noted by Google. It would take an incredible feat to have its popularity decline as a result of “nofollow” tags, but it is still within the realm of possibility.

QUESTION: My client originally promoted a single .co.uk domain that he owned. Recently he purchased a .com and pointed that domain to his current website. Since this change we have noticed his “pages from the UK” content has been dropped from Google UK but the .com is performing well on Google.com under the client’s target keywords. What is going on? – G.S.V.

ANSWER: I see no sure answer to why this has happened without more information. First things first, the .com website will get attention from Google.com simply because non-regional TLDs are favoured at Google.com. Also, the fact that your client’s site got excellent rankings is a testament to the quality optimization of the site (even if you do not want these particular rankings), so kudos to you if you were the one who optimized it.

Understanding why the .co.uk dropped in the UK regional rankings seems the tough question. Here are some things to check on:

  1. Was the .com 301 redirected to the .co.uk? A 301 redirect effectively tells the search engines that they should pay attention to the destination domain (.co.uk) rather than the domain the spider originally entered at (.com). If you were to enable a 301 redirect now (see the sketch after this list) you might save yourself a lot of confusion and potential pain in the future, since this technique undeniably states which domain represents the flagship website and will limit duplicate content penalties.
  2. Did the .com have a prior history? Perhaps before it was bought, the .com had a significant number of backlinks or a history that outweighed the .co.uk domain. You see, I expect that when Google is presented with two domains pointing at the same content, it will choose to rank the domain with the most positive history. That is, of course, if no other directives have been stated (i.e. 301s). A way to see if the domain had a history before it was bought is to use the Wayback Machine and see if a prior site existed. Next, you should do a backlink check for the domain to see if there are any links that came with the ‘new’ domain.
  3. Is the website hosted in the USA or the UK? If the .co.uk and the .com are both hosted on an American server then achieving a ranking on google.com will be significantly easier than google.co.uk and vice versa. In other words, host in the UK and use a .co.uk domain if you want to be sure to have regional UK rankings.
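To sketch the first point: assuming the two domains share one Apache server with mod_rewrite available and a .htaccess file in use (the domain names below are placeholders), a site-wide 301 from the .com to the .co.uk could be as simple as:

# Send every request arriving on the .com host to the same path on the .co.uk
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.co.uk/$1 [R=301,L]

Any request on the .com host is then answered with a permanent redirect to the matching .co.uk URL, which tells the engines unambiguously which domain is the flagship.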

At the moment these are the most prominent possibilities that come to mind, but there are likely more. The fact is, if all else fails and everything appears normal, I find that issues like oddly missing rankings tend to fix themselves over time. I hope your outcome is extremely positive and I do hope you keep me up to date.

If anyone else has experienced this issue or has some educated feedback please post a comment within this posting on The SEO Blog.

PS. Here is a great forum thread at Search Engine Watch discussing Google.com vs. Google.co.uk rankings.
