Summary of SEO 101 Episode 369

Ross and guest co-host Scott Van Achte discuss the latest global search market share statistics, Christmas ads, when to act on bad links, Google Maps user-profile spam reporting, and the significant havoc within Google’s local SEO results in November.

Lastly, they answer a few questions from listeners that were posted on the SEO 101 Podcast Facebook group.


Notes from SEO 101 Episode 369

Search Market Share – October 2019

Google: 75.49% (1 in 1.32)
Bing: 9.89% (1 in 10.11)
Baidu: 9.20% (1 in 10.87)
Yahoo: 2.82% (1 in 35.46)
Ask: 0.56% (1 in 178.57)
AOL: 0.05% (1 in 2000)
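The “1 in N” figures are simply the reciprocal of each percentage share; a quick sketch of the conversion, using the share values above:

```python
# Convert a percentage market share into the "1 in N" form used above.
shares = {
    "Google": 75.49,
    "Bing": 9.89,
    "Baidu": 9.20,
    "Yahoo": 2.82,
    "Ask": 0.56,
    "AOL": 0.05,
}

for engine, pct in shares.items():
    one_in = 100 / pct  # e.g. 75.49% -> roughly 1 search in 1.32
    print(f"{engine}: 1 in {one_in:.2f}")
```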

Google Removed Ads for “Christmas” – Search Engine Land
During the last week of November, Google removed suggestive/inappropriate ads that were being displayed for the search term “Christmas.” The results page now shows no ads at all, just the basic knowledge graph.


Local SEO News Segment (Weekly)

“Bedlam” Algorithm Chaos in November – Sterling Sky
The Local Search Forum had many members reporting eight-position drops in local rankings and few increases. Some are concerned that daily fluctuation is the new normal – that local rankings are now driven more by day-to-day signal variations.

Google Post – Beta Product Category – Local Search Forum
When you add a GMB Google Post about a product you can specify the name, and for beta users, you can now select a category. This is being seen across a number of accounts with beta access.

According to Joy Hawkins, there used to be product category descriptions, which are no longer shown, and all previous descriptions will be removed by January 1, 2020.

Google Maps Allows Users to Report Spammy Users – Local Search Forum
To do this, you will need to use your phone: tap the name of a reviewer on a location, which brings up their list of all past reviews. Tap the menu icon at the top right of their profile and you will see a “Report Profile” option, where you can select from several prescribed reasons.

Believe it or not, this incredibly basic addition is an improvement over the previous option, which was to go to each review and report it separately.


The Mueller Files

Our weekly section of commentary from Google’s John Mueller.

John Mueller Discusses When to Act on Links – Search Engine Roundtable

John Mueller talked about the November core updates in terms of links. Essentially, you shouldn’t assume that a large number of spammy links pointing to your site is being ignored; if you have a lot, clean them up or disavow them.


Questions from SEO 101 Listeners

Ross and Scott also welcome some SEO-related concerns from listeners. Please listen to the show to hear their detailed answers to these questions.

Q: “I’ve been doing SEO off and on for the last 20 years, and was curious if anyone has had any significant success with real estate clients. Ever since Google decided to nationalize a majority of the high-value keyword phrases, it’s been hard to crack the top 3. I’m determined to take Zillow and Trulia DOWN!” – Ben J.

“Have we had any significant success with real estate clients? Yes, but it’s not the norm. It is very difficult.” – Ross

Q: “Currently working with someone who has around 120 PDFs in the index. Most of these PDFs are press releases without any useful information for the end user (at least no valuable keywords are being targeted). I would like to de-index all of them.

Do you think this is a good idea, or should I do it step by step? Is there any advantage in indexing PDFs?” – Lukasz A.

“If they’re just press releases, that doesn’t mean they shouldn’t be indexed.” – Ross

“Another factor to look at is what percentage of your site’s content those PDFs make up.” – Scott
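For reference, if de-indexing is the route taken, one standard mechanism (not covered on the show, but documented by Google) is the `X-Robots-Tag: noindex` HTTP response header, since a PDF cannot carry a meta robots tag the way an HTML page can. A minimal Apache sketch, assuming `mod_headers` is enabled:

```apache
# Apache (.htaccess or vhost config): send a noindex header with every PDF,
# so crawlers drop these files from the index on their next fetch.
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>
```

Blocking the PDFs in robots.txt instead would not work here: a disallowed URL can’t be recrawled, so the header (or tag) that signals noindex would never be seen.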

Q: “Look at this wizardry… has anyone else seen any examples of results like this?” (referencing a “view in 3D” Rottweiler showing up for a search for Rottweilers) – Mark C.

“I have seen it elsewhere; I can’t remember what it was for.”  – Ross

Q: “One of my clients has very recently launched a new website using a separate web agency. They launched this new website yesterday, and I found out for the first time that the website is 100% JavaScript…

This has made it very difficult to crawl using SEO auditing platforms/software (Moz, SEMrush, Screaming Frog, etc.). I was wondering two things…

1) Are there any tools anyone can recommend to help audit JS websites?

2) Is there any benefit to having a JS website or in this day and age is it bad for SEO? I don’t see many of them around anymore personally.

Any help or advice would be great.” – Dale O.

“Give (Sitebulb) a try.” – Ross

“There is a way to crawl a JavaScript website using Screaming Frog.” – Scott
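As background on why conventional crawlers struggle here: a purely client-side site serves HTML whose body is essentially empty until JavaScript runs, so a fetch of the raw source sees almost no content. A rough, hypothetical check (function name and threshold are ours, not from any crawler) for whether a page likely depends on client-side rendering:

```python
import re

def looks_client_rendered(html: str) -> bool:
    """Heuristic: strip <script>/<style> blocks and all tags, then see
    whether any meaningful visible text remains in the raw HTML."""
    # Remove script and style blocks entirely (their contents are not visible text).
    stripped = re.sub(r"(?is)<(script|style)[^>]*>.*?</\1>", "", html)
    # Remove all remaining tags, leaving only text content.
    text = re.sub(r"(?s)<[^>]+>", "", stripped).strip()
    return len(text) < 50  # almost no visible text -> likely rendered by JS

# A typical single-page-app shell: nothing visible until main.js runs.
spa_shell = '<html><body><div id="app"></div><script src="main.js"></script></body></html>'
# A conventional server-rendered page with real content in the source.
server_page = "<html><body><h1>Rottweilers</h1><p>" + "Breed information. " * 10 + "</p></body></html>"

print(looks_client_rendered(spa_shell))   # True
print(looks_client_rendered(server_page)) # False
```

Tools that do handle such sites (Screaming Frog’s JavaScript rendering mode, for instance) work by executing the page in a headless browser and crawling the rendered DOM rather than the raw source.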


End of Show Notes

If you have any questions you would like to share with Ross and John, please feel free to post them on the SEO 101 Facebook Group. And, if you enjoy SEO 101 on WebmasterRadio.FM, please consider supporting the hosts with feedback on Apple Podcasts, Stitcher, or your favourite podcast stream.