In this episode, you will hear Mark Williams-Cook talking about:
Product review update: An overview of what is separating the winners from the losers
Google policy changes: PPC changes for e-book publishers
New Bing API: Discussing the new Bing API for URL updates
Google Indexing issue reports: Google testing a new button to report indexing problems
Episode 88 - Search with Candour
Search Engine Journal - Google product reviews
Google Shopping Ads ban digital books
Bing content changes
Search Engine Journal - reporting indexing issues
MC: Welcome to episode 110 of the Search with Candour podcast, recorded on Friday the 7th of May 2021. My name is Mark Williams-Cook, and today I'm going to talk to you about the Google Product Review Update, winners and losers. So that's actually the update we spoke about last week. I've seen a really great write-up describing what's going on there. We're going to talk about some Google policy updates around digital books and shopping ads. We're going to talk about new Google features around reporting issues with indexing, and not forgetting of course, Bing has some interesting changes with their API and some new things we can do.
Before we kick off, I want to tell you, this episode is sponsored by the brilliant Sitebulb. Sitebulb, if you haven't heard of it, where have you been? Maybe you're new to SEO. Maybe you're an SEO veteran that's very stuck in the tools that you use. If you haven't checked it out, it's a desktop-based SEO auditing tool for Windows and Mac. I've used it for many years now. We've used it in the agency. I guess I'll try and describe it if you've never used it before. It's a desktop auditing tool, like I said, that of course, like many auditing tools, will crawl your site with a selection of user agents.
What I think makes Sitebulb very special, among other things, is that they've got a really great way of working after they've done the crawl: rather than just giving you the raw data back about which links are broken and what status codes pages are returning, they actually do lots of checks for you that try to diagnose actual SEO issues and prioritise them for you.
Obviously, prioritisation is partly the job of the SEO in the context of everything you're doing, but especially for new SEOs it does a lot of the bulk work, if you like, and for experienced SEOs it's a really brilliant starting point. There are also other great features that it originally got famous for, I guess, around the ways that they visualise crawls of a site. Now, that's more than just eye candy. It can be a really good way to spot things such as clusters of pages that aren't indexable. And certainly I use those crawl visualisations to get an idea of the link structure of a site, which as we know is very important.
So what can you do about all this? You can go to sitebulb.com/swc. Because if you go to sitebulb.com/swc, you will get an exclusive 60-day trial for Search with Candour listeners, which means you can try out Sitebulb for two months with no credit card or anything required, and see if you like it. So there's no excuse not to. If you haven't checked it out, do so now.
Last week on the podcast I spoke briefly about the Google Product Review Update. This was the very specific update that Google launched, not a core update, that was specifically looking at changing the way they rank pages that are doing product reviews. And we actually went through the advice that Google gave with the update around the kinds of things that they expect to see in product reviews.
Now, before recording this podcast, I came across a really interesting article by Mordy Oberstein. You might recognise the name. Mordy was on the podcast a couple of dozen episodes ago, back in episode 88. Mordy's the Search Liaison for Wix, and we had a nice chat there around Wix, the SEO of that platform, the reputation Wix has, and what they're doing for SEO. So I'll link to that in the show notes at search.withcandour.co.uk, if you want to listen to that episode.
But Mordy had researched and written an article for Search Engine Journal, which again, I will link to, and he had done some analysis of the outcome of this Google Product Review Update amongst various review sites. Now, I'm just going to read out the methodology he used here, because it is different to a lot of "studies" that we see.
So Mordy says, "When analysing Google updates, I try to focus on the page level. I do value big data and it can certainly be helpful in understanding which niche an update impacted. However, I like to see what sort of content Google is now rewarding and which types of content are being devalued. To do that we look at relationships between pages during an update. That is, for a given keyword, are there pages that have an inverse relationship? If one page shoots up the SERP and another falls off, what's the fundamental difference between those two pages from a content perspective? If you look at enough keywords, it's sometimes possible to pull patterns out of these inverse page relationships. In this particular case, I went through a few hundred keywords looking for inverse page relationships. Out of those keywords I found roughly two dozen that showed a very clear and distinct inverse pattern between ranking pages. To be clear, what I'm about to share is based on my qualitative analysis, it is not a definitive study based on deep data."
And he goes on later to say, "However, with this particular update, the content patterns were pretty obvious, at least when compared to other updates." So I think there's a lot of value in this type of qualitative research, especially when lots of people do it, because you can really start to see these commonalities, and it's very hard with a big data approach to do much beyond seeing the trends, let alone understand why they're happening. Again, of course, as I said, I'll link to the full write-up here, but what Mordy's doing is actually going through these pages, reading them as a human, trying to understand them, and picking apart the differences.
And why I put this on the podcast and why I want to talk about it is, I just found some of his conclusions really interesting. So he goes through and he uses Rank Ranger to find these inverse relationships, which basically means he'll find a specific keyword where one site's rankings have dropped off and another website has essentially leaped in there. His first example is the search term "best built in microwave". And he's found a site, stjamesgate, that's dropped right off from a top-three ranking down to page two, and another site called BestReviews that's jumped from the bottom of page two straight up to number three.
And the interesting thing, when you go through Mordy's analysis, is that when he looks at the losing page, he immediately notes that the content isn't thin, if you like, or anything of the sort. So it's very much not what you'd expect with a Panda mindset, Panda being the now very old Google update that was targeting thin content. And he goes on to say this page that's lost the rankings has a nice write-up, and he's put some screenshots here; there's a lot of content there.
What he's highlighted there is, apart from this nice intro that they've given, the problem starts he thinks for them on the next section of the page, which is, essentially we go straight into what we would call affiliate marketing or sales language, so it starts to list products where you can get them on Amazon. And they're saying things like, "Achieve fast and delicious results with your recipes. Why not go the extra mile with this excellent built-in microwave oven? This microwave oven is a blast to use and will be an excellent addition to any kitchen."
And what he's noted between the two pages, and this is something he's picked up across these different sites, is that it's quite obvious the type of content that is losing is basically a sales pitch, and the type of content that seems to be winning is much more neutral and informational. And I think this just gave me a moment to reflect: obviously we've seen amazing things with text generation like the GPT-3 stuff, and certainly what Google's been doing with machine learning, and it got me thinking about how Google is classifying types of content on a granular level.
So even now with the available machine learning libraries, there are tutorials out there where you can download Amazon reviews, and you get the review text and how many stars that person gave the product they're reviewing. And you can train a machine learning model on that data so it can read an Amazon review and very accurately predict how many stars that person gave the product, just based on the language. And obviously this is a super basic, off-the-shelf model, not trained or run by an expert.
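To make that idea concrete, here's a minimal sketch of the technique those tutorials use: train a classifier on labelled review text, then predict a star rating for unseen text from the language alone. The training data below is entirely made up for illustration, and this uses a tiny hand-rolled Naive Bayes rather than a real ML library or a real Amazon dataset.

```python
# Minimal sketch: a tiny multinomial Naive Bayes classifier that predicts
# a star rating from review text. The reviews here are hypothetical toy
# data, not a real dataset.
import math
from collections import Counter, defaultdict

def tokenize(text):
    return text.lower().split()

def train(reviews):
    """reviews: list of (text, stars). Returns per-star word and review counts."""
    word_counts = defaultdict(Counter)   # stars -> word -> count
    star_counts = Counter()              # stars -> number of reviews
    for text, stars in reviews:
        star_counts[stars] += 1
        word_counts[stars].update(tokenize(text))
    return word_counts, star_counts

def predict(text, word_counts, star_counts):
    """Pick the star rating with the highest log posterior probability."""
    vocab = {w for counts in word_counts.values() for w in counts}
    total_reviews = sum(star_counts.values())
    best, best_score = None, float("-inf")
    for stars, n in star_counts.items():
        # log prior + log likelihood with add-one (Laplace) smoothing
        score = math.log(n / total_reviews)
        total_words = sum(word_counts[stars].values())
        for w in tokenize(text):
            score += math.log((word_counts[stars][w] + 1) / (total_words + len(vocab)))
        if score > best_score:
            best, best_score = stars, score
    return best

# Toy training data -- entirely made up for illustration
reviews = [
    ("excellent product i love it", 5),
    ("great microwave works perfectly", 5),
    ("terrible build quality broke quickly", 1),
    ("awful waste of money", 1),
]
wc, sc = train(reviews)
print(predict("love it excellent quality", wc, sc))
```

With a few lines of training data the model already associates words like "excellent" with high ratings and "awful" with low ones; scaled up to millions of real reviews, the same idea gets very accurate, which is the point being made about Google's content classification.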
So you can imagine how this is being applied. The dataset that Google has is pretty much the entire visible web, and they've got pretty smart people working on this content analysis. So I found this thought really interesting: the type of pages that were winning wasn't really down to the structure of the page or anything quite as basic as the content being thin or repeated. It actually came down to whether they're providing everything that we mentioned in the last episode, when Google was talking about what makes a good product review.
The only thing I found in Mordy's write-up that he named as a specific element that generally seems to help, if present, was a buyer's guide. He found that pages that included information such as a buyer's guide tend to rank well, and that was something Google did specifically mention when they were talking about product reviews. So I find this one of the more interesting Google updates as it is very specific, and it seems, at least from an analysis point of view, to be approaching a human level of trying to work out: is this content good or bad, or am I just being sold to? Because we've all experienced those sites where we look at them and they're just trying to push you to make that sale.
And I can see Mordy's frustration in this article when he's looked through these pages and it's not immediately obvious what the difference between them is. He's like, "Yeah, I ended up actually having to read the content on all of these sites," which I think is impressive from a ranking point of view. But my takeaway, and why I think it's important for SEO, is that we really, really need to be focusing on this. As I mentioned in the last episode, take that Google guidance and give it to the people doing your product reviews. This needs to be your basis for what you're doing; it's no longer enough to just have some content and then try to push the sale. As I said, I'll link to that write-up if you want to read it in full. I think it's really, really interesting.
What could be more exciting on a podcast than a Google policy update? This is happening on the 18th of May this year. I'll link to the news article about it on the Google Merchant Center Help website, and it's that digital books can no longer be advertised on shopping ads, so this is a PPC update.
What's changing? "Beginning on the 18th of May 2021, Google will no longer support the advertising of digital books globally on shopping ads. This means that all shopping ads for digital books will be disapproved at the offer level, including those running when the policy comes into effect. Only digital books will be disapproved. This includes PDFs, EPUB books, mobi" and a format whose name I assume is in Mandarin, which I can't read. "There'll be no change for physical books or audiobooks. If any physical books or audiobooks are disapproved for being incorrectly classified as digital books, please request a review in the Merchant Center. We don't expect this to be a common issue and we'll address it swiftly if it happens.
At present Google cannot provide the best user and publisher experience to meet the high standards for digital books in shopping ads. While we understand this negatively impacts those who advertise digital books on shopping, we believe that this is the right decision to protect users, publishers and the shopping ecosystem."
And then they say, "What's not changing. This update only applies to shopping ads for digital books. Buy on Google listings, which already prohibit the sale of digital products won't be affected. Listings for other types of books, like audiobooks or physical books won't change."
So I'm kind of behind this policy update, because from what I've seen, the standard of a lot of eBooks, to be honest, is quite low. I think even on digital book and audiobook platforms like Audible, when you search various subjects, you get traditionally published books, if you like, and then a lot of self-published eBooks where, let's say, there's a big range in quality. So I can understand why Google's doing this.
Why I wanted to bring it up is that I caught something very interesting about this, which is that the organic Google shopping listings will not be affected. If you remember, Google is rolling out free organic shopping listings, which means that if you have submitted a product feed to Google through Merchant Center, you don't always have to pay to get that inventory seen. Now I saw this question asked, and the answer was that this organic visibility won't be affected. So this means, if you are selling digital books, eBooks, whatever, and you have them in your product feed, I wouldn't necessarily remove them, because there's still a chance they're going to appear in organic shopping results. This change is specifically about shopping ads, which are the paid-for ads. I don't know if Google is going to tweak that later, but at the moment that's the situation, and I thought it was worth sharing.
I really love it when Bing does something, because it gives us a chance to cover them on the podcast as well, and it doesn't just become a 100% Google-centric podcast. So Bing has released a new API, which is quite interesting, around letting them know when content has changed. They've done a press release on their blog, which I will link to, and I'll just read you the highlights now. "At Bing, webmasters don't have to wait to get their content crawled and indexed. Bing offers webmasters the ability to tell Bing about the latest changes in their sites, such as providing latest added, updated, or deleted content and URLs. Bing already supports the ability for webmasters to notify Bing about URL changes by its Bing URL Submission API."
But now, and this is under beta launch, there's also the ability to notify Bing directly about URLs along with content changes via the Bing Content Submission API. So let's not get those two things confused. There are two separate things here. There is the Bing URL Submission API, which has been around a while, and the new thing is the Bing Content Submission API. This will not only help webmasters reach more relevant users on Bing, but it will also reduce Bingbot crawl load on their sites. I think that's quite interesting, because from the crawl analysis I've seen, Bing tends to be, at least in the Western hemisphere, one of the more hungry bots out there.
The blog post provides a generic overview along with step-by-step instructions on adopting it. We won't go through the step-by-step instructions here, but basically it is in beta, so you will need to fill out a form to apply for this API access. And what you can essentially do then is let Bing know whenever content on your site is added or updated.
So this is something that I can see SEO plugins, like maybe Yoast, or the one we've got for Statamic called Aardvark SEO, will be doing. So if someone goes into a previously published page or post and updates it, apart from doing all the usual API calls, things like clearing the cache for that page if you're using Cloudflare, we can then start using this API to let Bing know, "Hey, the content on this page has updated." And that's going to be, I think, quite interesting, because we certainly have seen over the last few years this definite set of tactics around keeping page URLs the same and improving and building on content over time, rather than publishing lots of different new URLs. And in specific instances we have seen that improving and refreshing pages of content has made Google rank them better. So I think that's particularly interesting. You can apply, as I said, for access to this, and hopefully they're going to roll it out for everyone soon.
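As a rough illustration of what such a plugin hook might do, here's a sketch using the existing Bing URL Submission API that the episode mentions has been around a while. The endpoint and JSON shape follow Bing's published documentation at the time, but treat them as assumptions and check the current docs; the API key, site, and URLs below are placeholders, and the newer Content Submission API (in beta) works along similar lines but also carries the page content itself.

```python
# Sketch: notify Bing about changed URLs via the URL Submission API.
# Endpoint and payload shape are assumptions based on Bing's docs at the
# time; the key and URLs are placeholders.
import json
import urllib.request

ENDPOINT = "https://ssl.bing.com/webmaster/api.svc/json/SubmitUrlBatch"

def build_submission(api_key, site_url, urls):
    """Build the request URL and JSON payload for a batch URL submission."""
    payload = {"siteUrl": site_url, "urlList": urls}
    return f"{ENDPOINT}?apikey={api_key}", payload

def submit(api_key, site_url, urls):
    """Send the submission. Needs a real API key from Bing Webmaster Tools."""
    request_url, payload = build_submission(api_key, site_url, urls)
    req = urllib.request.Request(
        request_url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json; charset=utf-8"},
    )
    with urllib.request.urlopen(req) as resp:  # network call
        return json.load(resp)

# Example: what a plugin might queue up after a post is updated
url, body = build_submission("YOUR_API_KEY", "https://example.com",
                             ["https://example.com/updated-post"])
print(body)
```

A CMS plugin would typically fire something like `submit()` from its "post updated" hook, alongside its existing cache-purge calls, which is exactly the workflow described above.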
This last one is an interesting one to finish on. So for long time listeners to Search with Candour, you will know that since we launched in 2019, we have reported, I think at least four or five times in 2019, in 2020, in 2021, about Google indexing issues. And these are major indexing issues, whereby lots of people are noticing them, and thus it becomes news and trickles down to smaller podcasts, even like this, for us to talk about. I'm sure there are lots of indexing speed bumps that we're never aware of as users that Google are detecting and fixing.
So I found this piece of news quite interesting, which is that Google is launching the ability to report indexing issues within Google Search Console. This is currently only going to be launched in the U.S., and it means that signed-in Search Console users will see a "Report an indexing issue" button under the Index Coverage report and URL Inspection tool articles in the Search Console Help Center.
So as I said, this is a pilot test, and it's going to be rolling out to everyone in the U.S. within a week. What will happen is, you click on this button to report an indexing issue, and you then have to pick between two ways to describe your issue: "My website or web pages are not indexed in Google Search," or secondly, "My website or web pages are indexed, but aren't ranking appropriately in search results." I'm trying not to laugh here, because I cannot imagine the volume of reports that Google is going to get from people saying that their web pages are not ranking appropriately in the search results. So much so that we've seen the team at Google, like Gary Illyes, tweet a few times things such as, "Just because Google crawls your pages, doesn't guarantee that they're going to get indexed. Google wants to see what it judges as a certain level of quality before it decides to include that page in its results."
Because I guess that they're getting a lot of feedback around, "Hey, Google crawled my site, but my pages aren't getting indexed, why not?" And I can see the frustration as well that they're also saying, "Hey, look, it doesn't matter how many times you request indexing of a page, if we don't want to index it, we're not going to index it. And just filling out the request in the index box and hitting the request index isn't going to change things or speed things up."
So there have definitely been people looking and being like, "Well, I've done this page, it's not indexed. I'll request indexing. Oh, it's still not working. I'll request it again." And I can only see that opening up feedback along the lines of "my pages are ranking, but not well enough". I can't see how this is going to work out well for Google, so I'm not really sure what they're trying to achieve here. The only thing I can think is that when they spoke about the indexing issues they did have, obviously there was a fairly dramatic impact. I think it was about half a percent of their index, I'd have to go back and check, that got dropped at one point, which is a mind-blowing number of pages to disappear from the web.
And however Google was internally tracking those index issues, obviously that managed to happen and get through without setting off enough big, loud, red alarms alerting people to the problem. And they were obviously getting feedback from people. So I could see that maybe this button is there to give them that earlier warning, and to give people a clear route, a plan B if you like, rather than moaning on Twitter or a forum about pages not being indexed or dropping out of the index.
So I'm sure they've changed things that trigger alerts on their end, or that backend stuff. But yeah, that button, I'm not sure how helpful it's going to be based on what sounds like the quality of feedback they're getting, but hey, what do I know?! It's going to be there. So if you've got indexing issues, you're in the U.S., click on it and you'll be able to do that. And if the test goes well, I don't know how they're measuring that, but it will probably be rolled out to everyone.
And that's all I've got time for this week. I'm going to be back on Monday, the 17th of May with another episode of Search with Candour, so do join in. As I mentioned last week, at Candour we've got a whole bunch of roles going at the moment for an SEO specialist, for a PPC specialist, for an Account Manager. So if you're interested in them, check out our website, you can just Google ‘Candour Agency’. Otherwise, I hope you all have a fantastic week.