Episode 87: GMB review bugs, May 2021 page experience and Black Friday optimisation


What's in this episode?

In this episode, you will hear Mark Williams-Cook talking about:

  • GMB review bugs: A bug affecting the display of Google My Business reviews

  • May 2021 page experience: Google gives dates and more information on their page experience as a ranking factor

  • Black Friday optimisation: Sharing Luke Carthy's tips for optimising your e-commerce site for Black Friday

Show notes

Google My Business review issue:

Page experience update

Black Friday tips by Luke Carthy

Episode 63: Core Web Vitals as ranking factors and Discovery Ads

A note on confusion between lab and field data, and how even browser plugins can skew lab tools.

Episode 56: e-commerce insights with Luke Carthy


MC: Welcome to episode 87 of the Search with Candour podcast, recorded on Friday the 13th of November 2020. My name is Mark Williams-Cook, and today I'm going to be talking to you about some of the bugs in Google My Business reviews. We're going to be talking about the Google page experience update. We've got some more information on that, when it's rolling out as a ranking signal next year, and some tips to make sure your ecommerce site is as good as it can be for Black Friday.

Before we get going, I would love to tell you this podcast is sponsored by the very kind people at Sitebulb. Sitebulb is a desktop based SEO auditing tool for Windows and Mac. If you've listened to this podcast before, you will know I like Sitebulb, I use Sitebulb, my agency uses Sitebulb, so it's not just something that they pay us to say nice things about them, because I have lots of nice things to say about them and so do other people.

This morning, just before we recorded this podcast, I asked on Twitter, "who uses Sitebulb, what's your favorite thing about it?", because I normally like to talk about one of their features. We got loads back saying the app has a nice interface, they're always adding cool new features, and people like the release notes. If you haven't seen Sitebulb release notes, go look at them, just trust me on that. Visualizations and schema validation, both things I've spoken about before in the podcast. Exporting straight to Google Sheets, that's the one we're going to talk about today. That's one thing Sitebulb does that I absolutely love: you can export the data you're getting from Sitebulb straight into Google Sheets.

Now, a lot of reporting, certainly what we do at our agency, is done through Google Data Studio, because only idiots waste time taking screenshots, building PowerPoints, and manually populating spreadsheets, wasting loads of client time on reporting rather than doing things that actually create value. Any kind of automation or time saving in reporting, when it comes to exporting and pulling together numbers, is really helpful. We do loads of stuff with Google Sheets now; I'll never be one to fully ditch Excel, it's really helpful for some of the larger data manipulation. But when it comes to using something like Google Data Studio, pulling together data from different sources and putting it into reports, Sheets is where the data needs to be. That's one amazing thing about Sitebulb: you can connect it and directly export this stuff to Google Sheets. So if you've got lists of, say, broken links, or things you need actioned within a team, you can just export them directly, make sure everyone's got access, and you've immediately got the data in the place you need it.

We've got a special Sitebulb deal for Search with Candour listeners: an extended 60-day trial, no credit card required. So give it a whirl if you haven't tried it before, no excuses, go have a look.

I kind of like it when things break, especially at Google. Not because I like it when things aren't working, but because it highlights how complex everything is, and how difficult it is, even for one of these huge companies with lots of very talented people working at it, to keep all their products working and functioning. It seems we are now at the end of the canonical and indexing issues; they all seem to be cleared up, and everyone I've seen who was posting screenshots of pages dropping out of the index seems to have had that corrected. However, I can bring you one new foible in Google's products, which is to do with Google My Business, and specifically Google My Business reviews.

Now, the review system on Google My Business has always interested me, because I've always noticed, personally, that there is a delay between when I know somebody's left a review, when I can see that review in the back end of Google My Business, when it's publicly published, and when the number of reviews and the average rating update. I find it interesting because it gives you an idea of what's going on under the hood, behind the curtain: these are obviously different systems, with an overall star rating and review count that falls into sync with how many reviews there are. So it's not just a couple of rows in a database, obviously, with so much data. But it appears there is an issue at the moment with Google reviews on Google My Business. This was blogged about on the GatherUp website. GatherUp itself is a customer experience and review product, and they've put up a blog post saying that, since Tuesday, so that's the 11th of November, there appears to have been an issue with users leaving new reviews and those reviews not being publicly viewable or accessible. Google has confirmed this is an issue, and they're hoping it's going to be resolved over the next few days.
GatherUp did some interesting data pulling to highlight this issue. They pulled together the number of new Google reviews being left daily for a large restaurant chain. This was averaging between 800 and 1,400 new reviews per day; on the 11th of November it dropped right down to what looks like around 50 new reviews, and then to none on the following day, the 12th. That goes to show there's quite a wide-scale issue with new reviews.

Again, like many of these Google issues affecting their products, there's not a lot you can directly do about it. It seems the reviews are going into the system, so if you've had these reviews you will be able to see them when you log in, but if you Google your company or brand name and bring up your company panel, they won't be listed there. So they are still there, you haven't lost the reviews, they should come back, and Google's hoping that fix is going to take a few days. Hopefully, as part of your process when you're serving customers, you've got a feedback loop where you're asking them to leave reviews; if you have seen a dip in the number of reviews appearing, that may well be why, so don't panic, it should resolve over the next few days.

We've got some more news, directly from Google, about their new way to measure page experience. Well, it's not quite new; we've talked about it before. It was first announced in May, when Google started talking about the Core Web Vitals. We covered it a couple of times: if you look at the show notes, you'll find a link to episode 63, where we talked through the Core Web Vitals in detail, what the specific metrics are, what those metrics mean, and how they're going to be used as ranking factors. So, if you still don't know much about Core Web Vitals, or you haven't got around to learning about them, that's the place you need to be.

In the original announcement back in May, Google said this was their new way of measuring page experience, or I should say additional metrics they're going to use to measure page experience, and they specifically said that the three new Core Web Vitals metrics weren't used directly in their ranking algorithm at the moment, but they were planning to use them in 2021. They very kindly said that before rolling out any such change to the algorithm they would give all of us webmasters at least six months' notice, and it would appear that notice is now, and they're giving us the minimum six months they said they would: there's a post on the Google Webmaster Central blog explaining that in May 2021 those Core Web Vitals are going to form part of the ranking algorithm. I'll just go through this post, and I'll link to it in the show notes.

They've done a post called "Timing for bringing page experience to Google Search", which opens: "This past May we announced that page experience signals would be included in Google Search ranking. These signals measure how users perceive the experience of interacting with a web page, and contribute to our ongoing work to ensure people get the most helpful and enjoyable experiences from the web."

An interesting point there is that they talk about how people perceive the experience. When SEOs and web performance experts are dealing with things like site speed, it is important to know, as many people before me have pointed out, that site speed does not necessarily equate to performance, and certainly not to experience. To give you the most basic example: you might have a page that takes, say, three seconds to load, where between zero and three seconds the user doesn't see anything, and then it all loads in suddenly at once at three seconds. The user might perceive that as a slower experience than a page that actually took four seconds to load, a third longer, but where after half a second some of the content, maybe the body text, started loading in, so they could start reading and getting their bearings as everything else loaded.

That's just one example of how the total time to load the page isn't necessarily as important as more nuanced ways of measuring experience, and that's one of the things these Core Web Vitals are tackling. Rather than just looking at page load time as a metric, they're looking at Largest Contentful Paint: how long does it take for the main part of the page content to load, not necessarily all of the bells and whistles that go with it. Google say: "These signals measure how users perceive the experience of interacting with a web page and contribute to our ongoing work. In the past several months, we've seen a median 70% increase in the number of users engaging with Lighthouse and PageSpeed Insights, and many site owners using Search Console's Core Web Vitals report to identify opportunities for improvement."

That's really interesting for me because that's a huge increase. Google is quite good at motivating webmasters to do what they want when it comes to development and performance. We certainly saw the same thing with HTTP and HTTPS: Google was one of the major drivers, although SEO is, in my opinion, a secondary consideration for why you'd want HTTPS. They certainly were a driving factor for web development and SEO teams to go back to stakeholders, the people who control the finances of these sites, and say, "well look, it's a small part, but it is a ranking factor now, so this could help us, among other things, to basically make more money."

So those changes were happening, and I think we might see the same thing here, although it's kind of on its head, because as anyone working on the web will know, if you improve page experience anyway you would expect better outcomes: higher conversion rates, better return from your ads. SEO is almost the secondary reason to be doing these things, well, it is the secondary reason in my opinion, but SEOs have, over the last decade, got quite a stake in lots of major sites, so adding this to their list of priorities has certainly helped over the last few years.

Google go on to say: "today we're announcing that the page experience signals in ranking will roll out in May 2021." This means that from May 2021 these additional Core Web Vitals are going to be used as signals for page experience. The existing signals, and they've got a great image showing this, which we'll put in the blog post, currently include things like "is the page mobile friendly?", "is it safe browsing?", HTTPS, and no intrusive interstitials; those are already signals Google takes into account for page experience, and they're adding in these metrics covering loading, interactivity and visual stability. That's Largest Contentful Paint, First Input Delay, and Cumulative Layout Shift, which is still my favorite, because despite being a veteran web user I still get caught out by sites where everything's kind of loaded in, I go to click on something, and then the page jumps around and I end up clicking, normally on an ad, or just going somewhere that I don't want to go.

There are other important things in this post, though, which we'll go through as well. The post goes on to say "the change for non-AMP content to become eligible to appear in the mobile Top Stories feature in Search" will also roll out in May 2021. That's really important: it means "any page that meets the Google News content policies will be eligible and we will prioritize pages with a great page experience, whether implemented using AMP or any other web technology, as we rank the results. In addition to the timing updates described above, we plan to test a visual indicator that highlights pages in search results that have a great page experience."

There's a lot to unpack there. Google has pushed hard on AMP for quite a while now; these Accelerated Mobile Pages, this cut-down, fast-to-load format, met, as we've previously discussed, with quite a lot of pushback from the web and SEO community, for all kinds of reasons, some of them good. What they're saying is that in May 2021, the mobile Top Stories carousel, when you're searching on a mobile, won't be restricted to Accelerated Mobile Pages. So Google's saying, now we've got these general metrics, and that's the important thing about these Core Web Vitals. I've seen various criticisms of why they're using those metrics, but when you actually try to think, yourself, of a metric you could use generically across any website to get an idea of the page experience, it becomes very difficult. The one I've seen people suggest is "well, you should look at the bounce rate on the site", and apart from a lot of confusion over what people believe bounce rate to be, and how Google would actually measure bounce rate, because it doesn't necessarily mean returning to the SERP, bounce rate isn't a good indicator of whether a page has a good experience. If you're searching for some specific piece of information, like how to change the batteries in a specific children's toy, and you Google that and go to a page with content that shows you exactly how to do that, and you do it, and you don't do anything else, that's a 100% bounce rate: the person hasn't done anything else on the site, but they've had the perfect user experience. They've Googled something, it's found the exact page for them, probably the right place in the content now, as we know, with content highlighting, they've gone there, they've done the thing. So what is the optimal bounce rate? There isn't a generic answer to that.
Using these Core Web Vitals gives them a generic measurement of "is the page experience good?", so they don't have to lean as heavily on "is the page AMP or not?"

So I think that's a really positive change for everyone; it certainly lowers the bar in terms of development costs and things like that for sites to appear in those results. The other thing they've mentioned there, as an additional note, is this visual indicator highlighting pages in search results that have a great page experience. That's something they're going to test, and I'll be really interested to see how users react to it, because it is annoying when Google does choose to rank a site that's obviously poor in terms of experience and speed, and you just end up abandoning it. I'd be very interested to see how these labels affect click-through rates in search, and therefore whether it's yet another reason you'll want to make these changes: not only improving conversion and efficiency on your site, but actually getting the traffic in the first place. Because if you're ranked number one, but the number two ranking has a label saying "this is a really good, fast website", how's that going to impact the paradigm of "well, I'm number one so I'll get more traffic"? The post goes on to describe a new way of highlighting great experiences in Google Search: "We believe that providing information about the quality of a web page's experience can be helpful to users in choosing the search result that they want to visit. On results, the snippet or image preview helps provide topical context for users to know what information a page can provide." Another reason, probably, why Google's so keen on overwriting meta descriptions. "Visual indicators on the results are another way to do the same, and we're working on one that identifies pages that have met all of the page experience criteria. We plan to test this soon and, if the testing is successful, it will launch in May 2021. We'll share more details on the progress of this in the coming months." The post then ends on the tools you need to improve page experience.
I'll just read this out and give a little more detail around it. "To get ready for these changes, we have released a variety of tools that publishers can use to start improving their page experience. The first step is doing a site-wide audit of your pages to see where there is room for improvement. Search Console's Core Web Vitals report gives you an overview of how your site is doing and a deep dive into issues. Once you've identified opportunities, PageSpeed Insights and Lighthouse can help you as you iterate on fixing any issues you've uncovered." They then point to a roundup of all the tools you need to get started, again, we'll link to that in the show notes. Then they go on to plug AMP some more, which I'm not going to read.

Now, that resource, as I said, I will link to in the show notes, and I think it's worth reviewing, just because I often see a lot of confusion with these performance measurement tools, both with SEOs and developers. Normally someone will run some kind of speed report and it'll get sent around, then someone will run another one and it will come back different, and someone else will have yet another one, different again, and it becomes very confusing as to how these pages are actually performing. The tools resource gives a really nice breakdown of this, and I haven't seen it documented this well from Google before now. It goes into very clear detail about the difference between things like lab and field results. Lab results are essentially when you're running the test locally yourself, and field results are things like the data Google gets from opted-in Chrome users. If you have a site, a domain, with a fair bit of traffic, you'll likely have access to field data, which can be really powerful, because this is what real users, on average, are experiencing on your site. If you're running lab tools, you tend to get different results even on the same machine if you run them three or four times. That's to be expected; sometimes servers respond a little slower, sometimes the internet's a bit busier. But there are other complications as well that sometimes get overlooked.

For instance, if you're running some of these performance tools from your browser, even having different Chrome extensions installed can affect how the performance report comes out, because if you've got Chrome extensions that are highlighting certain bits of a page, or blocking bits of a page, that's going to affect how long the page takes to fully load; there's processing happening at your client end, and that's going to be recorded by the tool. There's a certain hygiene factor you need to take into account when running these lab tests. You need to understand the pros, cons and differences: when you might use field data, when you might use what's called RUM, real user monitoring data, and when you might want to run one-off lab tests. That blog post is a really good breakdown of those different tools, when to use them, and the different types of data you've got, so I would very much encourage you to share it with developers as well, because developers will normally have their own way of measuring performance, sometimes using the Chrome developer tools, but there are lots of different ways they could be doing that and maybe getting more helpful insights.

Despite lockdowns, it's probably not escaped your notice that we are approaching peak season for e-commerce: Black Friday, Cyber Monday and Christmas are all coming up. I imagine if you're working in e-commerce in-house, or you're agency-side with e-commerce clients, it's something you've hopefully been planning for a while, probably while scratching your collective chins a little about how things are going to be different this year: Cyber Monday is normally the biggest digital trading day, Black Friday itself is obviously huge, and how are they going to play together now, with the high street a lot quieter this year, where is that money going to go, and what is it going to look like? A few months ago we had Luke Carthy on the Search with Candour podcast. It was episode 56, we'll link to it in the show notes, and Luke, who's an e-commerce specialist, gave us some really good insights at the start of lockdown, back in March/April, about what e-commerce sites can do to prepare for extended lockdowns and changing consumer behaviour, and what wins they could get on their sites. Luke, in his ever-helpful fashion, has done a post giving some of his tips for high-performing campaigns over Black Friday. Now, the post is a little longer than what I'm going to go through, so of course I'll link to it in the show notes, and I'd encourage you to read it if you're working in e-commerce, but I just wanted to read his top five takeaway tips for high-performing campaigns over Black Friday, because I think they're quite valuable if you haven't thought about them before, and there are some SEO tips in there as well.

Luke's first tip: "Use the same Black Friday URL every year. I've said this before and I'll say it again: when it comes to SEO and page equity, there's nothing that hurts organic visibility and rankings more than using a new URL every year, for example, using a new /blackfriday2020 URL for this year. Recycle the equity the page has built by using the same URL. To maximise ranking potential and precious authority even further, consider using the same URL for Cyber Monday too. If you've previously used different URLs, use one this year that's most likely to be timeless, so avoid adding dates and other time-sensitive data to the URL, and then 301 redirect all previous Black Friday URLs to your new one." Really good bit of advice there. Even when we have e-commerce or guide sites that have done a "best laptops of 2020", for instance, we've recommended they use a static URL they update every year. If there is content they still want to rank for, rather than making a new URL for the current year, the best thing to do is actually make a new URL for the old article: essentially, you update the current URL and relegate the older content onto a new URL. So it's still there; if there are, for instance, older product names and things you might still want to rank for in the future, they still exist on your site. They'll have less link equity, but that goes hand in hand with the lower search volume. As Luke says, keeping that same URL, with the same link equity and the same links, is really helpful.
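To make the 301 part of that tip concrete, here's a minimal sketch of the kind of redirect map you might keep at the application layer. The paths and the `resolve` helper are hypothetical examples; in practice you'd normally configure this in your CMS or web server rather than hand-rolling it:

```python
# Sketch of Luke's tip: permanently redirect old, dated Black Friday
# URLs to one evergreen URL so their link equity is consolidated.
# All paths below are made-up examples.

EVERGREEN_PATH = "/black-friday"

# Old, date-stamped URLs that have accumulated links over the years.
LEGACY_PATHS = {"/blackfriday2018", "/blackfriday2019", "/blackfriday2020"}

def resolve(path: str) -> tuple[int, str]:
    """Return (HTTP status, location) for a requested path.

    Legacy Black Friday URLs get a 301 (permanent) redirect, which is
    the signal search engines treat as "this page has moved for good".
    """
    if path in LEGACY_PATHS:
        return 301, EVERGREEN_PATH
    return 200, path
```

The key design point is using 301 rather than 302: a permanent redirect tells search engines to pass the old page's equity to the evergreen URL.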

Tip two: "Leverage retargeting effectively. Retarget those who have not just visited a Black Friday page on your site but have also engaged in a meaningful way too. This will reduce waste and help prevent watering down the performance of your campaign. Target those visitors who visited a minimum number of pages before exiting, added at least one item to the basket, or tried to use a discount code, for example. Equally, you can increase efficiency by cutting out customers who've already converted, unless you believe they're likely to do it again during your Black Friday deals." I think retargeting is super important for Black Friday, so even if you've got retargeting data on customers who checked out earlier in the year, if you've still got them cookied it can be a great way to pull them back in for your Black Friday offers. I'm a big fan of at least removing people for a few days after they've purchased. Retargeting is an interesting one because, when we talk to potential clients about it, everyone's like "oh yeah, I know what retargeting is, I bought this product and literally an hour later I'm seeing ads for it for the next week", and in my opinion that's retargeting that hasn't been done so well; it's so 'in your face' that it becomes a conscious thing. You want to get some cadence to retargeting so it appears at the right time, not so it's just stalking someone around the internet.
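Luke's engagement rules can be sketched as a simple filter. This is purely illustrative: the visitor records and field names are hypothetical, and real ad platforms express these rules in their own audience builders rather than in code you write yourself:

```python
# Rough sketch of the audience filtering Luke describes: keep visitors
# who engaged meaningfully, drop those who already converted.
# The visitor dicts and their keys are invented for illustration.

def build_retargeting_audience(visitors: list[dict], min_pages: int = 3) -> list[str]:
    """Return IDs of visitors worth retargeting.

    "Engaged" means: viewed at least `min_pages` pages, OR added an item
    to the basket, OR tried a discount code. Converted visitors are
    excluded, per Luke's efficiency tip.
    """
    audience = []
    for v in visitors:
        engaged = (
            v.get("pages_viewed", 0) >= min_pages
            or v.get("added_to_basket", False)
            or v.get("tried_discount_code", False)
        )
        if engaged and not v.get("converted", False):
            audience.append(v["id"])
    return audience
```

The `min_pages` threshold and the exclusion of converters are exactly the levers Luke mentions; tuning them is what separates useful retargeting from the "stalking" kind.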

Luke's third tip: "Make sure any discount codes are case insensitive. This one's really simple, and it's a huge rookie mistake, but so many online retailers fall victim to it. It's important that any discount codes you're using are not only case insensitive, so lowercase, mixed or uppercase entries are still valid, but also that discount codes are easy to remember. In most cases, around half of online shoppers in the B2C space are using mobile devices, so make a code that's easy to re-enter and doesn't cause frustration when fighting autocorrect. This will help improve conversion and lower checkout abandonment. Keep codes simple, short, and easy to remember, 'black30' or 'save50', for example, instead of 'bftvdeals2020'. Lastly, if you're including them in emails, make sure they're selectable as text and not just an image; not everyone wants to read your emails in HTML and, of course, discount codes in images can't be copied easily."
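Case-insensitive matching usually comes down to normalising the shopper's input before comparing it to a canonical form. A minimal sketch, using the example codes from the tip (the `is_valid_code` helper is hypothetical, not any particular platform's API):

```python
# Sketch of case-insensitive discount code matching: store codes in one
# canonical form (lowercase, no whitespace) and normalise whatever the
# shopper types before comparing.

VALID_CODES = {"black30", "save50"}  # canonical: lowercase, trimmed

def is_valid_code(entered: str) -> bool:
    """Accept 'BLACK30', ' Black30 ' or 'black30' as the same code."""
    return entered.strip().lower() in VALID_CODES
```

Stripping whitespace also absorbs the stray spaces mobile keyboards and autocorrect tend to add, which is exactly the frustration Luke is warning about.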

Tip four: "Have your own discount codes page to reduce affiliate commission payouts during the big event. It goes without saying that, for many retailers, product margins are eroded during one of the biggest sale events on the calendar, more so with Covid-19 in the mix too. To reduce the amount of margin you're hemorrhaging during this period, it can make a lot of sense to have your own page for valid discount codes. By having your own discount code page, you're reducing the amount of traffic going to affiliate websites like hotukdeals and myvouchercodes, which often take a cut of your sale whether the code works or not." I think this is a great bit of advice all year round. With lots of the e-commerce clients we work with, we suggest they have an active coupons page. You don't necessarily need to link to it internally if the codes are ones you ideally want specific email lists or customers to have but, as Luke rightly points out, if you have a coupon code box on your checkout, you're very likely to trigger people going away to Google what that coupon code is, and if they can't find a working code it can cause frustration and sometimes actually lose you the sale. By having a page yourself, if someone's Googling your brand name plus vouchers or coupons or whatever, you can rank and say, "look, these are the valid codes we've currently got running". If you don't have any codes at the moment you can literally say "we don't have any voucher codes working at the moment"; you could even remove the coupon code box from your checkout. That's quite an interesting thing I have seen happen because, as Luke says, a lot of these affiliate sites just bulk-list old codes; they want the click-through and the affiliate commission, and it doesn't really matter to them whether the code works or not.

Lastly, number five: "Consider extending your returns window to cover those Christmas gift purchases." It's a pretty good tip, whether it's e-commerce or not. Luke says: "It's no secret that many people shop and buy items on Black Friday to give as Christmas gifts, so why not remove a potential blocker for your customers by extending the returns window of eligible items until late January? That way your customers can confidently browse and buy your deals knowing, if there are any issues, they've got until after Christmas to return or exchange them." Another brilliant tip. Go read Luke's full blog post on Black Friday; he's got examples of how some of the bigger brands have prepared as well, so there's lots to learn from there.

That's everything we've got time for this week. I've got some really exciting guests coming up over the next few weeks. To get these people together in one place, I've got one recording scheduled for 11pm GMT, but it's going to be worth it. We're going to have some really, really great people on the podcast soon going through and giving their insights. Please do subscribe, check back in, and have a great week. We'll be back next Monday, which will be the 23rd of November.
