In this episode, you will hear Mark Williams-Cook talking about:
December Broad Core Update: The facts of the update so far and some interesting EU guidance that may compel search engines to be more transparent about such updates
Brands in PAAs: Insight into how some brands are being negatively impacted by PAA questions in SERPs
Listener Q&A: Your questions on titles and h1s, search intent, and migrations answered!
New EU regulations compel more transparency for search rankings: https://searchengineland.com/new-eu-regulations-compel-more-transparency-for-search-rankings-344482
Brands in PAA results: https://twitter.com/jroakes/status/1336433928761118721/photo/1
MC: Welcome to episode 91 of the Search With Candour podcast, recorded on Friday the 11th of December 2020. My name is Mark Williams-Cook, and today I'm going to be talking to you about the Google December Core Update and some interesting EU guidelines that have been brought in around what search engines have to tell businesses. We're going to talk about an interesting thread on brands and brand perception in PAAs, that's People Also Ask results, which I'm always interested in, and we've got some listener Q and A we'll be going through.
Before we kick off, as usual I want to tell you that this podcast is kindly sponsored by the people at Sitebulb. Sitebulb is a desktop-based SEO auditing tool for Mac and Windows. Every episode I talk about some features that I like about Sitebulb, because we've used it in the agency for a long time now; I've used it for a long time. It's a brilliant bit of software, and I use it on basically every site I do SEO for. One of the first things I do is run a Sitebulb report to give myself an overview. I'll talk about a different feature on each episode, but they've made it nice and easy for me this time because they've just released, on December the 8th, version 4.6 of Sitebulb, and it's got a brilliant new feature, which is 'response versus render'. What this means is Sitebulb can now do a comparison of the raw HTML response you get versus the rendered response. Up until now this is something I've always done with plugins in Chrome, to quickly compare, side by side, the raw HTML versus the rendered document object model.
I thought we would kick off in reverse order today because why not? We're going to start with the user Q and A for the podcast. I've picked three questions that we've had that I'm going to answer. I picked them very quickly; I haven't actually sat down and planned out answers to these, I just think they're interesting questions. I just want to talk through them and talk through my thinking of them. If you think I'm wrong, or if I miss something, let me know, find me on Twitter or LinkedIn, and I'll certainly add anyone else's opinion to these answers.
The first question is from Manish Bhickta and he says “If the Google SERP starts showing a mix of results of commercial and information landing pages, what is the intent of that query... which page do we need to create for that? Also what's happening if Google Ads are showing at the top for that query?” I think this is a really good question because that's something, as SEOs, we tell lots of clients to focus on, and certainly in-house, I know people work on it, which is understanding the intent of a search as opposed to just thinking about the keywords. Part of the strategy that, hopefully you'll be forming as SEO teams and agencies and in-house and wherever you work, is Googling the type of things you want to rank for and seeing what type of results Google is giving.
An example I use a lot of the time is, with how-to queries, you'll very commonly get video results at the top because it's normally easier to watch someone do something than read about it and try and imagine it. That's the case where Google has decided that “okay, I think users want to see a video for this query”, it means that, if you write content with the hope of ranking for that query it's going to be unlikely that you're going to get the top position, because content isn't just about what you write, it's about how it's delivered as well.
This question about, what do we do when Google is showing a mix of, say, e-commerce and informational pages, is well actually we just need to accept that, just because there is one search term doesn't mean there is one intent behind it, because everybody is different and the same search term may mean different things to different people. Where we see this clear split, so Google might be showing half commercial transaction pages and half informational pages, it's because Google has realised, maybe from their click data etc, that the intent here is split, so some people are wanting further information and some people are wanting to buy. In answer to the question, “what should we be creating?” it would be, if you can, I would do both and maybe combine them onto a single page.
I actually had this discussion with a client a couple of weeks ago, when we were doing some SEO training, and they were asking me about how much content do they need on category pages for their e-commerce site? They said “okay, well we're under the impression we need to write two-three hundred words on this category page to get it to rank.” We had this discussion about it because I read the category text out to them and we came to the conclusion together that it was waffle, it wasn't actually any useful information. Then, when I started asking them “what questions do your customers actually ask you about these products?” then they came alive and they had lots of answers. We said, well actually why don't we use that content for the category page because it is genuinely helpful, it's answering people's questions that they ask about those products and we're not just writing content for the sake of it. I think there are some circumstances where you can mix that informational and commercial intent together.
There are a lot of user experience things to keep in mind as well: you want to keep an ecommerce site still looking like an ecommerce site, and you certainly don't want to hinder people's ability to browse and shop by leaning too heavily into content, and there are lots of user testing and workshop methods you can use to make sure you're doing the best possible job there. The simple answer is you can cover both bases. The only side note I would add is that, in some situations, I've seen that the type of informational sites that rank are very unbiased review sites with nothing to do with e-commerce. If that's the case, then I would take that into your planning as well and say, well look, Google has for whatever reason decided that only these kinds of sites, dedicated to writing information and reviews, are ranking here, so maybe we should focus on what we're good at, which is the commercial side and selling and supporting these products. There's actually another way you can get there, which is to outreach and get involved with those websites, get them to review your products so you can give your opinion, and actually help and get visibility that way. I hope that answers your question, Manish.
The second question we have is from Tommy McMaster, and it says "What are the best practices around maintaining SEO rankings during a site redesign or migration? Any useful plugins?" I assume you're asking about a WordPress site because you mentioned plugins. There's a huge answer to that question, because there's lots you can do with migrations. What you've touched on there, which is "during a site redesign or migration", is that there are different types, if you like, of migration. Lots of things that you change site-wide can affect rankings. The most obvious is a URL migration, when maybe you've rebuilt the site and existing content will appear on new URLs. You've also got domain migration, where the whole domain changes, and the design itself may or may not change at the same time.
A site redesign, certainly in terms of internal linking and architecture, if that's migrated and changed, can affect rankings. The design, as in how the actual content is laid out and displayed on the page, can affect rankings as well. Even things like infrastructure changes and server changes can have an impact. It's worth bearing all these things in mind because all of them have a part to play. It really comes down to weighing up the risk, in my opinion, and that's the risk of what you're potentially losing during these migrations. If you're working on a site that does millions of pounds of revenue every month through organic traffic, you need to be very careful about any changes you make. You probably don't want to do a domain migration, and a site redesign, and rewrite all the content at once, because if something doesn't go according to plan and goes badly, it's very difficult to then unpick which one of the things you changed caused those issues.
With larger sites, even for basic things like title tag changes when you're rebuilding the site, we've advised clients to work in phases. Let's migrate the URLs and then see how that goes; give it a few weeks to make sure our traffic is stable and everything's how we expected. Then we'll execute the changes to the titles, make sure that's fine and has the desired or expected outcome. Then do the next thing, etc, etc. So, the first piece of advice would be, if it's a really big site, to do it in stages.
Secondly, you'll hear different things from different people about what to expect in terms of traffic loss during a migration. If you're moving domain or moving URLs, even if you do everything correctly, I have seen sites temporarily lose traffic. If you're going to forecast that, I tend personally to do it on unbranded traffic because, unless you've done something horribly wrong, it's very unlikely you're going to lose rankings for your brand's name, assuming you haven't got an incredibly generic brand name; normally it's the unbranded type of traffic that's at risk. As a rule of thumb, my line in the sand is: if you're losing more than 20% of your organic traffic immediately after a migration, then something's gone very badly; and if the traffic loss lasts more than 12 weeks, that raises warning signals as well.
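That rule of thumb is easy to put into a quick check. This is just my own toy sketch, not a tool mentioned in the episode; the function name and thresholds simply encode the 20%-loss and 12-week figures described above.

```python
# Toy sketch of the migration rule of thumb: compare unbranded organic
# sessions before and after a migration and flag losses beyond the
# thresholds mentioned in the episode. All names/numbers are illustrative.

def migration_health(pre_sessions, post_sessions, weeks_since_migration,
                     loss_threshold=0.20, max_weeks=12):
    """Return a status string for unbranded organic traffic after a migration."""
    if pre_sessions == 0:
        return "no baseline"
    change = (post_sessions - pre_sessions) / pre_sessions
    if change < -loss_threshold:
        # More than 20% lost right away: something has gone very badly.
        return "investigate: loss beyond threshold"
    if change < 0 and weeks_since_migration > max_weeks:
        # A smaller dip is normal, but not one lasting more than 12 weeks.
        return "investigate: loss lasting too long"
    return "within expected range"
```

For example, a site that drops from 1,000 to 700 weekly unbranded sessions straight after migrating would be flagged immediately, while a 5% dip would only be flagged if it persisted past 12 weeks.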
I've been involved in over 100 migrations. I've seen sites lose about that much and then recover fully, I've seen sites have no real impact on traffic during a migration, and I've seen sites, within a couple of weeks, actually come out with big net positives. The biggest mistake I see on migrations, when they're handled internally, is people forgetting to redirect the non-canonical URLs. Those are URLs your site might have where, for instance, you've got really great links from external sites that have marketing parameters in the URL. If you just do your redirects based on the internal URLs, those old links, pointing at pages that might be contributing significantly to some of your rankings, will just 404 and stop feeding into your site. The same goes for bringing along any old domain migrations and everything that's happened in the past; it's always worth digging around in a tool like Majestic or Ahrefs to see what links exist and where they're from, and make sure you bring them with you. There are loads of guides online about site migrations, but there's definitely a planning and testing phase, and the redirects are obviously the main thing.
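To illustrate the non-canonical URL point, here is a minimal sketch (my own illustration, not a tool from the episode) of normalising incoming URLs before a redirect-map lookup, so old externally linked URLs carrying marketing parameters still match instead of falling through to a 404. The parameter list and redirect map are hypothetical examples.

```python
# Sketch: strip known tracking parameters so old parameterised links from
# external sites still hit the redirect map, rather than 404ing.
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "gclid", "fbclid"}

def canonicalise(url):
    """Drop tracking parameters and trailing slashes before lookup."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in TRACKING_PARAMS]
    path = parts.path.rstrip("/") or "/"
    return urlunsplit((parts.scheme, parts.netloc, path, urlencode(kept), ""))

REDIRECTS = {  # hypothetical old-URL -> new-URL map built for the migration
    "https://example.com/old-page": "https://example.com/new-page",
}

def resolve(url):
    # None means an inbound link that would 404 and needs a redirect adding.
    return REDIRECTS.get(canonicalise(url))
```

With this, a linked URL like `https://example.com/old-page/?utm_source=newsletter` still resolves to the new page even though only the clean internal URL is in the map.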
The last question is from Niall, and he asks "I noticed of late that some SEO tools, like the SEMrush site audit, whine about page titles and h1s being identical. Is there some specific issue with such an implementation?" So, is it a problem that page titles and h1s are identical? I think that's an interesting question; my immediate reaction is that I haven't heard of any specific issue with that implementation. I certainly don't think it's an issue in the sense of being a problem. I would say, if anything, it could possibly be an opportunity. We have to bear in mind that these auditing tools not only find problems with sites, they sometimes lump them together with opportunities, which I would class as two different things. An issue is "you are doing this wrong, it's not best practice, and it's actively harming you", whereas something like this is an opportunity: you're not doing anything wrong, and there's no reason you can't do it, but you might find it more beneficial to do it in a slightly different way.
Two interesting points related to this are, from the Search Off The Record podcast I was listening to a couple of weeks ago, they were talking about rendering and indexing, and one of the interesting points was around rendering the main content of a page and looking at things like the font size and the h1 size, and we've known this for a while, but it was I think sideways hinted at again that, for instance, if you had h1s and h2s on a page but you applied the styling so they were not differentiated from normal paragraph text, that Google might actually discount the importance of that h1 and those h2s. It's actually looking at how the user sees the page and obviously, if the text is bigger, whether it is an h1 or h2, it's hinting that that's maybe a more important kind of guide to what's on the page.
Now, the general practice I've always used with page titles and h1s is to treat them as an opportunity to use two similar phrases to say the same thing. I keep in mind that, with the page title, you're more constrained by the length that's visible in the SERP, so try to keep it short and snappy and roughly match what we think the user's going to be targeting, whereas the h1 is really, practically, the title for the user on the page. If you think about when you load a web page, it's very rare that you read the title in the browser tab at the top; normally that's heavily truncated anyway, because browser tabs are tiny and it's all the way up at the top, so the title for the user is effectively that h1 on the page. I tend to use that as a longer, more detailed version of the page title, which means we can target different keywords. So I don't think it's a specific issue. I would agree it's an opportunity and not an issue, so it might be worth looking at.
Breaking with the tradition of previous episodes, I am going to talk a little bit more about the Google Core Update. Normally I avoid talking about these, especially early on, because a lot of things get said that turn out not to be true. But there are some useful facts that have surfaced about the December Core Update that I think are worth sharing. One thing that lots of people observed is that we saw a huge change on the 4th of December, and in the subsequent days it seemed that the changes weren't so big. However, Google today, that's the 11th of December, have confirmed that those changes are still rolling out, the December Core Update hasn't finished being applied, and we actually saw more big shifts this morning as well. So it does seem to be coming in a couple of waves, but the fact is it's not finished rolling out yet.
Something interesting that wasn't on my radar: Glenn Gabe had mentioned this, and I actually found it because Lily Ray, who we had on the podcast a few weeks ago, had mentioned that Gabe had mentioned it, so I found out eventually. These core updates regularly impact Google Discover traffic as well, so it's worth checking your Google Discover traffic in Google Search Console. I don't have many clients with good amounts of Discover traffic, so it wasn't really on my radar; if it wasn't on yours either, it's something to look out for.
The third thing I think is worth mentioning is that this update was big, and by big I mean in terms of breadth and depth. By breadth, I mean that previous core updates have followed, I think, a pattern of mainly affecting certain sectors, whereas this update does appear to be more across the board. Whichever SERP monitoring tool you look at (something like SEMrush gives a 0-to-10 volatility score across different categories), there are twenty-odd categories and all of them are above 9 out of 10 on desktop, and mobile is very similar, with I think only one category, real estate, coming in at 8.7. Obviously those are compound metrics that SEMrush calculates on a 0-to-10 scale, but I think they're useful for comparison with previous updates.
We haven't seen that much movement across all sectors for quite a long time, so I think there is real breadth to this update. Secondly, there's the depth, and by depth I mean the actual impact. I've seen examples on Twitter of people complaining they've lost almost half their organic traffic, and we're talking about significant amounts here. On the other side of the coin, some people have had huge uplifts in visibility so far as well. To me, this update is interesting; I've noticed it on a few of our clients, on both sides of the coin. We've had clients gain and lose some traffic as well, nothing quite as extreme, fortunately, as other people in the community, but I think all of those things are worth considering in the context of this update. It is broad, and it does seem to be having a really big impact.
The only specific case I've seen called out was in the Sistrix write-up, where they specifically showed that dictionary and lexicon-type sites had lost a lot of visibility. This is something that had again slipped past me: in October, the Google Quality Rater Guidelines gave quite a specific view of what Google wanted search results to look like, and mentioned these dictionary-type sites not being quite as important as they used to be, because Google is very good with those instant answers, as we know now.
Out of the back end of this, as we get with every update, we had people complaining, to put not too fine a point on it, about how Google's guidance on these updates can be very vague. It's difficult, especially for small businesses, to understand why: they were building a business off the back of organic traffic, hiring people, buying stock, getting premises, reinvesting money into their business as you do, only to have the rug pulled out from under them very quickly and with no explanation of how to get those rankings back.
This brings me on to an article I found on Search Engine Land written by Greg Sterling, which is about some EU guidelines concerning search engine transparency. I had a read through this article and well done, Greg, for covering this because it did involve delving into some rather dense and heavy EU regulation documents. I'm really appreciative that someone else did that and gave a summary, which I'll share with you now.
What Greg's pointing out is, there were some articles that were published in 2019, from the European Parliament, that were around promoting fairness and transparency for business users of online intermediation services, intermediation services being things like marketplaces, like Amazon, and search engines, so those tools that connect us up with the businesses and services that we need. There's some new guidelines, that were published this year, that further bolster those original regulations. They seem to be quite specific around search engines, so Greg's kindly written up a plain language description of these, which I'll read out to you.
Firstly, these guidelines say they're not legally binding, which does leave me with the question of how Google will adhere to them, but they're designed to help facilitate compliance with Article 5, the 2019 regulation I just spoke about, which says in the relevant part: "Providers of online search engines shall set out the main parameters which individually, or collectively, are most significant in determining ranking, and the relative importance of those main parameters, by providing an easily and publicly available description, drafted in plain and intelligible language, on the online search engines of those providers. They shall keep that description up to date." These ranking factors can be presented in different places; the guidelines recommend a single touch point, for example in a user dashboard, that could reference or index all the relevant informational tools available to explain ranking transparency. Regardless, the information can't be buried in terms and conditions; it must be found in an easily accessible location on the online search engine's web page. This may be an area that does not require users to log in, or register, to be able to read the description. As indicated, the discussion of ranking parameters should be presented in plain and intelligible language, although in some cases it may be more for technical professional users. Ranking variables, but not algorithms, must be disclosed. Article 5 and the guidelines also say that search engines and marketplaces are not required to "disclose algorithms or any information that, with reasonable certainty, would result in the enabling of deception of consumers, or consumer harm through the manipulation of search results". Accordingly, they need to enumerate the key variables or considerations that determine rankings, but not the algorithms themselves.
However, search engines and marketplaces are obligated to "describe the relative importance of the main parameters". Some hypothetical ranking parameters provided by the EU include: page loading speed; security (e.g. HTTPS); images (e.g. type, number, quality); consumer reviews (e.g. number, rating and recency); trader-consumer interaction (e.g. answered queries, responsiveness); dispute settlement history (e.g. number of consumer complaints/solutions found); offline service quality indicators (e.g. hotel star rating, delivery performance); the degree to which places, brands, etcetera are familiar or known in society; data protection score (e.g. based on reviewing the privacy policies of apps by an app store); web accessibility; content quality; keyword tagging; title accuracy and relevance; and concise answers, for example as regards products or services offered, or in response to FAQs.
I think it's really interesting, though perhaps a little bit of a naive request, in that Google's answer can almost rightly be "it depends". They've certainly provided this information clearly for things like Google My Business, where there is literally a bullet-point list of things you can do to increase your visibility. The guidelines for organic search, as they have been for many years, are a lot more vague and bigger picture. We've certainly got a lot of clues from the Quality Rater Guidelines but, when it comes to specifically requesting not just the parameters but their weight in ranking, as far as I know a lot of the weights are calculated through machine learning algorithms anyway.
I was listening to Gary Illyes just a couple of days ago, talking about canonical signals and how it's very difficult to manually adjust the weight of one signal, because there are knock-on effects. If you think, okay, we should be listening to this signal more closely, so you manually turn that dial up, then by nature, when you make that signal more important, you make all the other signals less important, which can then have unforeseen impacts on ranking. So you go back and say, okay, it's because we're now not treating this other ranking signal as important enough, so we'll manually tune that one up. As you can probably see, you end up in an endless loop where you're just chasing one factor after another, or several at once. One of the solutions they've got is simply defining the intended end outcome and letting machines make those adjustments.
I don't know how much we're going to see come out of the back of this, but I do think it's really interesting that the EU is trying to get some transparency from search engines, and especially from other "search engines" like Amazon, at whose hands many businesses have had great success and suffered greatly as well. It remains to be seen what happens with that over the next year.
Finally, I'm going to talk about People Also Ask results, or PAA results. As you probably know, I'm especially interested in PAA results because we run the alsoasked.com tool, which maps out these results, and I find them incredibly interesting. There's a thread by JR Oakes on Twitter titled "negative brand impact of PAA results". I'll just read through this thread and give some commentary on it, because I do think it's really interesting; it's something I've spoken about before in terms of brands protecting themselves if they have People Also Ask results. He writes: "Google has been including more balanced questions into People Also Ask results; here is a selection where I think they go too far. This selection was found in seven searches, at random, for companies that came to mind." What he's done here is a branded search, and he's then highlighted one of the People Also Ask questions that Google is showing.
For g2.com, there's the question "Is G2 Crowd legit?" If you did a search for Apple, there are the questions "Why is Apple so bad?", "Is Apple really worth?" (oddly worded) and "Why is Apple so expensive?" For rei.com, there is "Is REI worth the price?" and "Is rei.com legit?", and for Pinterest, there is the question "Is Pinterest a safe site?" What he's pointing out in the second reply to his thread is that many of these are leading and loaded questions which can influence the user's perception of the brand. "I should mention that I altered the results to remove the ads in GMB for a clear review." Very honest of you to point that out, but the point stands: he is saying these questions change how you think about the brand when you Google it, because you probably weren't thinking these things when you did that original search.
He goes on to point out: "The issue is that many of these searches are done by a very small fraction of users, yet make visible to all the brand's prospects and customers the seed of doubt on some aspect of the company. You can have 30 searches influencing brand perception for millions of people. We know that results are not driven purely by search interest, based on several examples we have seen from major brands with similar results. In many cases the negative questions are asked only by a handful of users over a six-month period. Further, in most cases, tens or hundreds of other question intents have much more user interest in search." The next couple of replies are about how he looked at this more broadly: he got a list of Fortune 500 companies from 2019 and then ran sentiment analysis, using different models, on the PAA questions. With the first model, 63% of PAA questions for companies were overall negative; with the second model, on the same data set as far as I understand, 80% of the questions could be judged as having overall negative sentiment. That's pretty significant whichever set of results you take.
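JR Oakes used proper sentiment models for that analysis; as a purely illustrative stand-in for the idea, here is a toy, keyword-based sketch of scoring PAA questions as negative or not. The cue words and sample questions are my own illustration, not his data set or method.

```python
# Toy stand-in for sentiment scoring of PAA questions: flag a question as
# loaded/negative if it contains a doubt-raising cue word. Real analyses
# would use a trained sentiment model, not a word list like this.
NEGATIVE_CUES = {"bad", "scam", "legit", "safe", "kill", "expensive", "worth"}

def looks_negative(question):
    """Very rough check: does the question contain a doubt-raising cue?"""
    words = {w.strip("?.,'").lower() for w in question.split()}
    return bool(words & NEGATIVE_CUES)

paa_questions = [
    "Is G2 Crowd legit?",
    "Why is Apple so bad?",
    "What brands does PepsiCo own?",
]
# Share of questions flagged negative, mirroring the percentages he reported.
negative_share = sum(looks_negative(q) for q in paa_questions) / len(paa_questions)
```

Even a crude filter like this shows how a branded SERP's PAA box can be dominated by doubt-framed questions; the real models in the thread put that share at 63-80% for Fortune 500 brands.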
He goes on to say “here's a look at two separate brands with arguably the same product; one has a reflection of web content which reinforces, in my honest opinion, an unfair difference between the two products. They're both really not that great for you.” The phrases are this: “Why is Coca-Cola not good for you?” “What does a Coke do to your body?”, “Can Coca-Cola kill you?”, “What is Coca-Cola's net worth?”, “What brands does PepsiCo own?”, “Is Pepsi owned by PepsiCo?”, “Who owns Pepsi Cola now?”, and “Does Pepsi own KFC?”
His point here is Pepsi and Coca-Cola are arguably very similar drinks, and Coca-Cola's really leaning towards those very negative questions about ‘can this drink kill you?’ and Pepsi is more about the kind of ownership of that brand. I will link to the rest of this thread because he goes on to give some more analysis about the types of companies, their revenue, and some other examples which are too long just to read out. I think it's a really interesting thread to get into.
You can find this in the show notes at search.withcandour.co.uk. I'd recommend anyway, if you're doing SEO, that you follow JR Oakes; he's literally at JR Oakes, j-r-o-a-k-e-s, on Twitter. A really interesting person, always posting great stuff about SEO. If you want to learn more about PAA strategies, I'll also link in the show notes to some analysis that was done by SEMrush, which analysed, I think, a million keywords as its initial data set and also looked into brands in PAA results. Either way, if you are working in-house or agency side for a big brand, it would probably be good, if you haven't already, to review what's coming up in those results.
That's everything we've got time for in this episode. I will be back on Monday the 21st of December, and that will be our last episode of 2020. I'm hoping to get some guests on again, like we did last year, so we'll have three or four people and we'll have a chat about how things have gone in 2020, at least in terms of SEO, and what we think is going to be important for the new year. It's always fun to make some predictions. Then I imagine I will probably have two weeks off from the podcast, and be back maybe on the 11th of January with episode 93. I hope you'll subscribe and join us for the final show before Christmas, and I hope you all have a great week.