In this episode, you will hear Mark Williams-Cook joined by Rob Lewis, who will be discussing the big match type changes happening to all Google Ads accounts, how the Google Ads algorithm has been slowly changing, what you can do to effectively manage accounts under these changes, future predictions for Google Ads, and Core Web Vitals report delays in Google Search Console.
Mark Williams-Cook tweet RE Google Search Console Core Web Vitals testing https://twitter.com/thetafferboy/status/1361742983884865539
MC: Welcome to episode 99 of the Search with Candour podcast recorded on Friday the 19th of February 2021. My name is Mark Williams-Cook. Today, I'm going to be joined by Rob Lewis, where we'll be discussing the changes to Google Ads match types and the impact that's going to have on managing Google Ads accounts, as well as some of Rob's predictions for the future. We'll also be talking about Google Search Console Core Web Vitals testing and some nuance there that you might not be aware of.
Before we kick-off, as always, I would like to tell you about Sitebulb, which is our very kind sponsor for this podcast. If you haven't heard about it before, Sitebulb is a desktop-based SEO auditing tool for Windows and Mac. It's something I've used personally and my agency has used for years now. It's an incredibly popular bit of software for good reason. It's central to a lot of the work we do, especially the more technical SEO work. Every week I talk about one of the features that I like because they're pretty good at bringing out new features so there's always something to talk about and just things they've been improving.
Today, because we're going to be talking about Core Web Vitals, I thought performance was a nice topic to pick, and that's something the Sitebulb auditing tool will cover for you as well. When it comes to auditing sites, one mistake I've seen in some audits, or at least something not done to the level I would like, is that when they cover performance they essentially just dump a screenshot of the Core Web Vitals results or some other speed tool, which basically says, "Yeah, you've scored badly. You need to fix that," and lists some very generic things that may or may not be causing that issue.
What I like about Sitebulb is it starts to give you more detailed insight about the things on your site that are affecting performance. I would caveat that with: there are very few SEOs, myself included, who can really go deep, deep into performance as a topic, because it's actually really complicated. It's almost a never-ending thing, covering loads of different specialisms, from knowledge about front end stuff and back end stuff to actual servers, server configuration and management. Performance is super complicated, so I always encourage SEOs to talk to developers about it, get their point of view and certainly be humble when you take these suggestions to them.
So the thing I like about Sitebulb is I ran an audit just before recording this podcast, and in the performance section, it pops up with the basic things that you'll find in online tools, like images that are too big or have been rescaled, and stuff that hasn't been minified, but it goes the extra step. It's given me feedback that the document object model, the DOM, is too wide, so that's not optimal for browsers. It even talked about the caching settings: the cache is set to private on some of the resources, so that's not going to be optimal. These kinds of nuggets of advice are really good places to start the conversation with developers.
So while I understand most of the stuff that it's fed back on, I still like to go to the developers and just say, "Hey, can we have a conversation about this? What does this mean? What do you think about it? Can we fix it?" I find that so much more helpful than some of the advice you get from the more generic online tools, which is easily discarded. So, that's just one of the things I love about Sitebulb. They've got a lovely deal for you listeners of this podcast, which is if you go to sitebulb.com/swc, you can get a 60-day free trial of Sitebulb, no credit card required, all of that lovely stuff. Just go try it out. Please do.
Core Web Vitals is something I want to start this podcast on. It's a really interesting topic. We're now approaching the deadline Google set, where they said it's going to become a ranking factor, and I'm seeing the expected conversations about how much of a ranking factor it will be. Some people think it's going to be significant; from what I can see, the majority think it's going to be quite a small impact. Certainly, I think a lot of people got burned by the "Mobilegeddon" update, where it was widely believed there was going to be a massive impact on sites that weren't mobile-friendly, and it didn't really turn out to be true. So I think SEOs are a lot more cautious now about crystal ball gazing.
There are a few things I'd like to talk about with Core Web Vitals, though, and that's how we measure them. If you want to know more about the actual metrics, go to the show notes at search.withcandour.co.uk, where you'll find links to our previous episodes that go into depth about the three Core Web Vitals metrics, why they were picked and a bit more about them. Generally, they were picked because they are quite generic metrics that apply across any website. But one of the details that people not so involved in SEO may not know is the different ways they're measured. There are two main ways. The first is what can be referred to as lab tests, which are one-off tests where you run a tool like Lighthouse, and you will get results that may differ if you run it again immediately, because they depend on your browser and your internet connection, since you're running that test locally.
And then there's the data you get within Google Search Console, which, if you click on the information arrow, it will tell you comes from what's called CrUX, the Chrome User Experience Report, which is essentially aggregated data from real Chrome users. So you're getting feedback on the speed they're actually experiencing. Why this is particularly important, as a side note, is that while there are benchmarks set for what counts as a good, medium or bad experience with these Core Web Vitals, they're not objective: just because you run a lab test and it measures them all as green doesn't mean that's what your users are experiencing. If all of your users are from a part of the world where internet connection speeds are on average much slower, they might get yellow or red results from the same test. So that data in Google Search Console is extra helpful, because it allows you to align your performance with your actual audience, and that goes into the whole area of internationalisation with websites, SEO and experience.
So internationalisation isn't just a language and a cultural thing; it is a technology thing as well. So if you had, for instance, three or four markets that your website appeals to, and one of those markets geographically has a much slower internet connection on average, you actually might want to make that version of your site a lot lighter so you score better. But the point, and what I wanted to talk about, was a question that I had to take to Google because I didn't know the answer. Someone approached me with their Core Web Vitals results in Search Console and asked, "Is the data we're seeing here, when it marks a URL as good or bad or whatever, 28 days old?" The reason they asked is that if you fix something in the Core Web Vitals and you want to validate that in Google Search Console, the validation button will basically say: we will start to monitor things from this point, and once we're happy after 28 days of collecting that data, we will mark this off as fixed. So their question was: if the data in Google Search Console is this field data, does that mean it takes 28 days to actually see those errors appear in Google Search Console? Because that would make sense, right? If it's taking that long to validate that the issue is fixed, surely it should take that long to actually detect it in the first place. So I brought this to Google, and John Mueller kindly answered, saying, "Yes, it's based on field data so it takes that long to populate the report. The populated data is then used for alerting. It's probably a good idea to automate monitoring with lab tests so you can catch issues/unexpected changes early." So that seemed like a pretty clear answer.
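As a side note on what that field data looks like: the same CrUX data behind the Search Console report is exposed via the Chrome UX Report API, where each metric comes back as a histogram of real-user experiences plus a 75th-percentile value. Here's a minimal sketch of reading such a record; the response is trimmed and the numbers are invented for illustration, not taken from a real site:

```python
# Trimmed CrUX-API-style record (the real API is a POST to
# chromeuxreport.googleapis.com/v1/records:queryRecord).
# All values below are made up for illustration.
record = {"record": {"metrics": {
    "largest_contentful_paint": {
        "histogram": [
            {"start": 0, "end": 2500, "density": 0.62},   # "good" bucket
            {"start": 2500, "end": 4000, "density": 0.28},  # "needs improvement"
            {"start": 4000, "density": 0.10},               # "poor"
        ],
        "percentiles": {"p75": 3050},
    },
}}}

def good_share(metric):
    """Fraction of real-user page loads falling in the 'good' (first) bucket."""
    return metric["histogram"][0]["density"]

lcp = record["record"]["metrics"]["largest_contentful_paint"]
print(f"good: {good_share(lcp):.0%}, p75: {lcp['percentiles']['p75']}ms")
# → good: 62%, p75: 3050ms
```

This is the sense in which field data is "aggregated": even if 62% of users see a good LCP, the 75th-percentile figure is what the report judges you on.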
So my understanding is that the reason for this delay is that Google wants to confirm it is actually an issue across a time range and across a spread of different users, because, as I said, it can be skewed by individual users. I imagine the confidence level and the amount of data they need will differ depending on the website. However, the point still stands that it's not instant, and therefore John's advice is good: it's probably a very good idea, if you don't already, to set up some kind of automated monitoring on your sites for Core Web Vitals. So if something does get changed and it trashes your score, and more importantly the user experience, you become aware of it as soon as possible and can fix things a lot quicker. I thought that was an interesting point. From the people I've talked to about it, it doesn't seem to have been widely thought about or discussed. So, I put that out there for you.
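To sketch what that automated lab-test monitoring might look like: the PageSpeed Insights v5 API will run a one-off Lighthouse lab test for a URL. The thresholds below follow Google's published "good" boundaries (with total blocking time as the usual lab stand-in for FID), but the sample response is trimmed and invented, so treat this as a starting point rather than a finished monitor:

```python
import json
import urllib.request

# Thresholds roughly matching Google's published "good" boundaries:
# LCP <= 2500ms, CLS <= 0.1, and TBT <= 300ms as a lab proxy for FID.
THRESHOLDS = {"largest-contentful-paint": 2500,
              "cumulative-layout-shift": 0.1,
              "total-blocking-time": 300}

PSI_ENDPOINT = ("https://www.googleapis.com/pagespeedonline/v5/"
                "runPagespeed?url={url}&category=performance")

def fetch_lab_report(url):
    """Run a one-off Lighthouse lab test via the PageSpeed Insights API."""
    with urllib.request.urlopen(PSI_ENDPOINT.format(url=url)) as resp:
        return json.load(resp)

def failing_audits(report):
    """Return the audits whose numeric value exceeds our threshold."""
    audits = report["lighthouseResult"]["audits"]
    failures = {}
    for audit_id, limit in THRESHOLDS.items():
        value = audits[audit_id]["numericValue"]
        if value > limit:
            failures[audit_id] = value
    return failures

# Offline example with a trimmed PSI-style response, so the logic is clear
# without making a live request:
sample = {"lighthouseResult": {"audits": {
    "largest-contentful-paint": {"numericValue": 3100.0},
    "cumulative-layout-shift": {"numericValue": 0.02},
    "total-blocking-time": {"numericValue": 180.0},
}}}
print(failing_audits(sample))  # only LCP is over its threshold
```

Run on a schedule (cron, CI, whatever you have) and alert on a non-empty result, and you catch the "something got changed and trashed the score" case John describes well before the 28-day field data catches up.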
This is something I actually wanted to talk about a week or two ago, but I needed Rob with me to do it properly, and that is the actually very significant update to Google Ads and how their matching works. Most people will be aware that when you set up Google Ads campaigns, there are various types of matching, phrase, broad, exact, that allow you to put your targeted keywords in and, based on a set of rules, will or won't show your ad. And that's really important, because given the diversity of how people search, it would be absolutely impossible to define every possible combination of words that people might type in. But this has been the core of how Google Ads, and previously AdWords, worked. Now, Rob, as I understand it, there have been some pretty major changes, right?
RL: So, I'll just read out a snippet from their announcement. They've said, "Starting in February 2021, phrase match will begin to incorporate behaviours of broad match modifier to simplify keywords and make it easier to reach relevant customers. With this change, both phrase and broad match modifier keywords will have the same matching behaviour and may show ads on searches that include the meaning of your keyword." They then go on to give some specific examples. And then later on in the update, they have confirmed that in July 2021, the ability to create broad match modified keywords will be stopped.
MC: So to me, that reads like a big update, right?
RL: Yeah. I mean, on the one hand, it is, but anyone that's been managing Google Ads accounts full time, in depth, will know that these fundamental changes have been causing all sorts of problems for at least a year. I would say for the last 18 months, they've been slowly broadening up the match type close variant algorithm, if you will. So I find the statement quite misleading, because it reads on the assumption that match types function in the same way they have done for years, which they haven't. So for those that don't fully understand the different match types, I might as well just go through the three main ones: exact, phrase and broad match.
So starting with exact match: exact match is quite misunderstood nowadays. It's assumed that if you use exact match, the search query will more or less exactly match the keyword you've input. There are some exceptions: the word order may change, or if the user's search means the same as your keyword but uses a synonym, exact match may allow that to come through. In short, people tend to use exact match to stop Google broadening up too much and to avoid close variants generating clicks and spend. Now, that's the understood definition of exact match, but actually, for the last 18 months, that's not been the case at all. If you read the fine print under exact match, Google says, "Ads may show on searches that are the same meaning or same intent as the keyword." So essentially, Google decides if the intent or meaning is the same, and that decision is removed from the user, or in most cases, from the advertiser.
MC: That's actually something that you pretty much predicted. I remember quite a long time ago when we spoke, you were talking about this shift away from advertiser-defined keywords to Google just deciding what the intent was and showing the ad regardless of what you're targeting keyword-wise, right?
RL: Yeah. It's funny that you said that, the prediction, because I actually have some bold predictions to make at the end of this podcast, which I'm-
MC: I can't wait.
RL: ... looking forward to vocalising. So exact match, I think people assume that the traffic is safe under an exact match, but it hasn't been, and I'll give some examples in a little while as to why. The next step up from that is phrase match. Historically, phrase match would work like this. I'll use red shoes as the example keyword here: if your keyword was red shoes as a phrase match, it would capture any search where red shoes was mentioned within the user's search query. So if someone searched, "Where can I buy some nice red shoes in Birmingham?" that would be captured under phrase match, because the term red shoes was used, regardless of what the user typed before or after.
Again, we're talking about how the match types worked several years ago, and it hasn't been like this, again, for at least 18 months. If you read the fine print here, it goes a step further than the exact match fine print. It says, "Ads may show on searches that include the meaning of your keyword." And this is the great bit: "The meaning of the keyword can be implied and user searches can be a more specific form of the meaning," which is a very dubious statement. But I suppose two years ago, they'd already started broadening up phrase match, and they made it so that the word order can differ. So if you put red shoes in as a phrase match, a search for "shoes that are red" would also be captured, for example. It's a close variant, but it's close enough, and I think advertisers can work with that. But as I keep saying, around 18 months ago, I don't know the exact date, the algorithm gradually started broadening up and broadening up on all of the various match types.
So I guess this brings me to broad match modified, which is what they're essentially going to remove, and which is what phrase match will turn into. First of all, I'll talk about what broad match modified used to be, because it was such a powerful match type. You'd have your keyword, which consists of multiple words, and if you put a plus sign before each word, you were essentially telling Google that the user's search had to include every word you'd put a plus sign on for your ad to show. It would include misspellings, and again, it would include some synonyms, but it was one of the most powerful keyword match types that pay-per-click managers would use, because it allowed you to anchor onto individual word combinations of your choice.
So if you had a really traffic-heavy campaign spending thousands and thousands per month, you could craft your own flows of traffic built around the user's intent and various subtle variations of their search. It allowed you to curate your entire flow of traffic into the campaigns, and where your spend would be directed, based on what I tend to call pivot words. There were certain search words I'd pivot on that dictated the performance: I might want to avoid certain pivot words, or I might want to focus spend on them, and the broad match modifier historically was perfect for that. But again, for at least 18 months, the broad match modifier has basically been working in the same way as standard broad match. So, let's not confuse broad match with broad match modified.
Broad match is basically the broadest match type you can have. So if red shoes, for example, was your keyword, broad match could pull in things like nice trainers or cheap shoes or plimsolls, things that you don't actually want to direct your users to. It makes it very difficult to direct users to the best landing page. If you're selling red shoes, you want to send them to the red shoes landing page, or a page where there are lots of red shoes. You don't want people who are looking for plimsolls or, I don't know, green ballet shoes to arrive on your red shoe page. So, this is really important in terms of giving control to the advertiser over where you ultimately decide to direct people. As I say, broad match, not broad match modified, was the broadest type.
Now, I've gone off on a bit of a tangent there, but to summarise: broad match modified has not worked like the broad match modified of the past for a long time, and it has worked in much the same way as broad match. So essentially, when Google say they're going to remove broad match modified and make phrase match work the same way, what they're actually saying is that phrase will become the broadest method of traffic generation, i.e. broad.
MC: Wow. So it's been quite a few years since I've personally managed a PPC campaign, and the nearest update I was aware of around this topic was a long time ago, when they changed exact match. I think they started calling it close variants, which is, I guess, the seed of these changes, when Google decided, "Hey, you guys are being stupid because you're missing out on loads of traffic by blocking it off with this exact match, so we're going to help you out and spread it." So I read this announcement and my assumption was that everything had pretty much stayed the same. It's actually quite a surprise to hear you say that we've been slowly moving towards this anyway. I guess Google has been experimenting, seeing what it does to their revenue, probably, seeing what complaints they get, and seeing if it works. Now that they've determined it's pretty much there, they're flicking the switch on it, it seems, right?
RL: Yeah. Yeah, absolutely. I think in a previous podcast, we discussed close variants and I gave an example, funnily enough, of an exact match keyword that was bringing in really broad traffic. The example was one of my clients, a niche training provider who provides training courses. In this case, they got hit with a few days' worth of traffic that had flowed through via the exact match keyword, where the search didn't even say courses. The keyword mentioned courses, training course, learning, online learning, but the search query completely omitted that and just went for the broad description of what they were teaching.
Now, it's really important as a pay-per-click manager that you only drive qualified traffic. I'm just using training as an example here. There are millions of different examples I could give. But if your client provides training courses, you want to qualify that user, you want to make sure they're in the market for a training course and that they are actively searching for a training course.
It's not like SEO, where it's free traffic and you can afford to raise awareness at the top part of the funnel. With Google Ads, you want to experiment with different parts of the funnel, but you ultimately want control over which step of the funnel they're in. Ultimately, the cheapest, best-converting keyword you're going to have is the one that describes the product and the need of that person, and that person wants a training course.
I'll give an example with a music training course. Let's say you offer guitar lessons online. You're going to want to bid on keywords like guitar lessons online, or guitar courses, or e-learning guitar courses. You're not going to want to bid on really broad keywords like music training, or even the broadest possible keywords, which Google will bring in nowadays on an exact match: music, new music, music ideas. That takes the control away from the advertiser and adds a massive black hole into their spend.
MC: I remember, I think it was from, again, previous episodes, one example that sticks out in my mind was when we were doing a campaign, I think for a cafe, and Google started targeting... Was it the word pub? Is that right? I'm pretty sure it was pub.
RL: It was pub.
MC: Obviously, they're kind of similar things, but in terms of somewhere you want to go, they're completely different. You wouldn't say, "Let's go to the cafe," and then go to the pub. Or maybe that way round is less of a disappointment than if you said you were going to the pub and ended up in the cafe. But that to me was a huge difference, and I saw that was on exact match. I don't hear this talked about a lot, which is this role, I guess, of PPC managers having to track the algorithm like SEOs do. Is that something you pay a lot of attention to?
RL: It is. It's not something I read up on, though; it's something that I experience. There have been lots of changes to the algorithm over the years. I think one of the most fundamental changes I experienced, which I don't see people discussing online, totally flipped everything on its head and changed my entire optimisation process. That change started to occur probably around four or five years ago. But back when I first started managing AdWords campaigns, around 10 years ago, the only way to explain it is that there is a constant flow of traffic going through Google search. Obviously there would be, because there's a constant flow of searches taking place. Now, what AdWords allowed you to do was capture that flow of traffic as it occurred, and you would use the match types to latch onto the type of traffic that you wanted.
In my mind, I always use the analogy of a flow of motorway traffic. There are different types of car, different types of traffic, and the match type and keyword you chose would allow you to pick the flow that you wanted and show a billboard to them as that flow went through the search, if that analogy makes sense. It makes sense in my head, anyway, but that's how I've always looked at it. And so, you could always show your ad when that traffic flowed through your keyword, as it happened, as long as your quality score was high enough and your ad was relevant enough. But what happened a few years back is that that changed: they made it so that for the keywords you added, there had to be a significant amount of search volume available for a keyword in order for it to go into the auction.
This was a bit of a problem for some of my old e-commerce clients I used to manage, where there was a technique we used: we'd export the entire product feed and convert the product titles into different pivots, different keywords based around the product title. We'd use broad match modifier on the individual words that built up and described each product, so you'd have this massive campaign built around the different variations of the product title. They would capture the traffic flows, and they would convert at a really low cost per conversion with really good ROI, because you were showing ads to users that were searching for specific product-related searches.
But then they changed that, and you could no longer capture and control the various flows of individual words. Instead, it's changed gradually over time. The biggest change to this algorithm, I'd say in the last year, is that now you put a keyword in, Google decides whether that keyword is relevant, and Google sends some traffic that it thinks is related to that keyword. No longer are you actually in control of the flow of traffic and when the ad shows; Google has control of that. It looks at the hints: at what the keyword you've input is, and how it thinks the user's search relates to that keyword, if that makes sense.
MC: Yes, got it. I guess we're building up to the point at which you need to give us some answers. Well, two things, I guess: how can PPC managers and business owners adapt to these changes? Because certainly, there's nothing we can do to fight them; we just have to play the hand we're dealt. And you said earlier, which I'm really interested in, that you've got some other bold future predictions. So, what are they?
RL: Well, first of all, in terms of how you can mitigate wasted spend and improve performance, I can only say what I've been doing and finding for the last year or so. There's less attention on my end on match types and keyword placement, and it's more about campaign sculpting, which is how I tend to refer to it. You look at the search queries that are coming through and you add carefully placed negatives to avoid certain flows of traffic, and to then direct traffic that converts lower, but that you still want, into another campaign. So you carefully place negatives in the campaigns that you want to separate from other flows of traffic. It's essentially the same game as match types and keyword placement, just the other way round: it's about blocking certain flows and redirecting them to other campaigns.
I find that's something pay-per-click managers have had to do with shopping campaigns for a long time, because you can't control them. You don't use keywords on shopping campaigns; you can only look at the search queries that come through and add negatives, but I'm finding I do that now with standard search campaigns too. This has become more difficult over the last few months, because the latest major update Google announced was that it was hiding a percentage of search query data from pay-per-click managers. So it might be that there's really, really broad traffic coming through on the campaigns that you don't want, but you just don't know, because it's being hidden. So the one thing I can suggest is to go back to basics and do some keyword research around your core keywords, and find out what keywords Google is suggesting are relevant. Anything that you think is irrelevant, export it, import it into a negative keyword list as an exact match, and make sure that Google can never waste your money on those keywords that it deems relevant but actually are not.
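That suggestion can be sketched roughly as follows: take a keyword-research export, keep the terms you actually want to bid on, and wrap everything else in brackets as exact-match negatives ready for a bulk import. The bracket syntax mirrors how exact match is usually written when bulk-adding negatives in the Google Ads interface; the export format, the terms and the `relevant` set here are all invented for illustration:

```python
import csv
import io

# Hypothetical keyword-research export: one suggested keyword per row.
# In practice this would be read from a real export file.
research_export = """keyword
guitar lessons online
music ideas
new music
e-learning guitar courses
music
"""

# Terms we actually want to bid on; anything else becomes a negative.
relevant = {"guitar lessons online", "e-learning guitar courses"}

def build_negative_list(export_text, relevant_terms):
    """Wrap every irrelevant suggestion in brackets as an exact-match negative."""
    reader = csv.DictReader(io.StringIO(export_text))
    return sorted(f"[{row['keyword']}]"
                  for row in reader
                  if row["keyword"] not in relevant_terms)

negatives = build_negative_list(research_export, relevant)
print("\n".join(negatives))
```

The point of the exercise is Rob's: you can't see all the queries any more, so you pre-emptively block the broad ones Google's own research tool tells you it considers related.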
MC: I think the other thing we mentioned when we spoke about that change Google made was even spending a bit more money in places like Bing, because you're still getting the keyword data from there, right? So you can sometimes see those additional negatives that you might want to add. But your analogy really makes sense to me. So this is basically sculpting, because you are getting it to the final form you want by removing things, rather than building it up exactly how you might want it.
RL: It's funny, actually, because when I first started as a pay-per-click executive years ago, my main job was adding negative keywords. And then when Google offered a bit more control with broad match modifier keywords, the need to add negatives dropped off for years. Of course, you always have to check the search queries and add negatives, but it's like I'm going back to my roots now, and a lot of my optimisation time is adding negatives, stopping Google from spending where I don't want it to. Shopping campaigns, actually, I should note, have become particularly broad. I've had cases recently where shopping ads are being clicked on when people have just done a Google search for a colour, say the colour red. They've shown an ad for, for example, a red jacket that I'm selling, and Google has decided, "They've said red, so let's show it."
So, I'm having to add exact match negatives for colours, numbers and shoe sizes for clients. I've got clients that sell footwear, and people were just typing in searches like, "What is a size nine in the USA?" That hasn't even mentioned the product or the specific type of footwear. It's gone really broad, so I'm having to create really in-depth negative lists around these things. So yeah, there's a lot of negative keyword adding to be done at the moment, and my main recommendation is to keep a careful eye on it. What I often do is look at the search term report for my phrase and exact match keywords, narrow down on the close variant search terms, and just have a look at how broad it's going. I think you'd be surprised what kind of broad close variant matching Google is applying at the moment.
I've got some predictions. My predictions for this year and beyond are to expect Google to start hiding more search query data. That's not a surprise; they've already started doing it, and they're probably going to do it even more. Dare I say it, maybe one day in the not too distant future, they may completely block the search query data and cite privacy reasons.
My second prediction is to expect Google to start ignoring negative keywords. I have a bit of a hunch that maybe they're already doing this on the hidden search queries, but how do you prove that? It's just a theory I have in mind and I'm monitoring it, but I don't know for certain. But I imagine at some point they will decide to ignore negatives, because negative keywords are our one major control at the moment for avoiding wasted spend, now that they've removed the match type approach.
I'm also expecting dwindling pay-per-click performance to become a main reason for Google to prompt you to move towards smart campaigns. If they're removing the controls you use to improve performance, their main pitch for moving you to smart campaigns is better performance, but the only reason smart and automated campaigns may perform better now is because they're removing manual control from us. The other closing thoughts I have are: don't neglect other channels. You've already mentioned Bing. I know Facebook is in a bit of a state of flux at the moment in terms of the iOS changes and other things that are going on, but don't neglect Facebook and Instagram, and explore other channels, even Pinterest. And also, listen to your pay-per-click manager, not what Google says, because ultimately your pay-per-click manager wants your campaigns to perform really well and to maintain a good, close relationship. Google just wants to take your money at the moment, it seems. So, those are my closing thoughts and predictions.
MC: Well, I have learned a lot there. So to sum up Rob's analogy: imagine your pay-per-click manager is an artist sculpting you this lovely model out of clay, and Google has come along, slapped all of the tools out of their hands, whacked their hand into it and mashed it around a bit. And then they say, "Well, look, they're not doing a very good job. Why don't you let us sculpt it for you?" That's pretty much the analogy for the removal of various tools, controls and data we're seeing with Google Ads.
RL: You said it so much more eloquently than I could have. It's a perfect analogy.
MC: But I think it's a really interesting point though about not many people discussing Google Ads algorithm updates, or it might be actually that I'm listening to the wrong people because as you know, I'm mainly in SEO-centric groups. So I might have a little ask around and see if that's something people are talking about. But thank you so much, Rob. I learned a lot from chatting to you about this.
RL: You're welcome.
MC: So we will be back in one week's time, of course, which will be March the 1st 2021, and it will be episode 100. So yeah, I feel like I should do something special for episode 100, but I have nothing planned yet. Maybe, I don't know, I'll do a special intro, a bit of music or something, or maybe something not quite as lame. I don't know. I'll have a think. But I hope you enjoyed the podcast and the insights, especially Rob's detailed insights on the changes in Google Ads. If you enjoyed it, of course, subscribe, all that lovely stuff, and I hope you have a lovely week.