Candour

The latest Google product review update, tabbed meta descriptions, SISTRIX’s new Live Data feature, updates for Google Tag Manager and RIP to the URL parameters tool

Or get it on:

What's in this episode?

In this episode Mark Williams-Cook and Jack Chambers discuss:

  • The latest Google product review update
  • Tabbed meta descriptions in SERPs
  • SISTRIX’s new live data feature
  • Important updates for Google Tag Manager
  • RIP to the URL parameters tool

Show notes and links

Transcript

Jack: Welcome to Episode 12 of Season Two of the Search With Candour Podcast, recorded on Wednesday the 30th of March 2022. My name is Jack Chambers, and I'm joined by the one and only Mark Williams-Cook. Today, we'll be talking about the latest product review update from Google, tabbed meta descriptions in search engine result pages, SISTRIX's new live data features, updates for Google Tag Manager, and the demise of the URL parameter tool.

Jack: Search With Candour is supported by SISTRIX, the SEO's toolbox. Go to sistrix.com/swc if you want to check out some of their excellent free tools, such as their SERP snippet validator, on-page analysis, hreflang validator, page speed comparison, and tracking your site's visibility index. That's sistrix.com/swc for free tools, and sistrix.com/blog for all of their regular blog posts. And we'll get into some of the new features for paid SISTRIX users later on in the show.

Mark: We've got a Google update.

Jack: Yay.

Mark: So this is one of the Google updates that I actually like.

Jack: Oh, okay.

Mark: So for older SEOs, SEOs that have been doing it a while, back when we used to have Google updates, Google used to be a bit more cool and tell us what the update was looking at and what it was focused on.

Jack: They would give you a bit of a heads up and a bit of a warning, right?

Mark: Yeah. And even if they didn't, they'd be like, "This one, we looked at links. This was a link spam one, or this was this." And nowadays, we generally get this line if they've done a core update, which does a bunch of stuff. I don't know if that's because maybe the systems, algorithms, whatever you want to call it, behind Google have changed so much that it's less "Okay, we need to adjust these levers and dials to make the search results better" and more "We got this feedback and the machine did a thing and we don't really know what the machine did, but it's testing better."

Jack: Skynet has taken over and it's in control now.

Mark: Yes. I don't know how much of that is genuinely the truth, because I guess they do have certain parts of the algorithm that are made with machine learning models, and they do some stuff based on the inputs and outputs that they want, so they can't necessarily explain exactly what changed. I think they probably could, but part of the issue, I think, is that obviously, as soon as they give the SEO community any type of hint as to what may have changed, the laser focus goes on that. And it's almost detrimental if we know that. But the reason I said I like this update is because there has been another Google update announced a week ago at the time we're recording this, so the 23rd of March, and it is the latest product review update. And I say latest because we had a specific product review update almost a year ago in May 2021, which we covered. We'll link to that in the show notes. So if you go to search.withcandour.co.uk, you'll see the transcription and show notes for this and every other episode. I think it was actually in Episode 109 that we talked about the update, and in 110 we went through some winners and losers of this product review update. Now, circling back around to why I like these updates: it's because we know roughly what they are about. And actually, this is one of the rare instances with these kinds of updates where Google has given us very specific, prescriptive advice about what it is looking for. It is very rare that you get that advice.

There's one pinned tweet now for core updates, which they just refer everyone to, which is like, "Yeah, we changed some stuff. You can't really do anything about it. So if you lost some rankings, it's probably not that you've done anything wrong; someone else has gained them and we just re-ranked." The kind of stuff that no client or boss ever wants to hear when they're like, "What are you going to do about it? And how are you going to fix this?" So I'll read out the helpful part of this post. Again, we'll link to the post, which is a bit longer. It's got the Google top and tails, but this is what I pulled away as the important bits.

They said: "Our first updates were designed to, among other things, help ensure reviews come from people who demonstrate expert knowledge and firsthand research about products. Today's update builds on this work to make sure that product reviews in search meet certain criteria, such as, one, include helpful in-depth details like the benefits or drawbacks of a certain item. Specifics on how a product performs or how the product differs from previous versions."

Number two: "Come from people who have actually used the products and show what the product is physically like or how it's used."

Three: "Include unique information beyond what the manufacturer provides, like visuals, audio, or links to other content." That's interesting. "Links to other content detailing the reviewer's experience."

Fourth, lastly on this list: "Cover comparable products or explain what sets a product apart from its competitors."

There's a whole bunch of things I think you can dig into there and think about why Google is looking at these things specifically.

Jack: I think some of them are more self-explanatory than others. A couple of things I'd like to pull out are the benefits or drawbacks of a certain item, and the fact that it's drawing on that honesty side of things. We've touched on this a lot with E-A-T and things like that: the trustworthiness that Google is now taking into account. You can't just say, we are the best because we are the best; this product is the best because it's better than all the others. Being honest and saying, actually this competitor is better than us in this way, but our product is better in this way, is a more correct way of doing it, and a better way of doing it for users and from a search perspective as well, it seems.

Mark: That's something we actually picked up in that Episode 110 I mentioned earlier. We picked up some winners and losers from a qualitative study Mordy Oberstein did, who's at Wix again. And one of the things he picked out was that the losers were the kind of affiliate pages that were just hyping up a product because you've got this collision of motivation if you like. So you've got the search intent, which is, I have whatever problem this is, and I need a product to solve that problem. And you've got the affiliate motivation of, I want you to buy the thing that I get paid to sell you. So again, that ties into what you said, I think, about the drawbacks. Though it does show a degree of honesty if you're saying, well, this is good for this, this isn't maybe as good. And you're giving this whole picture.

And it links in to the guidelines Google's provided here as well, in terms of covering comparable products. So that again is thinking about the intent: all the products have a purpose, and the intent is actually just to solve that purpose within some criteria, price, et cetera, or brands. And all of this comes down to probability; you are more likely to be the best answer if you've given that comparison.

They seem obvious, but when you think about it at an algorithmic level, of what Google's trying to work out, there are some really interesting things here, which I would look at if I'm producing a site with reviews or have reviews on my site. Actually putting together some specific guidelines, a checklist saying, have we covered this? Have we covered this? Because this is one of the rare instances where Google has given us a good list of the things it's looking for. So to me, if you would like to rank in search and you are not doing those things, that's pretty wild, because you've been told what to do, so just do it. And related to this, or following on from this, I saw Dr Marie Haynes did a little tweet thread, which was her very early results, again, winners and losers type things she's seen from this. So I'm just going to read this out. Again, I'll link to this in the show notes. Dr Marie Haynes says: “Several sites are seeing big changes across pages that recommend ‘best products.’ This site is an affiliate site. It ticks all the boxes in Google's recommendation, so it links to multiple vendors, has pros and cons, helpful user reviews, in-depth product info, firsthand use.”

And I read this and I was like, Oh yeah, that's maybe again, a linguistic thing that Google's picked up from a machine learning point of view of, Oh, if we're comparing lots of products or comparable products, it's very likely that the word best is going to be used in a summary. I then see this tweet and I'm like, Uh oh, because I know how SEOs are going to react. Like, We need the word 'best' on every review and every template, which I think misses the point. Again, Marie is saying, this is very early results. This is going to be, at best, a correlation, but it's interesting to see where these things are being picked up. She goes on saying: “I should add that links to multiple vendors is something Google recommends but was not a part of this update just yet. And it'll be part of a future one.”

And she's quoted here a couple of tweets by a chap called Alan Kent, who is an e-commerce advocate at Google. One of the things I found interesting that he pointed out, in a reply to a question from Lilly Ray, was that this algorithm update is just a boost for product reviews, and other pages would only be impacted if Google believed it was a product review. So it's not going to affect sites as a whole. It seems, as far as I can make out from that limited information, that it's a page-level update. There's a few more tweets in Marie's thread saying another affiliate site is also seeing increases for searches containing ‘best’. Content speaks of personal expertise: we tested these and analysed them for comfort, durability, strength, etc. And again, I think that's unfortunately probably going to be a trick some affiliate marketers are going to use in how they word these reviews, because lots of affiliate marketers never see the product. They just do the research. They're basically doing the legwork or the research for you and they're writing it up. If they start writing in this first person, we tested this, I did this, we compared this, I think that's the only way Google can realistically tell if that's been done, because they haven't got anything else; they're still going just on on-page information. Google doesn't have a way to verify until it gets to that creepy stage where it's like, well, we've identified you in the video here and we can see which product it is, and we know that's you because of the voiceprint.

Jack: "Have you scanned the thing with Google lens using your phone?"

Mark: Yeah. Exactly. I mean, they could. I wouldn't actually put it out of the realms of possibility that there might be some verification for something like this with products and with individuals. I've certainly, I have weirder ideas, but at the moment, it's just interesting to think about that one of the only ways that they can follow up on some of these statements is literally the wording you are using to say the same thing. Changing it to, we compared this, as opposed to these products are, can vary. It might affect actually what we've been seeing here. So I won't read out all of Marie's thread here. There's quite a lot for you to dig into there. The only other thing I wanted to pick up was, again, one more quote that she had taken from Alan Kent, which was where he said: “Product reviews on a merchant website would be impacted as well. This update is relevant to content we believe is a product review, not to product pages. When we believe the user wants a review and a page contains a review, we'll try and find the best review for that query.”

Jack: That makes sense.

Mark: Yeah, it does make sense, doesn't it? But this does say then, obviously, as a manufacturer, you could potentially lose rankings where you do have reviews of your own products to a site that is offering a review that's closer to everything on that list. Because it's very rare that a manufacturer is going to be like, Here are all of our competitive products. Here are the cons of our products. But that's what I like about this and what I find so fascinating: this is the core of what made Google, Google, which is that they stepped away from relying on on-page signals. Because before Google, to find which page was the best one for a query, everyone was really heavily reliant on just seeing what the page told us. And that's the equivalent of going into a shop and asking the shop owner, "Is this a decent shop?" They're always going to be like, "Yeah, yeah, it's great. Come on in." Whereas Google adopted this idea of leaning heavily on the link graph, where the web is a bit more "democratic." That's a bit like asking some randoms in the street, is this shop any good? And that's good for the user experience, and that's reflected in this product review thing, because you don't want to read product reviews from the manufacturer; they're going to tell you it's great.

Jack: Again, it's touching on honesty and that trustworthiness, right? If you know someone is making that product and reviewing it themselves like you said, Mark, they're going to be, Yeah, it's great. It's the best on the market, because of course they are. Who doesn't want to sell their own products? Whereas you have a slightly different intent from users who want to shop around, look at different reviews and stuff like that. So I think it is interesting. Like you said, you might see the product pages directly from the manufacturer being affected by it, if you put review at the end of your search, essentially, you're going to be looking at people who are comparing things across different companies, different manufacturers, different brands, different sites and all that kind of thing. So I think it'll be an interesting way of judging that from the intent of the user.

Mark: Oh no. Does this mean more influencers?

Jack: Yes.

Mark: Oh.

Jack: So we touched on some SERP features recently on an episode, and we've got another little one highlighted by Valerie Stimac on Twitter. And of course, previously mentioned Marie Haynes, previously mentioned Glenn Gabe and previously mentioned Lilly Ray have all been tagged in this tweet. And we're going to talk about some tabbed sub-meta descriptions on SERPs, which I think is a really interesting way of getting more information on the SERP. And I know we've talked about this in the studio before, Mark: Google trying to keep people on the search engine result page rather than actually clicking through to the page. And we'll get into that. There's some information from Glenn Gabe in a second about that as well.

But actually looking at how, instead of just essentially a few sentences, that typical meta description you've seen a million times before, this actually had tabbed-out different sub-topics that you could click through on the search engine result page without actually going through to the page itself on the site. And they use a true-crime podcast example here, because everybody loves a true-crime podcast these days. That's been the hot topic over the last few years. And Lilly Ray touches on something saying she's seen it before. And then Barry Schwartz, of course, you probably know him, jumps in and says it first came to Bing, actually. And we've touched on this a few times as well. Bing will often push out a cool new idea and test it a bit, and then Google follows as well.

Mark: So it is new…kind of?

Jack: Sort of, yeah. Exactly.

Mark: Wow. That's a win to get one past Barry where it's not new, because normally it's not new, so…it's new! Nice.

Jack: Yeah. But the exact quote from Barry is: “Both. First on Bing, then on Google.” So, yeah, it's on there. Going off from what Glenn Gabe touched on and expanded upon, O. Christine asked on Twitter: I'm wondering, if someone scrolls through this, does it count as a page view? So as I said, it does not take you through to the page of the actual search result. You are staying on the SERP. So how does that factor into people clicking through on your website and click-through rates and all that stuff? And Glenn Gabe clarifies something here in a reply to this. Glenn Gabe says: "Those would be tracked like any other jump links in the SERP. So the page receives an impression in search when it ranks as it normally would. That impression shows in Google Search Console. Google Analytics definitely would not trigger a page view since it has nothing to do with the page loading.” So there's a clarification for you, folks. It's not tracked in Google Analytics, because that's not how page views work, but it counts as an impression for that specific jump link on your page. So yeah, interesting stuff, that Google are doing slightly different things in the SERPs there as well.

Mark: Well, fair to say, an interesting thing that Bing are doing, and then Google's copying it and people actually noticed it on Google.

Jack: Fair, yes. Yeah. Just as is often the case.

Mark: Yeah. Was it last episode? There was the thing Bing did and Google immediately copied.

Jack: We've talked about it a couple of times already. Even in the 12 episodes I've done, we've covered it a couple of times.

Mark: I feel sorry for Bing when they do something cool like that and Google's like, "Yeah, we'll take it."

Jack: I think it's interesting, because Bing can get away with more stuff because, like you said, people don't notice. So I guess they have more room for testing and playing around, because it has a lesser impact on the wider search world and the SEO industry and people who are trying to rank and stuff like that. So they get to do the cool experimental stuff, and then Google is just like, "Yeah, we'll have that, please. Thank you very much. We'll steal your idea now."

Mark: Talking about people changing things, I'm just going to go off on a small tangent here about Google changing things and whether they actually care what anyone thinks. So we know that a long time ago, they got rid of organic keyword data so we couldn't see it, under the guise of privacy, but you could still see the keywords if you paid for them on Google Ads, because then privacy doesn't exist, because you paid for it. But obviously, now, in Google's steps to make everything more and more broad match, they've stopped showing a lot of search query data in Google Ads. We've discussed this before on the podcast. It just reminded me, because I saw a tweet yesterday, someone saying they'd spent, I think it was about $1,500 on Google Ads, and they only had visibility of the keywords for 10% of the money that they had spent, so about $150. For the rest of that money, they have no idea what keywords generated those clicks and which generated the conversions. They're like, I had some conversions, but I can't make any decisions.

Jack: Yeah.

Mark: Yeah. Anyway, now that's off my chest. I found this interesting because I've given up on meta descriptions a lot. So we've known for a long time, Google said that we don't use meta descriptions in ranking, which is fine. Still important, because they can drastically affect clickthrough rate and SEO is all about getting more organic traffic, so it's important we do that. But then Google has increasingly again just been rewriting meta descriptions.

Jack: Yeah. Obviously, we had the title apocalypse last year and they do similar things for meta descriptions as well.

Mark: But yeah, they're really aggressive with rewriting meta descriptions. Depending on which study you've looked at, 50-80% are just rewritten.

Jack: So it's 60% of titles and 80% of meta descriptions or something like that.

Mark: Yeah, right. So there's super high numbers now. So I've never thought it was a great way to spend your time. Certainly, when I've seen people sitting there with a spreadsheet of meta descriptions, I'm like, just don't even bother, because 80% of that is going to be completely useless. But if we're getting results like this, this is interesting, because actually, as a user, I like this. It's like that information scent thing: when you are on a website, say, you can use the menu to get a scent for what information is on that site and what they do. And you're getting that earlier here. So with this podcast, I can see what some of the titles of the episodes are, and without loading the site or going through that experience, I can get some idea of whether I'd like it. So it might be that in some instances where Google's showing these tabbed meta descriptions, it might be worth going back to tidying those up if you can do a better job. It may be that Google's just going to be generating all of this and being like, "No, I got it. Don't you worry about it," which is fine, because it's a boring job anyway, as far as SEO jobs go, the meta description writing.

Jack: So at this point in the show, we're going to dive into some information from SISTRIX, and we're actually going to have a bit of a feature update from SISTRIX. I know these guys have been working very hard over the last few weeks getting the live data update out for SISTRIX users. I'm going to basically dive into the blog post. Of course, as always, there is a link to it in the show notes at search.withcandour.co.uk, and you can go read the full breakdown of, basically, the changelog and all the updates from SISTRIX for live data and newer features, such as the search intent, traffic prediction, and global keywords, which are now available in the main overview for your domain.

Here is the update from SISTRIX. SISTRIX are now in the process of rolling out the new live data options, covering lists and analyses. This update includes the visibility index rank, intent breakdown, and global data views. Some user accounts have already been switched over. You can, however, activate the feature yourself if it is available in SISTRIX Labs. We mentioned SISTRIX Labs last week; that is basically the little testing ground for new features for certain users. So if you have access to SISTRIX Labs, you can actually activate this feature manually if you choose to. When the feature is activated, many of the results tables will be based on the new live database.

You can activate a date column in the keyword table, for example, order the list by the most recent date, and then see immediately when rankings were last updated. So this is, essentially, up-to-the-minute live data coming from their keyword rankings, and all this stuff feeds into their visibility index as well, which is a key part of that. And we'll dive into the visibility index rank here as well. The question is, what position does a domain have in the visibility ranking?

So one of the new innovations that comes in with the activation of this live index is the display of the VI (visibility index) rank of each domain. So you can actually compare and control it yourself across different verticals and across different visible domains for your country. To determine this value, we sort all domains with a VI value in descending order and update this on a daily basis. So you're getting day-to-day information to give you a feeling of what VI values can be expected in a range of positions. And they give a few examples here.

So VI rank one, with a visibility index of 17264.81, is of course Google itself, where all the organic features are taken into account. Then going to VI rank 10, you're looking at something like fandom.com, which has a VI of 576. So quite a big drop-off there. We started at 17,000; we're now in the 500s for fandom.com. And then going to rank 100, we're now in double digits, down to 59.62 for europa.eu.
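As a rough illustration, the ranking SISTRIX describes, sort all domains by VI in descending order and update daily, boils down to something like this sketch. This is not SISTRIX's code; the domain list is just the three examples quoted above:

```javascript
// Hypothetical sketch: sort every domain by VI value in descending order;
// a domain's rank is simply its position in that sorted list.
const domains = [
  { domain: "europa.eu", vi: 59.62 },
  { domain: "google.com", vi: 17264.81 },
  { domain: "fandom.com", vi: 576.0 },
];

function viRanks(entries) {
  return [...entries]
    .sort((a, b) => b.vi - a.vi) // descending by visibility index
    .map((entry, index) => ({ ...entry, rank: index + 1 }));
}

for (const { rank, domain, vi } of viRanks(domains)) {
  console.log(`#${rank} ${domain} (VI ${vi})`);
}
```

In the real index this would of course run over every domain SISTRIX tracks, which is why the daily refresh matters.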

Then moving on to the search intent, traffic prediction and global keywords side of things, these are now available in the domain overview, and this is a new feature. So you can look at the global figures for organic keyword rankings and data for all of the countries that SISTRIX has analysed. You can actually switch between them and look at where you're ranking in Germany, where you're ranking in the US, where you're ranking in the UK, all that stuff, and cover a lot of different things. You get an instant overview of the main search intent of the ranking keywords. Is it a 'know'? Is it a 'do'? All that kind of stuff.

I know we've touched on search intent a few times with a few other tools, and now we have this in the live data and global keywords from SISTRIX. Traffic estimates also show the average organic visitors per month that can be achieved from the keyword rankings. And for the organic traffic value, SISTRIX essentially equates it to how much it would cost to buy those visitors through ads, so equating the SEO side of things to the PPC side of things. You can also see whether there is desktop, mobile or tablet traffic on the domain as well. And you can get more details, as I said, if you go to the link in the show notes at search.withcandour.co.uk. If you are a SISTRIX user or you want to try some of this stuff out, you can go there, click on the link, and read SISTRIX's full post.
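SISTRIX doesn't publish its exact formula, but the general idea of an "organic traffic value", pricing organic clicks at what the equivalent paid clicks would cost, can be sketched as estimated monthly clicks per keyword multiplied by that keyword's cost-per-click, summed across all rankings. The keywords, click counts and CPCs below are invented for illustration:

```javascript
// Illustrative only: value organic traffic at what the same clicks
// would cost as ads. All figures below are made-up examples.
const keywordRankings = [
  { keyword: "running shoes", estMonthlyClicks: 1200, cpc: 0.75 },
  { keyword: "trail shoes", estMonthlyClicks: 400, cpc: 1.25 },
];

function organicTrafficValue(rankings) {
  // Sum of (estimated clicks x CPC) over every ranking keyword.
  return rankings.reduce(
    (total, kw) => total + kw.estMonthlyClicks * kw.cpc,
    0
  );
}

console.log(organicTrafficValue(keywordRankings)); // 1200*0.75 + 400*1.25 = 1400
```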

Mark: I feel we're going to be talking more and more about analytics over the next 12 months probably.

Jack: Yeah. We touched a lot on GA4 and alternatives, and even tutorials for GA4, last week. So if you haven't already listened to that, and you aren't aware of what's going to happen with Universal Analytics, or you are somehow living under an SEO rock and haven't paid attention to the fact that Universal Analytics will stop being supported as of next year, July 2023, I recommend you go back and listen to last week's show. So yeah.

Mark: But if you are interested in Google Analytics, you have probably come across what I guess some people consider the godfather of GA, which is Simo Ahava. He's a lovely chap. He was on our podcast last year with Krista Seiden, who we mentioned in the GA4 podcast in the last episode. She (Krista) used to be a product manager at Google, and Simo is just really well known for posting excellent information about everything to do with analytics and tracking. He does real deep dives into basically how this stuff works and how to implement it, and has a very well-founded expert reputation because of these blog posts and tutorials. So we'll link to his blog in the show notes. If you haven't seen it, then it was worth listening to this whole podcast just to hear his name, so you can go and find his blog. And if you've Googled pretty much anything to do with GA or Tag Manager-

Jack: “How does ‘this’ work in Google analytics?” You'll probably come across Simo's blog.

Mark: You probably found him, yeah. So I just wanted to bring this up, because these updates to platforms like Google Tag Manager, which is what we're going to be talking about, a couple of updates, are really easy to miss, basically. And this is the point of this podcast, really: we're trying to keep an eye, via Twitter and blogs and all of the different official mouthpieces from Bing and Google, and work out what's important, which is one of the hardest things of being in the industry, trying to just stay aware of everything and then work out what's worth you learning, because nobody can take all these topics, see all the changes, and be a deep expert in all these things. But this is something that I think is worth at least everyone knowing about. And guess what, Simo has done a tutorial, if you want to learn about the implementation. I'll read his tweet, because he describes it as: "two very, very important releases to server-side Google Tag Manager. One of them is arguably the most important update to the platform in a long, long time and the other opens up a myriad of use cases for data enrichment at scale." You know it's important because he said very twice and he said long twice.

Jack: Yeah. Yeah. When you get an expert talking about a subject and they say very, very important and the most important update in a long, long time, probably worth listening to. Absolutely.

Mark: So there's two things changing here, which are asynchronous variables and promises, which, again, if you are not, I guess it's more than knee-deep, waist-deep, shoulder-deep in analytics, probably doesn't mean a lot to you. So, asynchronous is the opposite of synchronous; it means things are happening at a different time. Asynchronous variables essentially mean that you can now execute your Tag Manager code and it can pretty much wait for you to fetch other variables while it's going on, and it'll come back for that. And a use case for this that we've had: so we haven't used this, but where we could have used it is, for instance, when we were doing some analytics work and we were running a pay-per-click account, and alongside some other behavioural data, we were pulling in weather data. Because for this particular client, we wanted to try and see what the correlations were between if it was a super hot sunny day and if it was a freezing cold rainy day, like it is most of the time in England, as to how that affected their sales and which products. So we could actually look at doing various bid adjustments and campaign adjustments dependent on the weather.

And it was a little bit tricky to do, because we're trying to record all this user behaviour as it happens, and then you send requests off and they don't come back instantly. And this is what this asynchronous variable stuff will essentially allow you to do. It's a really easy way, as Simo puts it, to “enrich your data at scale”. So, basically, you could take in maybe three or four different requests from other sources and variables, and now it will neatly plug into this one flow. And Promises is essentially the wrapper that makes this happen within your code. So you've got platforms like BigQuery that are already, the term they use is “promisified”, I guess that's a word they're using now, so they're already wrapped up and do this for you.

So a promise refers to the fact that when this code executes, it says, Okay, I've got to go get this variable, but you carry on doing what you're doing, because I promise you, when you come back, it'll be here and I'll have it. And one of the things they've got is, essentially, this JavaScript promise stuff is a way, when you make your code templates, to make these variables in this kind of promise, I don't know if framework's the right word, wrapper. There's probably a developer cringing here, but it's essentially a way to wrap up those variables and put them into this way of working.

Jack: Yeah. We've touched on APIs quite a few times recently as well. And like you said, this is a way to run multiple API calls and return a single promise that's resolved once all of the API calls have returned with results. So instead of pinging one, then pinging the next, and pinging the other, and getting that chain of APIs, you can just, as you said, Mark, send it off on its way. It'll leave you with a promise, and then it'll come back with the full results and the full chain of API results as and when you need it and when it's available.
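The pattern being described can be sketched in a few lines of plain JavaScript with `Promise.all`, rather than GTM's own template APIs. `fetchWeather` and `fetchStockLevel` here are made-up stand-ins for real external lookups; in server-side GTM, the platform's asynchronous variables would do this fetching for you:

```javascript
// Sketch of concurrent lookups resolved by one promise. fetchWeather and
// fetchStockLevel are hypothetical stand-ins for real API calls.
function fetchWeather(city) {
  return Promise.resolve({ city, tempC: 7, raining: true });
}

function fetchStockLevel(sku) {
  return Promise.resolve({ sku, inStock: 42 });
}

async function enrichHit(hit) {
  // Both lookups run concurrently; await resumes once both have resolved.
  const [weather, stock] = await Promise.all([
    fetchWeather(hit.city),
    fetchStockLevel(hit.sku),
  ]);
  return { ...hit, weather, stock };
}

enrichHit({ city: "Norwich", sku: "SHOE-01" }).then((hit) =>
  console.log(hit.weather.raining ? "rainy-day bids" : "sunny-day bids")
);
```

The point is the shape of it: several requests go out at once, and the caller gets back one promise carrying all of the enrichment data, which is what makes weather-based bid adjustments like the example above practical.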

Mark: We'll put a link up, as you can probably guess, at search.withcandour.co.uk. And Simo, of course, has written an in-depth guide on how to implement this. So if you are a Google Tag Manager nerd, this should be top of your reading list.

Jack: So let's finish off with Google sunsetting another thing, shall we? Not quite as dramatic as Universal Analytics, but with considerably shorter notice. So I'm going to touch on what they have described as their spring cleaning of the URL parameters tool. And I'll read the post directly here from the developer blog on Google: in short, they're deprecating the URL parameters tool in Search Console in one month. So unlike Universal Analytics, which is happening in 14 months' time in July 2023, this is going to be gone in a month. There's no action required from current users of the tool. And when the URL parameters tool launched in 2009 in Search Console's predecessor, called Webmaster Tools, I'm sure there are some old-school SEOs out there, you know what I'm talking about, the internet was a much wilder place than it is today. Session ID parameters were very common, CMSs had trouble organising parameters, and browsers often broke links. With the URL parameters tool, site owners had granular control over how Google crawled their site by specifying how certain parameters affected the content on the site. Over the years, Google has become much better at guessing which parameters are useful on a site and which are, plainly put, useless. In fact, only about 1% of the parameter configurations currently specified in the URL parameters tool are useful for crawling.

Mark: I don't get this. 1% is a huge amount of URLs.

Jack: That sounds like a lot. And we've touched on this a few times, talking about search engine numbers: oh, it only affects 0.6%, and then, oh, by the way, that's out of trillions, so 0.6% is billions. Which sounds like a lot to me.

Mark: I don't get it. It says here, in fact, only about 1%.

Jack: Of the parameter configurations currently specified in the URL parameters are useful for crawling.

Mark: So I think what has happened here is that this is a call that Google's made because there will be people who get this wrong and are actively shooting themselves in the foot. And they've determined that, well, actually, our automation does a better job on the whole, in the same way that that was their response to the title tag rewriting, right? Because everyone was like, they're our titles, we're humans, we've got big fleshy brains, let us use them, we can write good titles. And Google's response was, Yeah, but you're all SEO webmastery people, and across the other 90% of the internet as a whole, the titles are really bad. So we're fixing all of them, so you just can't have your surgical tools or control to do that.

And I feel that's what's happening here, which is that if you're working in SEO and you've used this before, you can learn how to use this tool, and it is helpful, because Google does sometimes get it wrong. But I think what's happened here is that, like canonicals and probably the disavow tool, there are people doing more harm than good to themselves, where Google's seen that people are using it incorrectly and then snafu-ing the crawler. So they've just decided to take the toy away, take the knife away from the child.

Jack: I think it's also, and we've talked about this recently with BERT and MUM and all this stuff, how sophisticated Google is now at crawling and understanding websites in different ways. If only 1% of the parameter configurations currently specified are useful for crawling, Google is essentially telling you: we've evolved past this, we know how the web works, we understand how the internet works, and we can just sort it ourselves, like you were saying with the title side of things. So they go on and talk about the low value of it all and blah, blah, blah, it's going to be gone in a month. Going forward, you don't need to do anything to specify the function of URL parameters on your site; Google's crawlers will learn how to deal with URL parameters automatically. You don't even need to do anything, Google just does it automatically. And, as we discussed, there's never been a problem with that before. So everything will be fine.

Mark: In fairness, I think there will be fewer problems because as they said that the nature of the web has changed. But I guess it just always irks me as a professional when someone takes away a tool that I know how to use and has helped me before because they're not perfect. So if you are not perfect, why not give us a way to fix it?

Jack: And the final little bit from Google here. They say, if you need more control, so this is looking at you, SEOs and webmasters, you can use robots.txt rules. For example, you can specify parameter order in an Allow directive. I'm sure anybody who's used robots.txt knows what I'm talking about there. Or use hreflang to specify language variations of your content.
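For what it's worth, the kind of robots.txt rules Google is pointing people at might look something like this; the paths and parameter names are entirely made up for illustration:

```text
# Hypothetical example: block crawling of URLs carrying a "sort"
# filter parameter anywhere in the query string.
User-agent: *
Disallow: /*?*sort=

# Specify one acceptable parameter order by allowing it through
# the block above.
Allow: /*?category=*&sort=price
```

As the conversation goes on to note, this is a much blunter instrument than telling Google what a parameter actually does.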

Mark: Right.

Jack: Except that's a different thing, right?

Mark: Yeah. So if you can't peel this apple with this knife here's a screwdriver. But that is-

Jack: You could peel an apple with a screwdriver, but you'd probably be there a while.

Mark: Robots.txt, we're talking about allow, disallow, mainly URLs. The parameter tool was for specifying "this filters data", "this generates new content", and that's completely different. So don't try and placate me.

Jack: I feel they're lumping the whole URL parameter thing together. So thinking about search queries and all that stuff, a lot of that is what you use robots.txt for, as you're filtering out all the, Oh, we don't want a lot of these query strings to be indexed. That makes sense. But yeah, you're right. This is a different thing.

Mark: Yeah. Robots.txt wouldn't stop them, but, I mean, you're trying to stop Google crawling it, I guess. So you could argue, yeah, if you said, Okay, this is just a filter, it adds no value, in the URL parameters tool, that's going to signal to the crawlers, yeah, we probably shouldn't bother crawling this. Which is same-same-but-different to robots.txt, because I would expect, with the URL parameters tool, if Google's decided that actually this stuff isn't worth crawling, if I get links to it, it's still going to count and crawl from there and count those links. Whereas if I robots off a chunk of my site because it's designed badly, if I then get links there, they're not going to count, because it's not going to be crawled from then on. So again, it was like a bit of a-

Jack: A bit of a cop-out almost?

Mark: A blunt tool.

Jack: Yeah.

Mark: Yeah. That's the spoon for the surgery.

Jack: That's the screwdriver for the apple.

Mark: We've taken the scalpels away, but use the spoon to remove the appendix.

Jack: Have you ever played Surgeon Simulator? That's pretty much what you end up doing.

Mark: I have, yeah. Is there a second one now?

Jack: Get the big hammers and stuff. Yeah, there is.

Mark: Is it VR? I think it says VR.

Jack: It's full VR, yeah.

Mark: Right after, we're going there.

Jack: We'll bring that into the studio audience. You don't get to play, but we will.

Mark: Live URL.

Jack: We're turning Candour into a Twitch streaming agency and just we'll become influencers in the end.

Mark: Can you imagine watching a Twitch stream of someone using the URL parameter tool?

Jack: In VR?

Jack: So that's all we've got time for this week, but I'll be back next week, which is Monday the 11th of April, with an interview with the one and only Claire Carlile. In the meantime, if you're going to BrightonSEO later this week, please let us know and say hello. I know Claire is going to be there as well, so hopefully we'll get a chance to say hello to Claire. And yeah, if you're attending BrightonSEO, Mark also has a talk in the main auditorium as part of the keyword research segment on the Thursday, at about 16:10 in the afternoon. And me and a couple of other Candourlorians will be hovering around, doing stuff, hanging out, doing karaoke, the usual things that happen at conferences. So yeah, if you do see us, come and say hello, lovely listeners. We'd love to meet you all. And yeah, tune in next week for my interview with Claire. Thank you for listening this week, and we'll be back next week.