
SEO news: Crawl budget optimisation, site accessibility and continuous scroll on desktop


Show notes

This week, Jack Chambers-Ward is joined once again by Mark Williams-Cook to discuss the latest SEO news including:

  • Google clarifies a few things about crawl budget optimization
  • How to conduct a site accessibility audit
  • Continuous scroll arrives on desktop in the US
  • The latest helpful content update begins rolling out

Transcript:

Jack: Welcome to episode 48 of season two of the Search With Candour podcast. My name is Jack Chambers-Ward and I am joined once again by the one, the only…Mark Williams-Cook.

Mark: I feel like I need to add that after you gave it the one, the only. Yeah, it's just me. I'm good. Thank you. Good to be here again. Lovely to have me back.

Jack: Two shows in a row.

Mark: I know.

Jack: And the live stream, three shows in a row if you want to be picky. We have got an assortment of SEO and PPC news to be discussing this week, including:

  • Google clarifies a few things about crawl budget optimization
  • How to conduct a site accessibility audit
  • Continuous scroll is now available on desktop (in the US for now, but it's getting there), and
  • The latest helpful content update has begun rolling out.

Search With Candour is supported by SISTRIX, the SEO's toolbox. Go to sistrix.com/swc if you want to check out some of their fantastic free tools such as their SERP Snippet Generator, Hreflang Validator, your site's Visibility Index and the all-important Google update tracker. Since we're talking about helpful content updates, it's key to keep an eye on those Google updates. Later in this episode, we'll actually be digging into some data from the new knowledge graph tool, which we touched on recently on the show. Should we kick off with some Google news, Mark?

Mark: I love talking about crawl budget optimization.

Jack: I know. Isn't it the sexiest of SEO topics?

Mark: It's the Christmas table discussion 101.

Jack: Earlier in December, on Friday the 2nd of December, Lizzi from the Google Search team updated the crawl budget management help document with two more myths. Essentially, you have that little true-or-false myth selector thing, so you can basically do a mini quiz at the bottom of the document, and they have clarified a couple of different things. Point number one: Google added that using noindex isn't a good way to control crawl budget, but it can be a method to indirectly free up crawl budget in the long run. Makes sense, right?

Mark: It does.

Jack: Cool. Number two: pages that serve a 4XX status code, anything in the 400 range except 429, do not waste crawl budget. I think that's the one a lot of people have kind of clung onto a bit, because there's been a lot of debate about 410s and 404s wasting crawl budget: if you have a bunch of them on your site, is that affecting your crawl budget? Apparently not, which is nice.

Mark: Yeah, that makes sense. 429.

Jack: 429.

Mark: 429?

Jack: Don't know what a 429 is.

Mark: Too many requests.

Jack: Oh, okay.

Mark: The reason that would affect crawl budget I assume is because Google will slow down crawling-

Jack: Right.

Mark: ... if it thinks it's hammering your server too much, which ties back to the conversation we had last week about why they don't use the crawl-delay directive in robots.txt. They're pretty clever with their dynamic speed of crawling. If they think they're crawling too much and it's debilitating your website, they'll pull back a bit. That makes sense to me. Again, noindex, crawl budget, 404s, a lot of these things get merged together, but I think you've got some detail on Lizzi explaining why those things are true.

Jack: Yeah, exactly. Expanding on point number one, any URL that is crawled affects crawl budget, fairly straightforward, and Google has to crawl the page in order to find the noindex rules, so we're diving into a bit of noindex. However, noindex is there to help you keep things out of the index if you want. That seems self-explanatory, but it does need clarifying sometimes-

Mark: But then nofollow links can actually be followed by the crawler.

Jack: There you go.

Mark: It doesn't always do what it says on the tin.

Jack: Exactly. If you want to ensure that those pages don't end up in Google's index, continue using noindex and don't worry about your crawl budget. It's also important to note that if you remove URLs from Google's index with noindex, Googlebot can focus on other URLs on your site, which means noindex can indirectly free up some of the crawl budget on your site in the long term.

Mark: Which is the exact reason why, you cheeky monkey SEOs, using noindex,follow won't work. I've seen lots of people before: they don't want a page indexed, but they want the links on there to count because they've got links to pages they want to rank. But obviously, if we follow the logic back here, if Google is saying you get a second-order effect of saving crawl budget, it's because those pages have been noindexed for so long that Google stops bothering to crawl them. Therefore the links on those pages aren't going to count, so they're treated the same as if they had nofollow on them. That quite neatly wraps up the technical explanation of why you can't keep PageRank flowing through noindexed pages in the long term.
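To make the noindex mechanics concrete, here is a minimal sketch, assuming Python with the requests and beautifulsoup4 packages and a hypothetical URL, of how you might check whether a page carries a noindex directive. It also illustrates the point above: the directive lives in the response itself, so a crawler has to fetch the page before it can see it.

```python
# Minimal sketch: check whether a URL is noindexed via header or meta tag.
# Assumes the `requests` and `beautifulsoup4` packages; the URL is a hypothetical placeholder.
import requests
from bs4 import BeautifulSoup

def noindex_signals(url: str) -> dict:
    response = requests.get(url, timeout=10)

    # The directive can arrive as an HTTP header...
    header = response.headers.get("X-Robots-Tag", "")
    header_noindex = "noindex" in header.lower()

    # ...or as a <meta name="robots"> tag in the HTML itself,
    # which is why the page has to be crawled before the rule is seen.
    soup = BeautifulSoup(response.text, "html.parser")
    meta = soup.find("meta", attrs={"name": "robots"})
    meta_noindex = bool(meta and "noindex" in meta.get("content", "").lower())

    return {
        "url": url,
        "status": response.status_code,
        "noindex_header": header_noindex,
        "noindex_meta": meta_noindex,
    }

if __name__ == "__main__":
    print(noindex_signals("https://www.example.com/some-page"))
```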

Jack: Excellent. Of course, slightly expanding upon the second point about the 400 status codes: pages that serve a 4XX HTTP status code, except the 429 we've just talked about, don't waste crawl budget. Google attempted to crawl the page but received a status code and no other content. Again, that touches on people trying to do sneaky links and things like that.

Mark: Yeah, absolutely. I mean, the only thing I'd add to that is you've seen people saying a 410 can remove pages from the index faster than a 404. So, 404 is Not Found, which could be an accident, and 410, I believe, is called Gone, basically showing you've intentionally killed that URL. But again, if it's a 4XX, except for 429, it's not affecting crawl budget. In general, unless your site is big, and by big I mean at least in the six figures of pages, crawl budget isn't going to be an issue. The biggest issue that we've talked about is the whole 'Discovered - currently not indexed' or 'Crawled - currently not indexed' thing.
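Along the same lines, here is a minimal sketch, again assuming Python with requests and hypothetical placeholder URLs, of auditing how a handful of URLs respond, treating 429 as the one 4XX response that matters for crawl rate.

```python
# Minimal sketch: check a handful of URLs and describe how each responds.
# Assumes the `requests` package; the URLs are hypothetical placeholders.
import requests

URLS = [
    "https://www.example.com/live-page",
    "https://www.example.com/deleted-product",    # might return 404 or 410
    "https://www.example.com/rate-limited-page",  # might return 429
]

def classify(url: str) -> str:
    status = requests.get(url, allow_redirects=True, timeout=10).status_code
    if status == 429:
        return f"{url} -> 429 Too Many Requests (crawlers will slow down)"
    if status == 410:
        return f"{url} -> 410 Gone (intentionally removed; no crawl budget wasted)"
    if 400 <= status < 500:
        return f"{url} -> {status} (4XX; no crawl budget wasted)"
    return f"{url} -> {status}"

if __name__ == "__main__":
    for url in URLS:
        print(classify(url))
```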

Jack: The bane of my existence.

Mark: Which is, yeah, we found the page, we just don't like it enough to index it.

Jack: We'll get there eventually. Don't worry.

Mark: Maybe.

Site Accessibility

Jack: Next up we have an article on Sitebulb written by the fantastic Sam Torres from the Gray Dot Company. You may also know Sam Torres from hosting the opinionated SEO Opinions Podcast, which I highly recommend. Really, really comprehensive article here. I kind of picked it out because it caught my eye and I feel like accessibility is something we need to talk about a lot more in SEO in general. I was not prepared for just how comprehensive and interesting and how much I learned just from reading this article. Pretty much as soon as we came into the studio, you said the same thing, right Mark?

Mark: Yeah, so when we talk about accessibility, what springs to mind, at least in my mind, is stuff like the very basics we've spoken about for years, like you should include alt tags on your images. Then there's the SEO-focused stuff around anchor text. Anchor text is a really interesting one because it comes up with internal linking and therefore comes up in SEO conversations a lot. One of the things I like to profess when we're teaching SEO and talking to clients about SEO is the parallel that most changes you make on-site will have a positive impact for users anyway.

I think anchor text is always a really great example of that because, firstly, we know it helps search engines, and for any user, it helps because it gives them context about where they're going to go when they click the link. But again, when we use the example of screen readers reading out links, it's much more helpful than "click here, click here, read more". But yes, I made that comment about this really great article from Sam Torres because normally, when we talk about accessibility, there'll maybe be something about colour contrast, alt tags, things like that, but this covers visual impairments, auditory impairments, speech impairments, and cognitive, learning and neurological impairments.

I haven't seen anything pull everything together like this before, with this breadth of information, that hasn't been a really dusty internet standards document that's really hard to actually understand and engage with because you're trying to decipher the various RFC codes they're using to describe a standard. I haven't come across anything quite like that, which is surprising, and as you say, it's maybe a little bit damning for the SEO industry, and at least a big chunk of the web, that we're not spending enough time thinking and talking about it.
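As a small illustration of how Mark's anchor text and alt tag examples can be checked programmatically, here is a minimal sketch, assuming Python with requests and beautifulsoup4 and a hypothetical URL, that flags vague link text and images with no alt attribute.

```python
# Minimal sketch: flag vague link text and images without an alt attribute.
# Assumes `requests` and `beautifulsoup4`; the URL is a hypothetical placeholder.
import requests
from bs4 import BeautifulSoup

VAGUE_ANCHORS = {"click here", "here", "read more", "more", "learn more"}

def accessibility_flags(url: str) -> None:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

    # Links whose text gives a screen reader (or a search engine) no context.
    for link in soup.find_all("a", href=True):
        text = link.get_text(strip=True).lower()
        if text in VAGUE_ANCHORS or not text:
            print(f"Vague or empty anchor text: {link.get('href')}")

    # Images with no alt attribute at all (an empty alt="" is fine for decorative images).
    for img in soup.find_all("img"):
        if not img.has_attr("alt"):
            print(f"Image missing alt attribute: {img.get('src')}")

if __name__ == "__main__":
    accessibility_flags("https://www.example.com/")
```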

Jack: Yeah, absolutely. You mentioned screen readers earlier on, and the kind of technology behind how people access your site and how you try to understand the ways users are actually using it. Sam also dives into a lot of that, talking about things including refreshable braille displays, which is something that, as a person who is not visually impaired, wouldn't even occur to me, head wands and mouth sticks, motion-tracking and eye-tracking technology, and alternate mouse and keyboard devices. I know a lot of people, a lot of gamers, have their gaming-specific keyboards and stuff, but the way people with different impairments, whether cognitive or physical, can transform how they actually interact with their computer is fascinating to me. I think Sam does a really good job of highlighting and pulling out some interesting examples of ways you can help users with different interfaces and different ways of accessing your site gain the ability to understand your content and read your site properly.

I remember about two or three years ago, when I was getting into working with clients, there was a moment where I switched a screen reader on for the first time, closed my eyes and tried to understand a site, and it was a site I knew really well. It was a client I'd been working with for a while, so I felt like I knew their menu and their nav pretty well, all that kind of stuff. I was lost very quickly. It was like, "site, menu, item one," and it just scrolled through a bunch of stuff, and I'm like, "I wasn't really paying attention to that. Oh my God, please read that again." It was such a shock for me to realise, oh wow, this is a completely different experience. Like I said, it was a site I was very familiar with; I thought I could navigate it with my eyes closed. Turns out I couldn't, just tapping through each menu item.

That's something I recommend. As a little test, go onto a site you feel like you know well. You know the main menu, you know the navigation, all that kind of stuff. Just hit the tab button a few times and see what order the buttons highlight in and where it all goes, because that's kind of what a screen reader is doing: understanding where the links are, where the buttons are and how everything interacts with everything else. As soon as you hit one of those big dropdown menus, you think, yeah, don't worry about it, it's a big dropdown, it's got 10 to 12 things in it. Get ready for all 12 of those things to be read out loud to you in order, and you've just got to remember that it was number six you were looking for. It's like, "Oh yeah, right. I need to go to number six now." That was such a jarring experience. But have you ever done that before, Mark, and just experienced a site as a visually impaired user would?
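For a rough, programmatic version of Jack's tab-key test, here is a minimal sketch, assuming Python with requests and beautifulsoup4 and a hypothetical URL, that lists focusable elements in source order as a first approximation of the order a keyboard or screen reader user would encounter them. Real tab order also depends on CSS, JavaScript and explicit tabindex values, so treat it as a starting point.

```python
# Minimal sketch: list focusable elements in document order as a rough
# approximation of keyboard tab order. Assumes `requests` and `beautifulsoup4`;
# the URL is a hypothetical placeholder.
import requests
from bs4 import BeautifulSoup

def tab_order_preview(url: str) -> None:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    position = 1
    for element in soup.find_all(["a", "button", "input", "select", "textarea", "div", "span"]):
        # Links need an href; divs and spans only count if they carry a tabindex;
        # anything with tabindex="-1" is skipped because it is removed from tab order.
        if element.name == "a" and not element.get("href"):
            continue
        if element.name in ("div", "span") and element.get("tabindex") is None:
            continue
        if element.get("tabindex") == "-1":
            continue
        label = element.get_text(strip=True) or element.get("aria-label", "") or "(no text)"
        print(f"{position:>3}. <{element.name}> {label[:60]}")
        position += 1

if __name__ == "__main__":
    tab_order_preview("https://www.example.com/")
```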

Mark: Yes. I've seen visually impaired people use websites with that technology, and the thing that surprised me was the speed at which they can set the text to be read back, which was frighteningly fast. I guess, like anything, it's another skill that you acquire. Like you said, the first time you do it, you're trying to listen to what comes next while memorising the bits that might interest you and where they were. But obviously, when you become experienced at doing that, the speed at which they had content read out to them and could then navigate back was incredible.

It made me think: I listen, like you do with audiobooks, which I don't agree with, I think it's barbaric, but I listen to podcasts normally at one-and-a-half speed, and if it's tutorial videos, at two-times speed. With the amount of practice they've had with that technology, even that must feel slow. I couldn't imagine the torture of listening to a tutorial at normal speed anymore, because even when I'm watching something, I'm like, "Come on, speak faster."

Jack: Yeah, definitely. I think it's really interesting, and like I said, if you haven't done any of this, I highly recommend just going and trying a screen reader, or trying a different interface or device to interact with your site. If this is something you want to dive into, well, like I said, I think we all need to understand it: me, as an essentially able-bodied, able-minded person, I find it fascinating to see this side of things, understand how different users do it, and hopefully make the experience easier for them. Like you said before, Mark, one of our golden rules of SEO is that whatever you're doing for SEO should never negatively impact the user experience, and that applies to visually impaired users, cognitively impaired users, and people using different interfaces and devices as well.

If you haven't already, definitely go and check this out. Like I said, links for all of these are in the show notes as always at search.withcandour.co.uk. It is an extensive and comprehensive article; Mark and I haven't really done it justice with a little summary here. I highly recommend you go and read it on Sitebulb's blog because it is a fantastic guide to understanding site accessibility and, basically, auditing it, because Sitebulb also does a bunch of this stuff for you. Once you tick the right boxes, choose the right options for your crawl and understand what Sitebulb is pulling out, you can then see where you are falling behind, or maybe where you are succeeding, in terms of accessibility on your site. Highly recommend you go and check that out in full via the show notes.

We touched on the knowledge graph tool from SISTRIX on our live stream the other week. Should we dive into some data, Mark? I felt like I kind of whipped through it a little bit, but now it's out there on SISTRIX Labs. It's being tested, I've seen a few chats about it in the SISTRIX Slack as well, with a few of us testers being able to play around with it and have a think about it. Should we discuss some knowledge graph stuff? Because it's something I've not particularly worked with; none of my clients, as far as I know, are... I tend to work with smaller brands and stuff like that, so I've not really got much experience with big international brands that have these knowledge graph kinds of things. I think it's interesting to have a look at.

Mark: We've had some chances to play around and put some different websites into it, and I think it's important for everyone to start exploring knowledge graphs even if they're not in your day-to-day SEO work at the moment. This is a conversation I've been having with a few people this week. You've probably seen lots of people playing with ChatGPT and making it do some cool tricks, and some people saying this is potentially a Google killer, a Google competitor.

Jack: Oh God. Like TikTok was three months ago.

Mark: You can see why when you start using it, but the main difference, I think, between a tool like ChatGPT and Google is that ChatGPT is a language model. It's basically been trained to do tricks with language, because it doesn't actually know anything. Before the podcast, we were discussing some different examples where you'd ask it a question and it would give you what appears to be a factual answer, and then when you question it about that fact, it gets the fact right; it just hasn't applied that knowledge. What Google has, and what we're getting into now with this knowledge graph, is an understanding of what things are and the connections between those things. So if you can get that other half and plug a language model into it, you might have something a bit more interesting.

Jack: Weirdly enough, this is a perfect segue for next week's episode, where I talk to Sara Tajer about entity SEO and we really dig into knowledge graphs and understanding how Google builds this network of topics and items and nouns and brands and different companies. Stay tuned next week, listeners, Sara and I will be diving into that in a lot more detail. But before we get to that, let's talk about some knowledge graph stuff. I briefly touched on it on our live stream last week: essentially, you stick a domain in and you get some details about the knowledge graph around that domain, if it has one. Let's list them out and then we'll have a think about what that means and how useful it is, because I didn't quite understand some of it at first. Mark and I were discussing this before we started recording and then realised, "Oh right, that makes sense. Actually, that's really useful." Because again, I've not really optimised for knowledge graphs or particularly looked into them in much detail. My thinking went in one direction and it turns out it's in a very different direction, but a very useful one nonetheless.

When you put a domain in, you can see a table from SISTRIX and it will lay out the name. That's the name of the knowledge graph entry, and that is not necessarily, but sometimes is, the main keyword, which is something to bear in mind, because intuitively you would assume the main keyword and the name of the knowledge graph entry would be the same thing. That is not necessarily always the case, and it's something to bear in mind when you are using this. Also, if it doesn't have a clear name, or SISTRIX is unable to pull that from Google, you will get the Google internal ID allocated to that thing instead, so bear that in mind too. The top keyword is the next piece of data, and that is the keyword for this domain that shows this knowledge graph panel and has the most search volume. The name and top keyword are often closely related. They could be the same thing, like I said, the main keyword could also be the name of the knowledge graph entry, but they are not necessarily the same thing.

Mark: Just to give a theoretical example of this, so I don't know if it's an actual one, but if you had a knowledge graph entry for, say, McDonald's restaurants, it might be that the top keyword is just McDonald's, because that's what people type in. It's related, but it doesn't necessarily have to match.
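For anyone who wants to see the name-versus-ID side of this directly, Google exposes a public Knowledge Graph Search API. Here is a minimal sketch, assuming Python with requests and a hypothetical API key, that looks up an entity and prints its name and machine ID, the same kind of internal identifier the SISTRIX table falls back to when there is no clear name.

```python
# Minimal sketch: look up an entity in Google's Knowledge Graph Search API.
# Assumes the `requests` package and a hypothetical API key in an environment variable.
import os
import requests

API_KEY = os.environ["KG_API_KEY"]  # hypothetical environment variable name
ENDPOINT = "https://kgsearch.googleapis.com/v1/entities:search"

def lookup_entity(query: str, limit: int = 3) -> None:
    params = {"query": query, "key": API_KEY, "limit": limit, "indent": True}
    data = requests.get(ENDPOINT, params=params, timeout=10).json()
    for element in data.get("itemListElement", []):
        result = element.get("result", {})
        # "@id" is the machine ID (e.g. "kg:/m/..."), "name" is the entity's display name.
        print(result.get("@id"), "-", result.get("name"), "-", result.get("description"))

if __name__ == "__main__":
    lookup_entity("McDonald's")
```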

Jack: Exactly. The top URL: similar to the top keyword, this is the top-ranking URL from the domain that you have put in.

Mark: That's outside of the knowledge graph; it's just appearing in the SERP. I originally thought it might have been the URL referenced within the knowledge graph, because some knowledge graphs cite sources, but this is just literally on the SERP.

Jack: Correct, yep. Then it's the amount of keywords: this is the number of keywords for which this URL shows the corresponding knowledge graph panel, and clicking on this number takes you right to a pre-filtered keyword table. This is where I had assumed it was the total number of keywords for that domain or for that particular URL, but it's for this ranking URL, the one Mark just mentioned, this top URL: the number of keywords that show the corresponding knowledge graph panel. Like you said, Mark, you might get the same knowledge panel for McDonald's and McDonald's restaurants and McDonald's Norwich or whatever it is, because the brand of McDonald's is such a dominant thing in those related search terms.

Last of all, the thing I think is the most interesting for us as SEOs wanting to look at opportunities for our clients and for our sites is called Opportunities, funnily enough: a list of keywords for which the knowledge graph is displayed but for which the examined domain is not currently ranking, and that's the key there. It's often worth expanding the content accordingly; thanks for the little tip there from SISTRIX at the end. That Opportunities one is the key for me, I think. It was the eureka moment when Mark and I were playing with it earlier and thinking about how we could potentially use this for some of our clients and how useful it is for us. Opportunities is other keywords that you are not currently ranking for that you could then include in your content plans, explain to your clients and justify to the people you're reporting to: this keyword shows the knowledge graph, so we could create this content and this area of the site to optimise for it.

Mark: When you look at how knowledge graphs actually appear in the SERP as well, something I've always pointed out, a good example is with actors: it will show you very specific information about the actor. Normally, if you type in Brad Pitt or something, the knowledge graph will show you information that people have commonly requested about that entity. It'll show you things like how old he is, whether he's single, and it'll show an image, because lots of people who search for Brad Pitt go and look at images, for instance. That's data that Google knows needs servicing in the knowledge graph. Why I think the opportunities in this context around the knowledge graph are interesting is because it's another step away from just looking at relations and keyword data. By that, I mean traditional keyword tools work by you putting in Brad Pitt and then showing you mainly keywords where people have also typed Brad Pitt in, and then all the variations. By looking at the entity, you are not always restricted to that kind of keyword ball and chain that the intent is tied to. It's a little bit like why we profess so much about AlsoAsked and the People Also Ask data, because that's not constrained by keyword either; the questions sometimes don't contain the root keyword because the intent has gone in a different direction. I think it's a really interesting place to start with the opportunities.

I think that probably over the next few weeks as this starts to bake into our brains, we all have ideas about how we can use this other data and maybe get this into our workflows because I haven't seen many tools do this in terms of looking this closely at knowledge graph data. There are lots of tools that try and work out entities and entity relationships on websites. That's super common now because that is an area of focus around SEO and content, but having it at this scale as Google displays it I think is interesting.

Jack: Yeah, definitely. As far as I know, this is the first time one of the SEO tools has been able to do this. When I spoke to Steve from SISTRIX about it the other day, he was saying this is essentially the only way to do it at this scale, with these domains and with this keyword data. Obviously, SISTRIX has one of the largest keyword and SERP databases in the world, so having this data, being able to do this for these knowledge graphs, and being able to see, like you said, research from a different perspective and see entities and brands and your clients from a different perspective, I think can be really, really useful. Just playing around with it with a few examples of household names, and then actually putting in some of my clients' names or terms related to my clients, suddenly, like you said, solidified it in my brain: "Oh yeah, actually that could be useful for this thing."

Understanding different relationships between entities and how search intent changes from entity to entity and topic to topic can be really, really useful data. This is a step in that direction that I think a lot of people looking to get into entity SEO, like I said, I'll talk about that more next week with Sara. But yeah, I think this is a fantastic step in that right direction.

If you are on SISTRIX, you can activate this. You need to tick the little button on SISTRIX Labs to say please activate the knowledge graph tool for me. Then you'll have essentially full access, so you can dive around, have a look and play around with different domains. You can also use it from a keyword discovery perspective: if you click on the name of the knowledge panel, it opens up in keyword discovery, and that essentially lays out exactly what Mark was saying. You can see the SERPs, you can see search volumes, all that kind of stuff, for all of the keywords there as well, regardless of whether you rank for them or not. That is a step back to look at the knowledge panel as a whole.

Next up, we have continuous scroll. Yay! Continuous scroll on desktop this time. Isn't that exciting? I think you made the joke on Twitter when we discussed continuous scroll a few months ago now.

Mark: Almost a year ago. Was it?

Jack: I haven't been doing the podcast for a year so it can't be that. Six months, nine months, something like that.

Mark: It felt like a long time ago.

Jack: Essentially what this means is page one rankings for everybody. Yay! Candour is now guaranteeing page one rankings for all clients.

Mark: Along with everyone else.

Jack: Exactly. But yeah, I think this is very interesting, and we actually have a statement here from Google that clarifies the rollout process and when you can hopefully see it on your end of things. It started rolling out for English search results in the US and may take some time to roll out for all users to see. Do bear with Google while it's rolling out. If you're outside the US, chances are it's going to take a little while longer. I know we over here in the UK are often left behind for this sort of stuff, so us and the rest of the world will eventually catch up.

Here's the statement from Google discussing this announcement and we got this directly from Search Engine Land. This was actually a statement to Search Engine Land, so thank you Search Engine Land for the info here. "Starting today, we're bringing continuous scrolling to desktop so you can continue to see more helpful search results with fewer clicks. It's now even easier to get inspired with more information at your fingertips." What a salesy sentence that was from Google. Get more helpful search results... Yeah, great. Thanks, Google. Next, Google says, "Now when you scroll you'll continue to find relevant results so you can discover new ideas. When you reach the bottom of a search results page, up to six pages of results will automatically be shown until you see a more results button if you wish to continue further." It's interesting to see that there is actually a cutoff point. It is not infinite scroll, it is a continuous scroll up to six pages and if you really want to continue after that, you can click more results and see past there as well. I think it's nice to get those details so we have a bit of clarity, we're not just saying it's every result for this keyword and you're going to just keep scrolling forever. There are actually limitations to that and how much you can see in one scroll.

Mark: I can hear the flapping of ranking tools in the distance.

Jack: Yeah, this is going to be tricky.

Mark: I have a question. What's going to happen with the ads?

Jack: Good question. They must dynamically insert every certain amount of-

Mark: Every third result.

Jack: Yeah, pretty much.

Mark: I guess, obviously, you've got the top and bottom ads, and I guess not that many people click on page two ads, or they are going to dynamically insert them. On the kind of GIF that we've got, I can't see any more ads.

Jack: This is the GIF once again provided by Search Engine Land. I'll include that in the show notes for you so you can see that, listeners, and follow along with us.

Mark: Maybe there's a trade-off in usability in terms of it will make more people use search so they'll get more ad revenue on that first page anyway where the bulk comes from.

Jack: Yeah, that's a good question. We see a lot of different SERP features now. Most of them take up that real estate at the top of the page, but you do see a lot of stuff happen at the bottom of page one too; you do see some SERP features pull in there. People Also Ask, something we also talk about a lot thanks to AlsoAsked, and related searches sit where page one would end, and then it just keeps scrolling, which I think is interesting.

With the power of dynamic insertion into podcast ads and things like that, and I know I'm thinking about this from a very podcast-y perspective, there's the pre-roll, mid-roll, post-roll mentality in podcasting. Before you start the show, say for example if we were doing it, before you even heard the intro music or heard me say hello and welcome to the show, that would be, "This episode is supported by SISTRIX," and I'd do my usual ad roll there. Then we do an intro, and a mid-roll, the one you just heard about the knowledge graph tool, is what we'd call a mid-roll ad in podcasting. Like you said, Mark, could you do it every five results: for this scroll, you get this particular ad?

Mark: It just feels too good to be true. It feels like there's something bad coming.

Jack: We talk so much about how much money Google makes from ads. They're not just going to suddenly say, "I don't care about the ads."

Mark: Yeah, just have your first 60 results and 300 ads delivered in your sleep.

Jack: Straight to your, I don't know.

Mark: I don't think it's going to change too much for SEOs in terms of the first 10 results at least. I think you're going to still get the same clicks. Maybe it will break the spike you get on position 11, so you'll probably get I guess a more linear drop off in clicks, which is-

Jack: I've seen some really interesting click-through rate studies where they show the bottom of page two is also a very good area for click-through rate and where-

Mark: Desperation sets in.

Jack: Maybe, yeah. Literally, the method of navigating those pages doesn't necessarily scroll you back up to the top of page two when you hit the bottom of page one. Actually, you see results 20, 19 and 18 before you see results 11, 12 and 13. There were some cases where if you were 18th or 19th for a particular keyword, you would get a higher click-through rate than if you were 11th, 12th, or even 7th, 8th or 9th, which I thought was really interesting.

I am fascinated to see people do some studies on this and see how it affects things from a user perspective. How many people will even notice is my question, I guess. We are all up in arms about it because we're in the SEO industry, but I don't know, my mom or dad is not going to be thrown scrolling through and be like, "Oh, it didn't take me to the second page. It's a miracle!" Who actually cares? It's going to be a smoother user experience that she probably won't even notice. I think a lot of people may not even react, negative or positive, just, "Oh, I didn't even notice. Yeah, good point." I bet if I go and talk to my non-SEO friends in, say, six months, when this is all bedded in and properly settled, they won't have even noticed at all.

Mark: We have a helpful content system, to use the new lingo, update. We spoke recently about the helpful content update, which was in August of this year.

Jack: Oh, we're not allowed to say update anymore.

Mark: No. The helpful content system-

Jack: There we go.

Mark: ... update in August this year, and I almost missed this one, actually. There has been another helpful content system update as of the 5th of December, which as usual is going to take a couple of weeks to roll out. To quote Google, they said, "This will help Google systems detect more forms of low-quality content created primarily for search engines and not for people." What possibly could they mean by that?

Maybe it's all the people that have gone wild with ChatGPT and the version of GPT that's been dubbed 3.5. They reported, I think, over a million users in the first week for ChatGPT, and I've seen tweets from all kinds of people, who I don't think had explored GPT before this, now getting very interested because it got a lot of press pixels, press inches. There are especially developers who are just like, "Cool, I've whipped up this API to generate articles about dog products for this dog e-com site, LOL. Look at it go, it's doing a great job." Not realising that, hey-ho, we over here in the SEO corner have been trying to game Google for years and I can tell-

Jack: And Google knows it.

Mark: ... this isn't going to end well for y'all. That's a conversation I've given up having now, but I don't think this update is primarily because of that, because the other thing that is super interesting to me about this helpful content system update is that it's a global update, which means all languages. I think that's really interesting because most updates from Google, whether they be features or system updates, generally happen in English first, especially the really smart stuff where they have to train models. We know BERT was trained in English and it didn't just work in other languages very well.

If Google has solved this problem, or got closer to solving it, in languages that aren't English, that's really exciting for me, because I still see spam sites doing well that are just auto-translating content and sticking it back out there in other languages. If they can identify that, it would be great. I did see a lot of feedback from the first helpful content system update of people being a bit like, "Meh," because it was really built up that it was going to be big, that content has to be helpful for users, and even that horrendous spam site I was running only got a 20% knock. It got killed by the following spam update in September, I think it was. Absolutely brutally flatlined.

Jack: But not on Bing!

Mark: But not on Bing!

Jack: Bing's still bringing in those spam clicks.

Mark: Bing's my guy.

Jack: Thanks, Bing.

Mark: Thumbs up, Bing. Possibly, I think it's too quickly timed to be a GPT kind of-

Jack: A reaction to that one.

Mark: Yeah, I think they might have pushed it out earlier because of GPT, and they may have done some fine-tuning to target that, because I think that's going to be the big problem for the next 12 months: people getting excited, churning out content with GPT and just trying to rank with it. Again, it's really interesting, fascinating discourse around having language models answer queries. We've been playing around in the office, obviously, doing all kinds of fun stuff with it.

Jack: You say we, mostly you.

Mark: Mostly me, yeah. Hey, creative play is an important part of that. That's what I'm going to tell everyone. But for code stuff especially, it's whipped up code and corrected code quite quickly.

Jack: And Excel formulas and stuff. That's really interesting.

Mark: And Excel formulas, yeah. Jono Alderson from Yoast actually raised the point that the future of that interface isn't, "Oh cool, we can generate content for our website ahead of time." It's that you won't need to generate the content ahead of time, because you can just dynamically generate it on the spot when you need the answer to that specific question. It's kind of like, "Oh yeah," which then very quickly leads to the point of: well, if that's the case, why would Google want anyone else doing that, including pre-generating it for a website, when they can do it themselves through systems like the Multitask Unified Model, the MUM stuff they're working on, which we've spoken about? If they can just generate an answer, why would they want anyone else to do it and possibly distract from their revenue stream? HCSU, helpful content system update.

Jack: I love the purposeful pronunciation-

Mark: I'm trying to make it stick. It took a while for Webmaster Tools to change to Search Console in my brain. Over the next couple of weeks it's going to be rolling out, so that's pretty much in time for Christmas.

Jack: We've seen a couple of people covering it. Our man in the trenches, Mr Glenn Gabe, always fantastic for getting a snippet and a sneak peek at big sites being hit by Google updates, has seen a fair amount of volatility and a few shifts for some sites he's looking at. There definitely seems to be some movement that kicked in around, like you said Mark, the fifth; the visibility drops Glenn has seen kicked in around the sixth or the seventh. He's going to be monitoring that as he always does, so I highly recommend you go and follow Glenn on Twitter to keep up to date with his updates.

To wrap us up, there's a little bit on the helpful content page from Google that was updated, which I thought was interesting, and again, a lot of people are being picky about the wording and stuff. The updated wording from Google on that page is: any content, not just unhelpful content, on sites determined to have relatively high amounts of unhelpful content overall is less likely to perform well in Search, assuming there is other content elsewhere from the web that's better to display. For this reason, removing unhelpful content could help the rankings of your other content.

I thought that was a very interesting clarification. Any unhelpful content, even if you have just a few bits of it in a much larger article, can affect the visibility of that article, and removing it can apparently, according to Google, improve the rankings of that content and related content on your site as well. They're really picking-

Mark: They're bringing the stick out.

Jack: Yeah. Again, I don't want to use the word penalty; it's not a manual action or any of that kind of stuff, although you can get a manual action if you push your luck. They are really keeping a close eye on this, and you will get a whack on the back of the hand if you are generating rubbish, unhelpful content, even in a sea of seemingly good stuff. If you push your luck too much, Google will seemingly catch you out and reduce your visibility and your rankings.

Mark: And that's not what we're here to do.

Jack: Exactly.

Mark: We're here to do the opposite of that.

Jack: That about wraps us up for this week. Thank you once again for joining me, Mark, talking about all the latest SEO news. Like I said, next week, I will be joined by Sara Tajer. We'll be talking about entity SEO. It's a very interesting conversation if I do say so myself. I say it's interesting because I had a lot of questions for Sara and she was a very interesting guest to discuss with. Stay tuned for that next week.

We will, of course, have a couple more episodes before Christmas, which is very exciting. I know Mark and I, we've talked about a kind of end-of-year wrap-up kind of thing. I think we'll plan to do something like that just around that Christmas period. Then, of course, we'll be back in the new year with lots of other extra exciting guests I've already got planned as well as the new live streams we'll be working on with our friends over at SISTRIX as well. Lots more Search With Candour content coming in the next couple of weeks and in the new year as well. Thank you very much for listening and we will see you next week.