
SEO News: ChatGPT vs Bard, 'Trending' label on People Also Ask and 'Content Ideas' in Google Search Console


Show notes

This week, Jack Chambers-Ward is joined once again by Mark Williams-Cook to discuss:

- Bing's ChatGPT integration versus Google and Bard
- The new 'Trending' label on People Also Ask results
- The experimental 'Content ideas' feature in Google Search Console

Transcript:

Jack: Welcome to episode 57 of season two of the Search With Candour podcast. I'm your host, Jack Chambers-Ward, and I'm joined once again by the one, the only, Mr. Mark Williams-Cook. How are you, Mark?

Mark: Hello. I am very, very well.

Jack: Welcome back.

Mark: Thank you.

Jack: Been a while, once again.

Mark: It has. Not since last year, you told me.

Jack: Yeah.

Mark: I thought we'd done one more recently, but apparently not.

Jack: I thought we had as well. I thought it'd been about a month. It's actually been like seven weeks. It was the one we did where it was the kind of culmination of 2022 and our predictions for 2023 as well.

Mark: And a lot has happened.

Jack: A lot has happened.

Mark: A lot has happened.

Jack: We have a lot to talk about. We actually haven't got that many topics specifically, but there's a lot to talk about within those topics. This week we'll be talking about Bing and ChatGPT versus Google and Bard. Oh yeah, there's going to be a lot of AI talk this week. A new trending label has appeared on the People Also Ask results, and the experimental content ideas feature is coming to Google Search Console. Before we get to the topics, before we get to the battle of the AI search engines and all the messy things that are going to happen in 2023, I'd like to give a little shout out to our fantastic sponsor SISTRIX. You, dear listeners, can go to sistrix.com/swc and get some of their fantastic free tools, such as the SERP Snippet Generator and the hreflang validator, check your site's Visibility Index and, of course, use the Google Update Radar. If you want to get TrendWatch, SectorWatch or IndexWatch, which I have covered over the last couple of weeks as well, you can go to sistrix.com/blog, subscribe to the newsletter and you'll get them delivered to your inbox every single month. Shall we dive into some AI search engine wars, Mark? I feel like that's what's happening in 2023.

Mark: It's interesting, isn't it? I think I'd like to talk a little bit about Bing. I want to talk as little about ChatGPT as possible because I just feel completely saturated.

Jack: Same.

Mark: But in the context of what we do day-to-day as SEOs, I think the more interesting thing for me that's been happening is just how badly the Google event went versus how well the Microsoft event went.

Jack: Which is not something we usually say in search very much. It's not usually like, oh, Microsoft nailed it and then Google kind of messed it up. Tends to be the other way around.

Mark: So I'm sure everyone in the industry is aware of Google's rather, I think disastrous is a fair thing to call it, event, as it wiped $108 billion off their market cap immediately.

Jack: It was like a 9% drop in shares or something like that.

Mark: Yeah.

Jack: It was absolutely insane.

Mark: It was bad Bard. So for those that don't know, for the few, maybe two listeners who don't know, I don't know, everyone must know, but yes, obviously we know Bing has been talking about integrating this kind of chat AI into search. There was all that news about the Google red light, red alert, "we're doing something", and people like me saying, well, actually, Google's been doing this AI thing for ages, I'm sure they've got it all covered.

Jack: Yeah.

Mark: And then I didn't actually see the event, but apparently someone forgot to take a phone as well, which had some of the presentation on it, which is like 101 stuff even when I'm organizing SearchNorwich for 60 people.

Jack: So SearchNorwich is more prepared than the Bard announcement.

Mark: Than one of the biggest tech companies showing the world...

Jack: One of the biggest companies, full stop, in the world.

Mark: Yeah.

Jack: Yeah.

Mark: So that was kind of weird. And then the main thing that caused this outcry, I think there were actually two things wrong, but essentially in the ads that they ran about Bard, it got some of its facts wrong, which for a pre-packaged video is kind of a thing you'd maybe check.

Jack: Yeah. Yeah.

Mark: Before showing everyone.

Jack: It was specifically talking about the James Webb Space Telescope. There was a little snippet about that and, hello, I have an astrophysics degree, I can talk about this kind of stuff. Essentially what it said is that the James Webb Space Telescope took the first pictures of exoplanets, which are planets outside of our solar system, typically orbiting other stars in other solar systems. And that's not technically true. It is the first telescope to take a direct image of an exoplanet in front of another star, because usually stars will be too bright and planets are not bright. So if you look in that direction with a fairly low resolution, you would just see a burst of light that is the star and you can't even see the planet because it's so much smaller.

The James Webb Space Telescope was the first telescope to take a direct image of an exoplanet crossing in front of a star. That is not the same as taking the very first image, because we had indirectly taken photos and images of exoplanets before; that was done about 15 years ago by a completely different observatory and a different telescope, which didn't get any credit at all. So it was this weird, you're-not-wrong-but-you're-also-not-correct situation. I think we see that a lot with AI results, where it kind of gets the gist of something but doesn't actually get the data correct. And as you said, it's a pre-prepared statement. Edit it. What are you doing? How could you let it slip through?

Mark: I just imagine someone with your background, Jack was the person that saw that and was like...

Jack: All right. I saw it on Twitter over this.

Mark: Yeah.

Jack: I saw angry physicists on Twitter just being like, "I worked on that program. I was one of the people that worked on the original telescope that actually first observed exoplanets. You bastards. You didn't give us credit." But yeah.

Mark: So as I said at the top of the show, I want to avoid I guess slipping too far into the kind of AI ChatGPT conversation because it's all we are hearing about now that... The overview I would give is that in its current form, say ChatGPT as a large language model, there are some limited applications that are helpful for SEO. Certainly not the things we've seen where people are saying to ChatGPT, "Hey find broken links on this website. Hey cluster keywords."

Jack: Yeah.

Mark: Hey, like... It can't do those things. Or it can pretend really well and fool you it can do those things but basically it can't do those things.

Jack: Yeah. It will often, as we said, provide an answer that is not necessarily wrong but is not right in many ways if you actually know the subject. If you go into a subject not knowing it and just ask ChatGPT, it'll be like, "Yeah, this is correct," and there's no way of fact-checking that if you don't know, if you're not then going and doing your own research and all that kind of stuff. So yeah, it's an easy pitfall to fall into for sure.

Mark: So this is what I'm interested in, which is Bing has obviously opened up signups for its new search with this AI chat integrated, and we'll put a link in the podcast notes at search.withcandour.co.uk to go over and sign up if you haven't already. They offer you a kind of jump-the-queue if you download Edge and set all your defaults to Microsoft, but I'm prepared to wait, to be honest.

Jack: I saw a tweet, I apologize I'm not crediting this, but it was "I would rather eat glass than set Edge as my default browser."

Mark: I mean, I don't see why, Edge is Chromium now, but-

Jack: It is.

Mark: ...I think it's just a longstanding like...

Jack: Thing.

Mark: Anyway, we're getting around to the point here. So our friend Brodie Clark has got access apparently and posted today a really interesting screenshot of this live AI chat working in Bing, and he asked, "Does Brodie Clark like the new Bing?" Which I thought was an interesting question, because that should be a pretty hard question for a computer to answer: it's about a specific person, it's about a subject, and it's a nuanced thing.

Jack: It's an opinion.

Mark: Yeah. So it is a tricky thing and this is the answer that it generated. "Hello, this is Bing." I like it letting you know it's AI.

Jack: Yeah. Yeah.

Mark: "According to the web results, Brodie Clark is an independent SEO consultant and organic growth advisor" and it's citated sources here. So LinkedIn.

Jack: I do appreciate the citations.

Mark: Nice. "Who has posted about the new Microsoft being on Twitter and LinkedIn. He said that he got access to the new Microsoft Bing on February the 8th and that the chat functionality is epic and works well." Interestingly it's bolded things like epic and works well as I assume it realizes that's connected to the sentiment related to the question. He also wrote a review, bolded, of Bing's new AI powered chat results on his website. Again citated brodieclark.com. "Based on these results it seems Brodie Clark likes the new Bing." Smiley face to cap it off, which I found kind of creepy, the emoji, but as a result I thought that was bloody fantastic.

Jack: Yeah, yeah.

Mark: It's correct. It's worked out who Brodie Clark is, and if it's got it wrong, I guess it's given you the source where it's got that from. The interesting thing, especially for me here, is that this is based on an LLM, a large language model, which is pre-trained, right? And this is one of the much-talked-about shortcomings of ChatGPT, which is basically that it goes up to 2021 and then it knows nothing after that.

Jack: Yeah.

Mark: This obviously, I think we can guess from it saying "according to web results", what it's doing is using the web data in Google's index and using the language model to parse the information it can see.

Jack: In Bing's index, you mean.

Mark: In Bing's index. What did I say?

Jack: Google.

Mark: Google. Definitely not Google.

Jack: Definitely not Google's index. If any is, that's very naughty.

Mark: Well, maybe. Bing was once caught scraping Google and returning the same results.

Jack: True. True.

Mark: Sorry Bing, I apologize. Well, it does say "according to web results", but it doesn't actually give a source for that.

Jack: That's true. That is true.

Mark: It could be, I don't know.

Jack: According to recent Yandex leaks.

Mark: But what it is essentially doing here is, ChatGPT already has that functionality: if you paste a bunch of text in, you can say summarize, and give it conversational instructions to call out stuff from that. That looks like what this is doing. And that's interesting because you've got whatever pre-baked, I hesitate to call it knowledge because it's not really knowledge, it's just a statistical probability of a bunch of tokens. But this seems to be an interesting use because they're overlapping the search result relevancy with the LLM conversational kind of input. So they've got a decent answer there. I think what it gives to the LLM is essentially some neater parameters to work within. There's another step maybe of fully integrating it with a search engine's knowledge graph. I don't think they're there yet with that. But to give you an example, you can go to ChatGPT, right? And you could say to it, "Make me a new recipe with Huel for dinner." Right?

Jack: Right.

Mark: And it will be like, "I've got this new recipe for you, it's Huel vegetable stir-fry." And it just gave me the instructions to make a vegetable stir-fry. And at the beginning it was like, "Put a scoop of vanilla Huel in there." And it's like, this is the thing of like...

Jack: A nice sweet and sour sauce with vanilla Huel in it.

Mark: But this perfectly...

Jack: If you want a chunky, oaty, dusty...

Mark: This perfectly describes to me what you said earlier, which is it's not incorrect, because it's edible, it's food, it's a recipe, but it's quite obviously not correct.

Jack: Yeah.

Mark: Because no.

Jack: As a man who consumes a lot of Huel, even Mark Williams-Cook would not stoop that way.

Mark: Yeah. So I think Bing has done a good job here, in that this mesh of search results, I don't know if they're live searching or whether, my gut would say, they're probably just checking straight what's in their index. I haven't thought about that too much. But that seems to be working quite well and is quite exciting. From an SEO point of view, if more people switch to Bing, I think that's only good for the industry, more competition.
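What Mark describes here, retrieving documents from a search index and then having a language model summarise and cite them, is the pattern usually called retrieval-augmented generation. The sketch below is purely illustrative of that idea, not how Bing actually does it: the search_index() helper is a hypothetical stand-in for a search index lookup, and OpenAI's public chat completions endpoint is used simply as a readily available language model.

```python
import os
import requests

def search_index(query: str) -> list[dict]:
    """Hypothetical stand-in for a search index lookup.

    In a real system this would query an index and return the top
    documents with their URLs so the model can cite them.
    """
    return [
        {"url": "https://brodieclark.com/",
         "text": "Brodie Clark reviewed Bing's new AI-powered chat results..."},
        {"url": "https://www.linkedin.com/",
         "text": "Brodie Clark is an independent SEO consultant..."},
    ]

def grounded_answer(question: str) -> str:
    # Step 1: retrieve documents relevant to the question.
    snippets = search_index(question)
    context = "\n\n".join(
        f"[{i + 1}] {s['url']}\n{s['text']}" for i, s in enumerate(snippets)
    )

    # Step 2: ask the language model to answer using only that context,
    # citing the numbered sources it drew from.
    payload = {
        "model": "gpt-3.5-turbo",
        "messages": [
            {"role": "system",
             "content": "Answer using only the numbered web results provided. "
                        "Cite sources like [1]. Say so if the results are insufficient."},
            {"role": "user",
             "content": f"Web results:\n{context}\n\nQuestion: {question}"},
        ],
    }
    resp = requests.post(
        "https://api.openai.com/v1/chat/completions",
        headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
        json=payload,
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(grounded_answer("Does Brodie Clark like the new Bing?"))
```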

Jack: Yeah.

Mark: I saw someone talking about how basically Microsoft is prepared to go at this no matter the cost because their revenue as a company from search is way lower than Google in terms of their percentage dependence.

Jack: Right. So if they take away a percentage, it's a much bigger difference to Google than it is to them.

Mark: Yeah. Well the cost incurred to Google is I guess kind of linear to their search market size.

Jack: Right.

Mark: And Google obviously can lean on the money they generate from search to do stuff like provide their cloud services, whereas Microsoft make a lot of their money from cloud services. So if you said, okay, for every search we do, we are adding a penny, cent, onto your cost for Microsoft, that's probably quite doable because a lot of their revenue comes from other sources. For Google, that's like a massive cost.

Jack: Right, yeah.

Mark: So it's interesting in a way that it's like, okay, cool, it benefits us but it really hurts you from a competition point of view. And I found that really interesting. So I'm quite excited. I'm not particularly worried about the whole AI in search thing yet, rightly or wrongly. I don't think SEO's going away. I just think we're going to have to adapt and we're still dealing with systems here that can be optimized for. And I think it's super exciting because the fundamentals of SEO I don't think have changed particularly for over a decade.

Jack: Yeah.

Mark: We've been doing the same thing. There's different, there's schema now or there's different tactics, but...

Jack: "Just make good content," says Google for the 114th time.

Mark: But I think this can fundamentally, it's a big change and that's exciting.

Jack: Definitely. And SISTRIX actually dove into a little bit of analysis about the Google Bard side of things. Johannes from SISTRIX talked about the presentations we've just been covering and essentially the risks associated with them. Google still has a lot more power, as you were saying, Mark; there's a lot more that they can do in terms of SERP features and all this kind of stuff. But that could change a lot when it comes to AI, and we do run the risk of inaccurate data being pulled through to, you mentioned the knowledge graph earlier, something I covered recently with Sodiq Ajala, talking about mistakes that will rule you out of the knowledge graph and things like that. You could see a lot of different changes and a lot of volatility as soon as this stuff gets to people...

If it's citing LinkedIn as a source, which is a very common one for knowledge graph stuff, and seems to also be what Bing is doing from the little snippet there from Brodie Clark, it's this interesting kind of introduction of volatility that I think a lot of SEOs don't want to see in their SERPs. A lot of people track that kind of thing when it comes to Google updates and all that kind of stuff, and I think it's very, very possible that there could be a big shift in SERP features when AI is properly integrated. How that will affect us as SEOs, and how much it affects your customers and your websites and all that kind of stuff, is up for debate, I guess. But I know a lot of people put a lot of weight on knowing what a SERP is going to look like for a particular result and understanding where their sites or their clients are, or whatever it is, and understanding what Featured Snippets you have or whether you have this knowledge graph or whatever. I can see a lot of SERP features changing quite a lot when this is introduced.

Mark: The article written by SISTRIX gives some thoughts based on the type of search results. So they're saying things that give immediate answers already, like when do the clocks go back, what's this sum, kind of thing. The traffic's already kind of gone for them in terms of Google. The clicks are already low. They highlighted Featured Snippets. So they've said, "Anyone who achieves a high proportion of organic traffic via featured snippets today should expect this traffic to disappear in the future." Yeah, I think that's probably correct because the biggest issue that I've seen with Featured Snippets isn't Google ranking the wrong page. It's sometimes getting the wrong bits of information from that page and then presenting them outside of the context of the website to give a wrong answer. We've seen those medical queries...

Jack: We bring this up every few weeks, pretty much, that whole stick-your-fingers-down-your-throat-to-make-sure-you're-not-choking thing. That was a snippet from a longer sentence that says, "Do not stick your fingers down their throat if a person is choking." But just take off the "do not" and then, yeah, stick your fingers down their throat. That'll be fine.

Mark: I think this is where the chat models will really help, though, because with that Bing parse by the language model, I think you'll get much higher accuracy of it understanding the, oh, okay, that "do not" bit was really important.

Jack: Understanding context is so important, right? Yeah, absolutely.

Mark: Yeah. I mean, I'm with SISTRIX, I think it's a bang-on analysis. I would imagine that kind of AI answer will be replacing Featured Snippets because that's good for the user. In terms of SEO, though, I think it will depend how it pans out with citations et cetera: how it features sources, and therefore what the impact on clicks is. Because there's still going to be a way to optimize for that. It's still going to pick what it thinks is the best answer from the best sources. So yeah, the jury's kind of still out on that, I think.

Jack: Yeah, definitely. If you want to dive into that analysis from Johannes from SISTRIX, go to sistrix.com/blog and of course links will be in the show notes at search.withcandour.co.uk.

Mark: People also ask trending.

Jack: Speaking of SERP features, more SERP features.

Mark: SERP features.

Jack: Perhaps Candour's favorite SERP feature, since you and the developers literally built a tool to analyze this specific SERP feature, Mark.

Mark: If you haven't heard of it, it's alsoasked.com.

Jack: Oh, subtle plug. Nice.

Mark: Thank you.

Jack: Nice.

Mark: So now we have, on People Also Ask results, some of these are being labeled with trending, and this was discovered originally, as far as I can tell, and tweeted by an SEO named VJ on Twitter. I'll put a link in the show notes at search.withcandour.co.uk. And he was kind enough to provide the query that he used, which was Valentine's Day, New York, 2023. I've been unable to replicate this.

Jack: Same.

Mark: I don't know if you've...

Jack: I have tried with a couple of VPNs and stuff, tried in a couple of different regions. So the example VJ gives is Valentine's Day, New York, 2023, and we are in fact recording this on Valentine's Day. I tried to replicate it and it didn't work. I tried different regions through the VPN and stuff, and I tried other trending things, things that happened recently like the Brit Awards and all that kind of stuff, and couldn't find anything, unfortunately, from my end of it.

Mark: So there hasn't really been any official information released about this trending label, as far as I'm aware. From what it says on the tin, I would assume it doesn't necessarily have to be a new search; it's just something where the search volume has suddenly increased, which is data we've had available through Google before for some key phrases, through various tools of theirs. I think it is particularly interesting for PAA data. One thing we have in the roadmap for Also Asked, after the API, is PAA monitoring. So getting alerts when new PAAs appear-

Jack: Interesting. Right, yeah.

Mark: ...From search queries. And of course we immediately got asked by several people, are we going to be integrating trending into Also Asked? I guess the answer to that would be, first I need to actually see it, because I haven't got to see it yet, and I'd be interested to know if this is something Google's going to stick with; we probably won't look at integrating it until we are sure it's something Google's going to roll with. Because these tests are quite common, where Google changes something, a few people talk about it, they see it, and then it vanishes, sometimes for a year or two, and then it comes back.
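There's no public API for the trending label or, yet, for Also Asked's monitoring, so the snippet below is only a sketch of the general idea behind PAA monitoring: snapshot the People Also Ask questions for a query on each run, diff against the previous snapshot, and alert on anything new. The fetch_paa() helper is a hypothetical placeholder for whatever PAA data source you have.

```python
import json
from pathlib import Path

STATE_FILE = Path("paa_state.json")

def fetch_paa(query: str) -> list[str]:
    """Hypothetical helper: return the current People Also Ask questions
    for a query, e.g. via a SERP data provider. Hard-coded for illustration.
    """
    return [
        "What to do in New York on Valentine's Day 2023?",
        "Where to eat on Valentine's Day in NYC?",
    ]

def check_for_new_paa(query: str) -> list[str]:
    # Load the questions seen on the previous run (empty on first run).
    state = json.loads(STATE_FILE.read_text()) if STATE_FILE.exists() else {}
    previous = set(state.get(query, []))

    current = fetch_paa(query)
    new_questions = [q for q in current if q not in previous]

    # Persist the latest snapshot so the next run diffs against it.
    state[query] = current
    STATE_FILE.write_text(json.dumps(state, indent=2))
    return new_questions

if __name__ == "__main__":
    for q in check_for_new_paa("valentine's day new york 2023"):
        print(f"New People Also Ask question: {q}")
```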

Jack: Sometimes forever.

Mark: Sometimes forever.

Jack: Sometimes it never comes back.

Mark: Yeah, sometimes it makes it to that great big Google graveyard of...

Jack: The cloud in the sky.

Mark: So yeah, I think it's particularly interesting. I mean, to me it does bring more focus to PAAs. I've talked a lot about them; I think they're one of the best sources of data that you can get for keyword research, and looking at new topics in PAAs, trending topics, I like them just because if you can get there before anyone else, you're basically guaranteed to rank.

Jack: And trending is a perfect example of that. I know it's something you've touched upon, Mark, how quickly PAAs can change, literally by the minute, by the hour, as things are happening, a big world event, or, as I mentioned, the Brit Awards or the Met Gala and all these kinds of big events. If you Google celebrities' names, the PAAs will change: oh, what are they wearing? What song did they sing at this thing? That will change as we go through, and I guess trending is an indication of that: oh, people are searching around this topic and this is now coming through. Yeah.

Mark: Well I hope they keep it because I think it's cool.

Jack: Yeah.

Mark: So if anyone at Google's listening push it to live, to main, let's have it.

Jack: "Let's have it," says the founder of Also Asked. So moving away from SERP features, should we dive into a bit of Google Search Console?

Mark: I love a bit of Google Search Console as well.

Jack: Yeah, yeah. I love a bit of Google Search Console, and this is something that is actually experimental, which again kind of touches on the things we've been talking about with SERP features and AI and all that kind of stuff, a lot of experimental chat this week. This is essentially some content ideas within your Google Search Console, and people have either started clapping and applauding or gasped in shock and horror. I assume we'll have one of those two reactions.

Mark: I think it's really cool. So apparently this was launched back in December and was only available in India and Pakistan, as a replacement. And we did talk about it many, many, many episodes ago: the Google Question Hub.

Jack: We did.

Mark: Which was highlighting potential queries where Google didn't necessarily feel like it had good results, which is potentially a gold mine of information. Which is them saying we have demand and no supply. Now I saw an Andy Simpson tweet about this from the UK, so it seems that this has also been rolled out to other countries now. Again, very disappointingly, very sad. I haven't been able to find it in any of the accounts that I've been working on.

Jack: Same. I scrolled through, I had a look on all of our different clients' Search Consoles, and couldn't find anything. And it's apparently per property as well, from what I've been seeing, where you can see it pop up in certain properties for a domain and then not in other ones. So it seems to be very, very experimental and almost hit-and-miss, I guess, and there's no consistency to it. Just because you have access to the thing doesn't mean it will suddenly appear across all of your clients or all of your sites.

Mark: Yeah. I mean, again, for the kind of keyword analysis that we have to do, we've got SaaS platform tools with metrics like keyword difficulty, which, whatever you think about the value of those metrics, are essentially there to try and help you find gaps: this is a search term, this is the value of the search, i.e. how many searches it has a month, whatever. They normally give you the cost per click, to try and give you some metrics to calculate commercial value, and then there's the difficulty, which is essentially how much competition there is. As far as I'm aware, we've never had this level of data from Google, which is basically telling you where the gaps are.
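None of the SaaS tools publish exactly how their scores are put together, so the snippet below is only a toy illustration of the gap-finding idea Mark describes: combine estimated monthly volume, CPC as a proxy for commercial value, and a 0-100 difficulty score into a single opportunity figure. The keywords, numbers, and weighting are all made up.

```python
from dataclasses import dataclass

@dataclass
class Keyword:
    phrase: str
    monthly_volume: int   # estimated searches per month
    cpc: float            # cost per click, a proxy for commercial value
    difficulty: int       # 0-100, higher = harder to rank for

def opportunity_score(kw: Keyword) -> float:
    """Illustrative only: estimated monthly value discounted by difficulty."""
    commercial_value = kw.monthly_volume * kw.cpc
    return commercial_value * (1 - kw.difficulty / 100)

keywords = [
    Keyword("how often do couples argue", 1900, 0.40, 22),
    Keyword("divorce lawyer near me", 9900, 8.50, 78),
]

# Rank the biggest apparent gaps first.
for kw in sorted(keywords, key=opportunity_score, reverse=True):
    print(f"{kw.phrase}: {opportunity_score(kw):,.0f}")
```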

Jack: Yeah. And we talked about Search Console as being essentially the closest you get to first-party data, right? Because, as you said, there are so many tools that do this kind of thing already and give you a metric or a number or a score or whatever it is: oh, there is probably a gap here, this thing is going to be easier to rank for than this thing. But very, very rarely do we get that kind of data, or even an inkling of it, from Google directly. And interestingly, the little snippet at the top there, that is kind of the announcement: get inspiration for new content; content ideas are based on searches that might lack good results. I think that ties back to what you were just saying about PAAs as well, Mark. Here's an opportunity, this hasn't been covered yet. If you can be the first to get to this thing, if you're able to answer this query, tackle this topic, then there's a good chance of ranking for it pretty quickly.

Mark: There are two things interesting me particularly about this data. Firstly, if it was made available via the API, oh my, I think we're going to see some spam. But secondly, as you said, the definition of those questions was "we don't think we have good answers". And in this screenshot, one of the examples is "How often do you fight with your spouse Reddit?" And why I find that so interesting is I guarantee there is a post on Reddit asking how often do you fight with your spouse. It just sort of says to me, maybe Google doesn't particularly like ranking sites like Reddit where it can avoid it, because one of the kind of litmus tests I've always used for content is, oh, Quora's ranking, Reddit's ranking top. I reckon...

Jack: We can outrank that.

Mark: I reckon if I just make a "proper page"-

Jack: Yeah.

Mark: I can probably outrank it.

Jack: Yeah.

Mark: And it just interests me because it's unlikely you are going to outrank Reddit for a search term that includes Reddit.

Jack: The word Reddit. Yeah.

Mark: So these are not vetted in terms of the intent of the search, at least to any smart level. It's just saying that we don't think we have good content. And I see you Googling it there.

Jack: There is literally, on r/AskWomen, "how often do you fight with your boyfriend/girlfriend/spouse", and this is the one result.

Mark: So, yeah. Word for word.

Jack: Yeah.

Mark: So there is that word for word query answered on the site. So I just find that interesting because it does say to me, it backs up that Google's like, if I can't find anything else, I'll use Reddit and otherwise I want something that I know who the author is or I know who the website is-

Jack: Coming back to the E-E-A-T stuff, right?

Mark: Yeah, I just find that really interesting and I hadn't seen anyone else kind of mention that before.

Jack: Yeah, I think it's really particularly... It almost seems similar to PAAs, right? These seem to be question based, they seem to be the kind of things... I don't know how tied it is to a particular topic or a particular domain, whether this domain covers, I don't know, like divorce lawyers or something. Because all the questions in this screenshot here are again available at search.withcandour.co.uk if you want to check it out yourself. They're all about spouses and how often do you fight and what to do when you separate from your spouse and all this kind of stuff. I guess you need to judge how relevant that is. And again, without us having the context and the insider knowledge and being able to access it ourselves, there's no real way of telling how relevant it is. And yeah, is this similar to PAAs where there's nothing that is really covering it and it's essentially "no volume?"

Or is this just Google wanting more data? Can you give us more data, please provide us more stuff to crawl and index, basically. It's a weird thing for me. I think it could be really, really useful; you're totally right about something like an API. The amount of use I've had out of the URL Inspection tool API over the last year or so has made a huge difference to my workflow, and this could be a massive change for a lot of content creators. So yeah, it's very interesting, and unusual, as we were saying, for Google to make such a direct comment on what is ranking, what is not ranking, "we don't think we have a good enough result for this." How do they judge that? Like you say, is Quora good enough but Reddit's not good enough? It's a weird kind of...
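As an aside on the URL Inspection tool API Jack mentions: it's part of the Search Console API, and a minimal call looks roughly like the sketch below, assuming you already have OAuth credentials authorised for the property and the google-api-python-client library installed. The response field names reflect the public documentation as best we recall, so treat them as indicative rather than guaranteed.

```python
from googleapiclient.discovery import build

def inspect_url(creds, site_url: str, page_url: str) -> dict:
    """Query the Search Console URL Inspection API for one URL.

    `creds` is assumed to be an authorised google.oauth2 Credentials object
    with the webmasters.readonly scope.
    """
    service = build("searchconsole", "v1", credentials=creds)
    response = service.urlInspection().index().inspect(
        body={"siteUrl": site_url, "inspectionUrl": page_url}
    ).execute()
    return response.get("inspectionResult", {})

# Example usage (property and page URLs are illustrative):
# result = inspect_url(creds, "https://example.com/", "https://example.com/blog/some-post/")
# status = result.get("indexStatusResult", {})
# print(status.get("coverageState"), status.get("lastCrawlTime"))
```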

Mark: Quora is definitely never good enough.

Jack: Quora's definitely not good enough.

Mark: Sorry guys.

Jack: Sorry Quora team, if you are listening. But yeah, I find that very interesting. What kind of methods and metrics are they using to judge how good it is? Do you know what I mean? It feels a bit wibbly-wobbly to me so far. Maybe that's why it's in the experimental phase.

Mark: Wibbly-wobbly, yeah. I mean, the theme of this episode, I guess, has been that there is a lot in flux at the moment with AI content, with both search engines seemingly integrating huge new features, and then things that would normally be much more relevant, like content ideas and PAAs changing, seem like small news now. But my general advice, I guess, maybe if you are newer to SEO and you're like, oh no, there's all this stuff happening, would be don't worry about it, in the sense that you can't...

Jack: Are you about to quote Google here like, "Just create good content."

Mark: No, no. In the sense that you can't fight the change. If AI stuff is coming to search, it's coming to search and it's going to be there; the best thing you can do is roll with it, learn how to adapt and embrace it. Don't ignore it, don't try and hide from it, don't try and argue against it, because you'll lose; that's the way things are going.

Jack: Yeah. I for one, welcome our robot overlords.

Mark: It's genuine advice, though, because I've seen a lot of people get quite stressed about what's changing and is SEO going to exist? And that's been the running joke for years, right: SEO is dead. And even with these changes, I can't see SEO being dead, because there are still machines picking up on stuff and analyzing it, and they're still using metrics and judging who to trust. And although it's getting closer to that original Google mission of "make good content", that's, when you unpick it, never as simple as it sounds.

Jack: Absolutely. If it was, we'd all do it and then it would just be...

Mark: Then SEO would be dead.

Jack: Exactly. SEO would be dead because everything would be a level playing field and the index would just be full of fantastic content and answers to every possible question you can think of, but it's not. So we've still got a job. You're out there, SEOs, you've still got a job to do. Don't worry. Robots haven't taken over and taken our jobs yet.

Well that about wraps us up for this week. Thank you for joining me, Mark, been a pleasure to chat.

Mark: Of course.

Jack: AI, SERP features, experimental features and volatility and all that fun stuff. We'll be back next week with more SEO and PPC interviews and news as always. But until then, thank you for listening and have a lovely week.