Candour

Episode 113: LaMDA, ‘Also covered’, AMP and redirects

Play this episode:

Or get it on:

What's in this episode?

In this episode, you will hear Mark Williams-Cook talking about:

LaMDA: Google's breakthrough in conversational technology

Also covered: A new SERP feature being tested by Google and its relation to PAAs and MUM.

AMP: Why are SEOs ditching AMP?

Redirects: Core Web Vitals, redirects, and naughty SEO

Show notes

Aleyda Solis tweet about AMP

https://twitter.com/aleyda/status/1396021867434160128

Mordy Oberstein tweet

https://twitter.com/MordyOberstein/status/1397600283107201026

Glenn Gabe tweet about Also covered feature https://twitter.com/glenngabe/status/1397571565206835205/photo/3

Andrew Charlton tweet about new Google sheets script https://twitter.com/bertiecharlton/status/1396761378267009025

Google blog: Sunsetting the generic rich results search appearance on Search Console https://developers.google.com/search/blog/2021/05/rich-results-search-appearance-sunsetting

Google blog: LaMDA: our breakthrough conversation technology https://blog.google/technology/ai/lamda/

Transcript

MC: Welcome to episode 113 of the Search with Candour podcast, recorded on Friday the 28th of May 2021. My name is Mark Williams-Cook, and today, I've got a whole range of things to talk about on the podcast, from what people are doing with their AMP, their Accelerated Mobile Pages, in regard to the Google page experience update, what we've learned from that and why they might be doing it. We're going to talk about ‘Also covered’, a new type of search feature that looks like it's being tested by Google, and some interesting relations between that and other progress and updates Google's recently made. I've got a tool I want to talk about very quickly. We're going to mention core web vitals being passed over redirects, and touch on another technology "breakthrough" from Google in the guise of Google LaMDA.

Before we kick off, I want to tell you, this podcast is very kindly sponsored by Sitebulb. Sitebulb, if you haven't heard of it, is an SEO auditing tool. It's desktop based for Windows and Mac. They've got a special deal for Search with Candour listeners. If you go to sitebulb.com/swc, you can get an extended 60-day trial of their SEO auditing tool Sitebulb for free. No credit card or anything like that required. I tweeted about this a week or so ago, I love that Sitebulb sponsored this podcast because it's a tool that I've used and we've used in the agency for many years now so I'm really happy to talk about it.

One of the features I wanted to very quickly mention, and this is what makes Sitebulb really good in my opinion, is all these little details that come together and make the tool really nice to use. If you've done a specific update to a certain page, or you just want to quickly check one page, instead of having to crawl the whole site, they actually have a single page analysis tool built in as well. Normally you would have to set some rules up or just crawl the whole site, and then it goes through their analysis engine, which looks at the data and everything they've crawled and tries to work out what issues and opportunities are present, but Sitebulb have added this really neat little single page analysis tool so you can just look at one URL. So even if you're using Sitebulb, you might not know about that. It's under the tools menu in the upper-left part of Sitebulb. But going back to the point of them sponsoring the podcast: if you haven't tried Sitebulb, give it a go. Sitebulb.com/swc.

I'm going to kick off with this poll I saw run on Twitter by well-known SEO, Aleyda Solis. This was on May the 22nd, so almost a week ago, and she wrote, "For those who are using AMP on your sites, is the upcoming change in regards to the page experience update and AMP no longer being a requirement for top stories, no AMP badge to be shown, etc, will it make you change your AMP usage?" She had 352 votes on this poll. Around 40% voted just to see the results, so we had responses from about 200 people here.

The results are interesting but maybe a little bit confusing. 41% said they were actually going to remove AMP. If you listened to our earlier podcasts, we've spoken several times now about the page experience update and the arrival of core web vitals, which is related to this move by Google. Essentially, Top Stories, the news stories that get pulled into the universal search result at the top of Google, currently require Accelerated Mobile Pages, AMP, and Google is removing that requirement. Google has always pushed AMP with mobile-first, performance-based reasoning, and with core web vitals we now have generic metrics that are applicable to any site and are all about the page experience, so AMP shouldn't necessarily be a requirement, although AMP pages will likely score highly on core web vitals. Google's saying that it doesn't have to be an Accelerated Mobile Page; as long as it performs well against their core web vitals metrics, then you can rank in Top Stories.

The headline that Search Engine Land went with for this poll was that 40-odd percent of SEOs are going to be removing AMP after this page experience update. I think that's interesting because it's maybe reflective of the fact that there is a developer and technical overhead to managing Accelerated Mobile Pages, because you're essentially running almost two versions of your site, so it is a bit of a pain and not always the easiest thing to look after long-term. And there are some people that object to AMP just on the grounds that it's Google hosted, a monopoly, blah, blah, blah. I don't want to go over that territory again. Lots of people talk about that.

But the interesting thing about the poll was that 23% said they were going to keep AMP, which is fine. I can also understand that: if you pushed maybe a year ago, or a year and a half, two years ago, to get this through your development team, to have Accelerated Mobile Pages, and then found out Google doesn't really care about them as much, it's going to look pretty bad if you then say, "Can we now spend extra time getting rid of them?" There's got to be a cost benefit to doing that.

35% of people, and this is what confuses me, so just over one in three, said they still have plans to add Accelerated Mobile Pages. And it's odd because, firstly, the poll was aimed at people who are already using AMP. I guess this might again be a case of, well, we need to improve core web vitals and AMP is one way to do that, and maybe it's already in the developer pipeline. I'm not sure you can draw any real conclusion. Like the 40% of people removing AMP, I don't think it tells the whole story. There's a lot of context to the situation people are in, technically and internally, with the relationships between all the different people involved, the stakeholders in this type of decision.

What I would say is, if you are planning on removing AMP, there are two things I would keep in mind. Google is rolling out this page experience update slowly, they've said, over a few months. Now, my hope would be that as soon as they flick whatever switch it is (I'm sure it works like that, there's a switch) to start rolling out this page experience update, it would immediately remove this requirement for AMP and balance out those ranking factors.

The thing that would worry me is, Google has talked about this slow introduction of, specifically, core web vitals around this page experience update, right? Now, if that weighting of core web vitals is going to change over time, I'm not sure, and I don't think anyone outside of Google could be sure, about how that would affect pages which are then competing with AMP pages to rank. It may well not be relevant, but it's just something I would keep in mind. And also, if you are removing them, do it on at least a test section first. So don't just rip everything out and see how it goes, unless you're feeling particularly brave, and if you do, share the results with us. But with anything like this, these updates are complicated. There's lots of interconnected stuff going on, so I would be very cautious about removing and changing things, especially if you are getting traffic through Top Stories and things are working well for you.

So I just wanted to share that. We thought it was an interesting poll. Interesting to see people stepping back from AMP after this time. Realistically, I can't wait to see the actual analysis of how well pages perform in Top Stories, core web vitals versus AMP versus non-AMP pages, and whether we will see a difference. Because, again, the way this is worded says that AMP is not a requirement for Top Stories. It doesn't say that it's not still advantageous to use AMP. I'll leave that out there, and again, I'll wait until we actually see the data to make any decision.

This is just a quick segment, because I found this a particularly interesting type of search result: the ‘Also covered on this page’ feature. I originally saw Mordy Oberstein from Wix tweet about this, and he said, "Don't think I've seen ‘Also covered on this page’ as seen here under the featured snippet. For the record, it's not just pulling the headers from the page either." And he's done a little video where he's Googled "domain authority is a metric". I don't know what that search term means exactly, but the point he is showing is there's a featured snippet from moz.com, and under that Google has put, "Also covered on this page: is domain authority important? What are DA and PA?" Essentially a list of extra questions that that page answers that are related to the search.

It's kind of like a horizontally organised People Also Ask box. As you know, I'm super interested in People Also Ask data. I think it's still hugely undervalued by SEOs. We see People Also Ask data when you've done a search and Google says, "Hey, these are maybe other helpful questions you could ask." They seem to be testing out this SERP addition to say, well, you've searched for this, and it might be helpful for you to know this page also answers these questions.

I did a little bit more digging and I just want to give Glenn Gabe credit; I believe he was the first one to notice this. Glenn's tweet is linked in the show notes at search.withcandour.co.uk, so if you want any of the links that I talk about in the podcast, just go to that URL and you'll find them in the transcript. Glenn tweeted back on May the 26th, so a couple of days ago. He had done a search for "how much do braces cost", and the top-ranking site with a featured snippet, interestingly, is valuepenguin.com. You've then got a horizontal break line, and then it says, "Also covered on this page: are braces more expensive for adults? Do all braces cost the same? How much are braces in NY?"

Interestingly, it seems to only be showing on featured snippets, maybe as a way to, again, give an overview of what else is on that page. And if you actually click on those additional ‘Also covered on this page’ links, it will use scroll-to-text highlighting. Glenn, again, has kindly provided an extra couple of screenshots: when he clicked on "are braces more expensive for adults", or one of these links, it took him to that part of the page and highlighted the text.
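That scroll-to-text highlighting is the browser Text Fragments feature, which uses a `#:~:text=` URL fragment. As a minimal sketch of how such a deep link is constructed (the page URL and quoted snippet here are illustrative, not taken from the actual SERP):

```python
from urllib.parse import quote

def text_fragment_url(page_url: str, snippet: str) -> str:
    """Build a scroll-to-text deep link using the Text Fragments syntax.

    Supporting browsers (e.g. Chrome) scroll to and highlight the first
    occurrence of `snippet` on the page.
    """
    # Percent-encode the snippet; characters like '-' and ',' have special
    # meaning inside fragment directives, so encode everything.
    encoded = quote(snippet, safe="")
    return f"{page_url}#:~:text={encoded}"

# Illustrative example in the spirit of Glenn's screenshots:
url = text_fragment_url(
    "https://www.valuepenguin.com/braces-cost",  # hypothetical URL
    "braces more expensive for adults",
)
print(url)
```

Pasting a link like that into a supporting browser jumps straight to the matching passage, which is exactly the behaviour Glenn's screenshots show.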

The last bit I wanted to pull out of this, which I found particularly interesting, was in Mordy's tweet when he said, "For the record, it's not just pulling the headers from the page." I find that hugely interesting because this is finally stitching together all these parts of the machine that Google has been working on, in that it's understanding the intent that the page is answering. So in the ‘Also covered on this page’ feature, it's not just listing the headers; it's saying this is the related intent that this page can answer.

And on the surface that may not seem important, but I think it's hugely important because, again, it shows a divergence away from specific keywords or strings of keywords, towards Google understanding that this topic usually comes bundled with this intent. So for me, in my opinion, it means that it's even more important for pages to answer that whole bundle of intent: not just the question itself, but the related things people expect to see alongside it, which again comes back to what I've always said People Also Ask data is so valuable for. Again, I'll link to those tweets if you want to see the images for yourself, but a really interesting spot from Glenn and interesting observations from Mordy as well.

I want to give a very quick shout out to Andrew Charlton as well on the podcast, who has been doing all kinds of interesting Google Sheets stuff recently. Andrew Charlton is a freelance SEO consultant. He's got an SEO course about forecasting as well. He's on Twitter. You can find him @bertiecharlton. Bertie is B-E-R-T-I-E Charlton C-H-A-R-L-T-O-N. Of course, I'll link to it in the show notes search.withcandour.co.uk.

I saw a really interesting tweet from him on May the 24th, so four days ago, and he said, "Get Google auto-suggested questions in Sheets for any keyword across any language." And he's got, again, a little video here. Basically, if you follow Andrew and drop him a DM, he will give you a link to a script that you can import into your Google Sheets. Then you can use a really neat function: you type equals keyword questions, then you list a keyword, a location and a language, and it will generate questions, similar to Answer The Public, all of the what, when, how, why questions for that keyword. It's just a really quick way to do some initial keyword research.

Obviously tools like Answer The Public go deeper, but if you just very quickly want content ideas, if maybe you're a freelance content writer and you don't have a budget for these other tools, it's really easy to set up. As I said, you get his script, go to the script editor in Google Sheets, paste it in and save it, and then it does pretty much what it says on the tin. I wanted to give it a shout out just because I really like these quick tools you can use in Sheets that are very laser-targeted in what they do. It's a really nice way to quickly collect questions. Again, I'll link to the tweet in the show notes for you.
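Tools in this family generally work by expanding a keyword with question prefixes and querying Google's autocomplete endpoint for each variant. I haven't seen Andrew's actual script, so as a rough sketch of the approach (the `suggestqueries.google.com` endpoint is the widely used unofficial autocomplete endpoint, not a documented, supported API; the prefix list and parameters are my assumptions):

```python
from urllib.parse import urlencode

# Question prefixes similar to what Answer The Public-style tools expand.
QUESTION_PREFIXES = ["what", "when", "where", "why", "how", "which", "who", "can"]

def suggest_urls(keyword: str, lang: str = "en", country: str = "us") -> list[str]:
    """Build autocomplete request URLs for question-prefixed variants of a keyword.

    Each URL, when fetched, returns JSON containing Google's autocomplete
    suggestions for that question + keyword combination.
    """
    base = "https://suggestqueries.google.com/complete/search"
    urls = []
    for prefix in QUESTION_PREFIXES:
        params = {
            "client": "firefox",       # returns a simple JSON response
            "hl": lang,                # interface language
            "gl": country,             # country to localise suggestions
            "q": f"{prefix} {keyword}",
        }
        urls.append(f"{base}?{urlencode(params)}")
    return urls

urls = suggest_urls("braces")
print(len(urls))   # one request per question prefix
print(urls[0])
```

Fetching each URL and flattening the suggestion lists gives you roughly the what/when/how/why output the Sheets function produces; in Apps Script the same idea would use `UrlFetchApp` instead.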

Again, there's another little thing I want to talk about, and I think that's maybe what the podcast is good for: I can just rain down on you some bits that I found interesting during the week and why. This one was covered by Search Engine Roundtable, who do a really good job of covering things like the Webmaster Hangouts and combing through what Google is telling us. There was a particular question that interested me around core web vitals and redirects, answered by John Mueller. The question was, "My website is a 100% core web vitals pass and all URLs are core web vitals valid. Now I want to restructure my site. By restructuring, I mean to say I want to change the URLs to better, SEO-friendly URLs. So my first query is, once I have changed the URL, will the core web vitals metrics, or whatever exists for my pages, be passed to the redirected URL?"

So they are essentially asking that they've got their core web vitals passed for URL, if they redirect that to another URL, is that score packaged up and taken with them? John Mueller from Google answered, "My understanding is that they, core web vital metrics, would be redirected. Like any other signal from search, if we see a redirect, then we would forward the signals that we have and apply them to the new URL. And it doesn't matter so much if the URL looks different, it's more that you're moving from one URL to another."

So a couple of things about why I've mentioned this and why I think it's interesting. Firstly, it makes sense, because we know that the core web vitals score that is going to be used for ranking is taken from the Chrome User Experience Report, the CrUX report, which is the field data collected from real users, and we know that it takes around 30 days or so to gather these scores. So it makes sense that if there is a new URL, Google will not have any data on it; there'll be no data points. So rather than just ignoring it if they see a redirect, it makes sense, at least in the short term, to transfer over that data and use it in ranking, at least, my guess would be, until they've collected data on the new destination URL. Because, again, logically it doesn't make sense for a page to be ranking, or have a page experience influence, based on an old URL. It doesn't exist anymore.
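For reference, the pass/fail assessment being discussed here boils down to three thresholds Google published for the "good" range of each metric at the time of this episode: LaMDA aside, a page passes when its 75th-percentile field values meet all three. A minimal sketch (the function name and inputs are mine; the thresholds are Google's published ones):

```python
# Published "good" thresholds for the three Core Web Vitals metrics.
THRESHOLDS = {
    "lcp_ms": 2500,   # Largest Contentful Paint: 2.5 seconds or less
    "fid_ms": 100,    # First Input Delay: 100 milliseconds or less
    "cls": 0.1,       # Cumulative Layout Shift: 0.1 or less (unitless)
}

def passes_core_web_vitals(lcp_ms: float, fid_ms: float, cls: float) -> bool:
    """Return True if all three field metrics are within the 'good' range.

    Google assesses the 75th percentile of real-user (CrUX) field data per
    metric, which is what the inputs here are assumed to represent.
    """
    return (lcp_ms <= THRESHOLDS["lcp_ms"]
            and fid_ms <= THRESHOLDS["fid_ms"]
            and cls <= THRESHOLDS["cls"])

print(passes_core_web_vitals(2100, 80, 0.05))   # all metrics good
print(passes_core_web_vitals(3200, 80, 0.05))   # slow LCP fails the page
```

This also illustrates why a freshly launched URL can't be assessed directly: with no CrUX field data, there are simply no percentile values to feed in.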

A comment on the Search Engine Roundtable said, "So if a malicious actor wants to hurt a website, they just need to cook up a terrible site for a while and then redirect it at the target." And then some bad language about how it's stupid. Just some offensive stuff. That's always been an interesting point with things like, if you have a site or domain that gets a penalty, for instance, does that penalty pass over a redirect? You would expect obviously that the penalty wouldn't be wiped out over a redirect because you could otherwise just buy a new domain, redirect the old one and you're good. But you would also expect not for any penalty to be passed over the redirect because it would make it very easy then to set up a domain, spam them, get them a penalty, and then just redirect that to a competitor.

So in that case, when we're talking about Google penalties, what I understand to happen, and through some limited testing what appears to happen, is that it's further down the chain, and those links or that domain are just devalued. It's a zero rather than a negative. Meaning, if you 301 it to a new URL, you're not going to get any benefit, because it's at zero. And the same if you then point it at a competitor: it's not going to have any negative impact, because it's just a zero, if you like. With core web vitals, this attack makes even less sense, because if you have an existing URL that is not new and someone redirects a URL to you that is, say, very bad in terms of core web vitals, I can't see any logic as to why Google would start measuring the core web vitals for a page that used to exist and is now redirecting, when they've always had the data for the actual URL. So personally, I don't think that's going to be a problem, and this person's got the wrong end of the stick.

It's an interesting point, anyway, just to think about the signals that are being passed through redirects. This is more than just links. When we think about redirects, or at least when I do, we think mainly about what everyone would call link equity being passed between URLs. So this is quite interesting because it's, again, a definite sign from Google that there are other types of metric being passed along redirects.

At the beginning of last week on the Google, The Keyword blog, we had another post about another breakthrough Google had. They've been having lots of breakthroughs recently and they are kind enough to tell us about them and blog about them, and this one is called LaMDA. And Google says, "It's our breakthrough in conversation technology." So it's written by Eli Collins, the VP of product management. Again, I'm just going to do what I did when we spoke about MUM, which is I'm going to pick out and read a little bit about the post to give you a background and then just talk about why this could be interesting through the lens of SEOs and people working in search marketing.

So the post says, "We've always had a soft spot for language at Google. Early on, we set out to translate the web. More recently, we've invented machine learning techniques that help us better grasp the intent of search queries." Much like actually how we were just talking about when we looked at that new type of SERP result, which looked at the intent that a page also answers. Anyway, the post goes on to say, "Over time, our advances in these and other areas have made it easier and easier to organise and access the heaps of information conveyed by the written and spoken word. But there's always room for improvement. Language is remarkably nuanced and adaptable. It can be literal or figurative, flowery or plain, inventive or informational. That versatility makes language one of humanity's greatest tools and one of computer science's most difficult puzzles.

LaMDA, our latest research breakthrough, adds pieces to one of the most tantalising sections of that puzzle: conversation. While conversations tend to revolve around specific topics, their open-ended nature means they can start in one place and end up somewhere completely different. A chat with a friend about a TV show could evolve into a discussion about the country where the show was filmed before settling on a debate about that country's best regional cuisine. That meandering quality can quickly stump modern conversational agents, commonly known as chatbots, which tend to follow narrow, predefined paths. But LaMDA, short for Language Model for Dialogue Applications, can engage in a free-flowing way about a seemingly endless number of topics, an ability we think could unlock more natural ways of interacting with technology and entirely new categories of helpful applications.

LaMDA's conversational skills have been years in the making. Like many recent language models, including BERT and GPT-3, it's built on Transformer, a neural network architecture that Google research invented and open-sourced in 2017. That architecture produces a model that can be trained to read many words, a sentence or paragraph, for example, pay attention to how those words relate to one another, and then predict what words it thinks will come next.

But unlike most other language models, LaMDA was trained on dialogue. During its training it picked up on several of the nuances that distinguish open-ended conversation from other forms of language. One of those nuances is, basically, does the response to a given conversation or context make sense. For instance, if someone says, "I just started taking guitar lessons," you might expect another person to respond with something like, "How exciting. My mum has a vintage Martin that she loves to play." That response makes sense given the initial statement. But sensibleness isn't the only thing that makes a good response; after all, the phrase "that's nice" is a sensible response to nearly any statement, much in the way "I don't know" is a sensible response to most questions. Satisfying responses also tend to be specific, by relating clearly to the context of the conversation. In the example above, the response is both sensible and specific.

LaMDA builds on earlier Google research, published in 2020, that showed Transformer-based language models trained on dialogue could learn to talk about virtually anything. Since then, we've also found that, once trained, LaMDA can be fine-tuned to significantly improve the sensibleness and specificity of its responses. And then they go on to finish with their responsibility-first AI principles, which we mentioned before when we talked about MUM.

Why I think this is, again, interesting: I think this is the cornerstone of the step away from typing things into a search engine, basically. We've had several conversations, we've had people on the podcast, and it's been a very long narrative over a long time about this journey away from 10 blue links. We talked about Universal Search and Google becoming this answer engine. But that's all encased in the slight restriction that you need to type in the correct search query. And again, this is what we spoke about with MUM in the last episode (episode 112). This addresses, again, one of Google's main problems, which is, I guess, the quite low fidelity of search input.

So MUM was about answering multi-part questions, right? The examples they were giving for MUM were around hiking one mountain after you'd hiked another, and the barrage of questions that would come from that, and the context of the comparison. And LaMDA is then about how you actually navigate your way through these topics in conversation. People quite often use the expression Google-fu. It's basically how good people are at understanding how to get the best from a search engine, which is a skill that is very different from the conversational and learning skills that we have as humans talking to each other, which are quite high fidelity, comparatively at least.

We've talked again about how search is changing, but LaMDA, I think, is going to open us up to not only having access to these answers, answers that are multifaceted in the sense of, okay, I don't need to ask you the next three questions, you're just going to understand what I'm trying to understand and explain it to me, but also to changing how we actually interact. So rather than being very blinkered, I've got to type this into a search box - again, this is something I can see being used with Google Home. You are just having a conversation, and it's got LaMDA in the background, and it's got MUM in the background, and you're being fed these answers.

Again, huge questions about how, if that's the case, where the actual content producers end up, where the knowledge producers end up, because we're very quickly heading to this scenario where Google, if they are understanding the information, they've got technology like MUM to actually produce the natural language answer, and they've got technology like LaMDA to put that in a conversational basis, where does that leave web pages? I don't know is the short answer for now. But it's not hard to imagine a future where you are just having a very human-like conversation with a machine to get those answers. How well that will pan out, I don't know. Again, we've seen even with what should be simpler tasks, like featured snippets, sometimes Google gets them very, very wrong, sometimes knowledge in the knowledge graph is wrong, but certainly really interesting to think about when we're long-term planning our SEO.

That's all we've got time for in this episode. And guess what? I will, of course, be back in one week's time, which will be Monday the 7th of June. I hope you've enjoyed this episode. Please subscribe if you are enjoying the podcast. Tell a friend all that lovely stuff, and I hope you have a lovely, lovely week.

More from the blog