Mark Williams-Cook will be talking about:
Google Quality Rater Guidelines: Small changes in these guidelines show a shift to Google looking to perform better at perceptual tasks.
ECPC changes: Google Ads brings maximise conversion value to manual bidding
Nofollow: A discussion on the interesting side effects of the recent introduction of rel=ugc and rel=sponsored tags
Show note links:
Google Quality Rater guidelines: https://static.googleusercontent.com/media/guidelines.raterhub.com/en//searchqualityevaluatorguidelines.pdf
Google's announcement on nofollow tags: https://webmasters.googleblog.com/2019/09/evolving-nofollow-new-ways-to-identify.html
Webmaster hangout with John Mueller answering Lily Ray's question on nofollow usage: https://www.youtube.com/watch?v=cXbWuQQp81A&feature=youtu.be&t=1453
Welcome to Episode 27 of the Search with Candour podcast, recorded on Friday 13th of September. My name is Mark Williams-Cook and today I'll be taking you through some updates to the Google Quality Rater Guidelines, changes to Enhanced CPC within Google Ads, and the nofollow tag updates everyone's been talking about. Yes, the end result is you probably don't have to do anything, but I think there are some interesting side effects from this change that are worth thinking about.
Last week we saw the Google Quality Rater Guidelines updated, on the 5th of September I think. If you haven't come across them, it's a publicly available, 167-page document; you can find a link to it in our show notes at search.withcandour.co.uk. This document is what Google provides their manual quality raters with: a framework, a set of guides, for how they should rate pages.
Now, the manual rating system isn't a secret anymore. I guess it was never really a secret, but Google certainly weren't very forthcoming about what they were doing. We've known for many years now that Google state they have around 10,000 people working as manual quality raters, and the role of these quality raters is really quite misunderstood by some people. Google say they are not using this information to directly impact rankings: if a whole bunch of quality raters rated a particular site poorly, this isn't going to have any direct impact on its rankings. What Google is using quality raters for is to test their algorithm. They're looking at the algorithm's outputs, saying "okay, we've ranked these sites, we think they're this good", and essentially having those results manually checked, so they can see whether their system, their algorithm as a whole, is giving the output that they want.
The manual raters work on a temporary basis; I think the longest you're allowed to work as a manual rater is three months. I'm not sure if that's exactly correct, maybe someone who has worked as a manual rater can correct me. They can work from home and are essentially presented with a system that shows them pages from the web, and they have to follow the quality rater guidelines to assign labels to those pages: how good the site is and how well it matches a certain query. Because the quality rater guidelines don't directly impact rankings, I'm not going to go into depth about the changes that were made to the actual document; you can have a look at that yourself, and lots of people have written about it.
I wanted to comment on the type of changes that are being made. The raters are now being asked to do what I consider to be very perceptual tasks. Rather than just asking how closely a page matches a keyword, or trying to answer the question "should this page rank for this key phrase?", the rater guidelines are looking a lot more closely at the overall reputation of the content creator or publisher. There are questions asking the manual rater to investigate the long-term reputation of the publisher, questions to make them think about the specific author and whether they have expertise on that subject, and even reminders from Google that content such as satire and parody is still relevant and can still get the highest ratings, even though it maybe doesn't have a practical use. All of these changes, I think, show what Google is trying to achieve with their machine learning and AI approach: building up a more in-depth understanding of the web and who's publishing content.
So if you haven't read the document and you're working in SEO, it's definitely worth reading, because it's essentially a description of what Google is aiming to achieve with the algorithm. As I say, I'll drop a link to it in the podcast show notes, which you can get at search.withcandour.co.uk.
There are some new options for ECPC, or Enhanced cost-per-click, as it's known within Google Ads. This was, as far as I can see, discovered by Scott Clarke, who tweeted "hey, this is cool; manual CPC with enhanced cost-per-click on conversion value" along with a screenshot of this new option in the Google Ads interface. A couple of episodes ago, Rob and I discussed Google rolling out maximise conversion value as one of the automated smart bidding strategies they're pushing everyone really hard to try and use. In Scott's screenshot, when you select manual cost per click and tick the box that says "help increase conversions with enhanced CPC", there are now two sub-options: optimise for conversions, and optimise for conversion value. Previously, Enhanced CPC, which can be really helpful, was just optimising for conversions: it means you're allowing Google to sometimes bid extra on a click if their data suggests that that particular user is more likely to convert.
So by default, previously, you were optimising for the number of conversions, with the bid adjusted accordingly. What Google is allowing you to do now, having obviously decided it's helpful, is use Enhanced CPC with conversion value: you can keep manual cost per click and have this extra bit of intelligence that increases the bid when Google thinks the conversion is going to be of higher value. So there are those two options now. The screenshot also shows that whenever you select manual cost per click, Google displays a little warning icon and says "setting bids manually may result in", then in bold, "lower performance. Use smart bidding to help improve results by using more signals to optimise your bids". I've obviously got some strong opinions about this message. The wording "setting bids manually may result in lower performance" is a really weak statement because, of course, anything may result in lower performance. In truth, smart bidding may also result in lower performance, which is something we have consistently found, and you only have to look online to see that this is the general consensus amongst PPC professionals. I'm well aware it may look from the outside like PPC professionals are biased towards wanting manual management, but the objective fact is that in a lot of instances you will get better results, at the moment at least, maybe not in the future, using manual CPC over Google's smart bidding.
So I think this is still a push, a nudge, towards getting people ready for when Google removes manual CPC, manual bid management. There's something I wish I'd recorded now: a couple of episodes ago, when Rob and I were speaking about some of the changes to Google Ads, we had a discussion after we'd stopped recording, and Rob talked about when he thinks Google is actually just going to remove manual cost per click and force everyone to use their automated bidding strategies. The thought really interested me, so last week I ran a poll on Twitter, asking the community connected to me when they think Google will remove manual bidding. A third came back saying they think it will be removed within one to two years, and another third of respondents, which I think may be a little bit naive, said they don't think Google will ever remove manual CPC. I don't think that's true; it's too much in Google's favour for them not to do it, so I think it is something that will eventually happen.
So at the moment you've got this new option. If you are running campaigns where the main thing you're looking to do is optimise conversion value, and you're using manual CPC with ECPC, it might be worth going back and looking, because you'll now have the option to change from optimising for conversions to optimising for conversion value.
Another thing, while we're talking about Google Ads: Rob and I have spoken on several previous episodes about the change Google made, a long time ago now, from exact match keywords to what they call close variants. Everyone I speak to still calls it exact match, but with close variants Google will try to match your keyword, and if they think a search is really similar, they'll still trigger your ad. I wanted to highlight a particular example we came across this week. We are using close variants, so we'll call it exact match, for a client that is a cafe, and we were bidding on a geographic term plus "cafe". When we started going through the search terms that triggered this ad, we found that Google had determined it was relevant to show the ad for the same search but with the word "bar" instead of "cafe". Now, to me, those are two very different things: if you say to someone "hey, let's go to the cafe" and then you take them to a bar, that's probably not what they're expecting, because they are two very different places. So we've actually had to go back and make some changes to this campaign to make sure Google doesn't do that. I can't stress this enough: if you are using close variant / exact match, do go back and review what's triggering those keywords.
Google has added two new link attributes alongside nofollow, which is something they announced on the Google Webmaster blog on the 10th of September. I'm going to read out some of the post to you, just to get the official line across, and then I want to focus on the side effects of this change rather than what you need to do, because to summarise: you don't need to do anything. Google's been quite clear about that; there aren't going to be any major ranking changes on your site. A lot of people have been covering this with, I don't know why, loads of interest. You're not going to be making a business case for going back and changing your whole site to fall in line with this, but I do think it's interesting.
So the blog post says "Nearly 15 years ago, the nofollow attribute was introduced as a means to help fight comment spam". At the time, 10 to 15 years ago, Google was relying very heavily on the PageRank part of their algorithm; it's what made Google unique as a search engine, and it's how the web worked, with everyone linking to each other, especially in user-generated content places like forums. Once people got the sense that this was what made sites rank, places like blogs especially were targeted, and we still see it nowadays: loads of automated fake comments left just so people can drop links in them. So Google provided webmasters with this nofollow attribute. Going on, the post says it also quickly became one of Google's recommended methods for flagging advertising-related or sponsored links. Again, to take that in context: when people realised these links were helping rankings, obviously they wanted to start buying links from websites, and whole economies sprang up, with sites like Text Link Ads set up to facilitate the buying and selling of links. This was contrary to why Google found links to be a good metric in the first place. If links are placed because a webmaster decides that content is good or useful, then that method of analysis works; if links are placed because spammers put them there, or because money is changing hands, it breaks this voting method, if you like. Linking to a site is like voting for it, but the vote is invalid if someone just paid you to vote. Google goes on to say: "the web has evolved since nofollow was introduced in 2005 and it's time for nofollow to evolve as well. Today, we're announcing two new link attributes that provide webmasters with additional ways to identify to Google Search the nature of particular links. These, along with nofollow, are summarised below."
So Google is releasing rel="sponsored": use the sponsored attribute, they say, to identify links on your site that were created as part of advertising, sponsorships or other compensation agreements. If money is changing hands, Google wants you to tag the link with rel="sponsored". They're also introducing rel="ugc"; UGC stands for user-generated content, and the ugc attribute value is recommended for links within user-generated content, such as comments and forum posts. And because these were originally what nofollow was used for, they've then gone on to specify what you should now use nofollow for: use this attribute for cases where you want to link to a page but don't want to imply any type of endorsement, including passing along ranking credit to another page.
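To make those three cases concrete, here's a minimal markup sketch; the URLs and anchor text are made-up examples, not taken from Google's post:

```html
<!-- A paid or compensated link: money has changed hands -->
<a href="https://example.com/product" rel="sponsored">partner product</a>

<!-- A link left by a user in a comment or forum post -->
<a href="https://example.com/blog" rel="ugc">commenter's site</a>

<!-- A link you don't want to endorse or pass ranking credit to -->
<a href="https://example.com/some-site" rel="nofollow">the site in question</a>

<!-- The values can also be combined, e.g. a paid link inside user-generated content -->
<a href="https://example.com" rel="ugc sponsored">user-submitted ad</a>
```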
As an example, I guess, if you write an article about a site that was doing something naughty, say with people's information, and you wanted to link to the site so people could see it, but also make it clear that you weren't endorsing them, that might be a case where you use rel="nofollow". The post goes on to say: when nofollow was introduced, Google would not count any link marked this way as a signal to use within our search algorithms. This has now changed. All the link attributes, sponsored, ugc and nofollow, are treated as hints about which links to consider or exclude within Search. We'll use these hints, along with other signals, as a way to better understand how to appropriately analyse and use links within our systems. Why not completely ignore such links, as had been the case with nofollow? Links contain valuable information that can help us improve search, such as how the words within links describe the content they point at. Looking at all the links we encounter can also help us better understand unnatural linking patterns. By shifting to a hint model, we no longer lose this important information, while still allowing site owners to indicate that some links shouldn't be given the weight of a first-party endorsement.
This, I think, is really interesting; there are a couple of things happening here. Firstly, from my discussions with other SEOs, I think for the last few years we have already seen some consistent ranking benefit when we have got certain types of nofollow links. That's not something I've ever bothered blogging about or including in talks, because essentially I don't have enough data to prove the point; it's contrary to everything Google was saying at the time, and even working at a massive agency you just don't have enough data points to make that kind of statement. But one thing I have noticed, especially with local box rankings, is that nofollow links did seem to be having an impact. We might work on a site that has had stable rankings for quite a while, we get a few dozen nofollow links as part of an effort, and we'll already start to see the rankings improve. What Google is saying in this statement is basically exactly that: in some cases, they may choose to start listening to the signals from nofollow links.
My guess here is that they will still ignore links marked as nofollow where they can identify them as obvious sponsored links or obvious comment spam, but there are lots of other cases now. Wikipedia is a commonly used example: all the links there are nofollow, but it's quite heavily moderated, so you could use it as a signal that a site must have something going for it, because it's managed to get a link on a Wikipedia page and that link has lasted a year without someone removing it. The other thing is that the web, as Google says, has changed quite considerably. Forums used to be a bigger thing than they are now, and a lot of that activity of users linking to content they like has moved inside apps, into walled gardens. Before, people would share loads of website links on forums; a lot of that activity has moved to places like Facebook and Twitter, which makes it harder for a search engine like Google to work out what's going on. It certainly doesn't fit into their PageRank model to use those links, which are all nofollow.
Anyway, this is backed up by Danny Sullivan on Twitter, responding to a statement about how a lot of newspaper sites just blanket-nofollow everything. Danny said: "that's a big advantage to this change. Over the years, some publishers have simply gone full nofollow without seeming to give much thought about why. Now we can see and use those links. In most cases, as said, it's not going to change anything." So I think this gives Google the ability to quite easily identify editorial news websites, and I think we'll probably see the value of nofollow links from those kinds of publications increase. That's one side effect of this change.
The other interesting thing is that there are a couple of uses SEOs have put nofollow to which may now change. I will link to a webmaster hangout session that was done live not long ago, where John Mueller was asked a question about big ecommerce sites. Big ecommerce sites can cause all kinds of technical SEO issues when it comes to categorisation of products: within those categories, if you have lots of filters and facets that generate very similar pages with different URLs, there are a few different ways SEOs can handle that, from using noindex to using canonical tags, depending on what type of page is created and how different it is.
I've spoken previously about how the canonical tag is only a hint to Google, and if the two or more pages you use the canonical tag on are actually quite different, which can happen with filters and facets, Google will actually choose to ignore those tags. Again, especially with larger ecommerce sites, if you have thousands of possible variations that are easily crawlable by Google, sometimes it's better to start applying noindex in these instances. That's a whole other area to talk about, but what was interesting in this hangout is that Lily Ray asked John Mueller what Google's advice is, what the best practice method is to handle this situation, and John gave a kind of classic "it depends" SEO reply, saying that there actually isn't much in the way of specific guidance on this right now.
One of the things he did mention was using rel="nofollow" on internal links to try and give Google a hint as to which sets of attributes, facets and filters to ignore, and that's something SEOs have done for a long time now: use nofollow on internal links to try and guide Google to the most important pages only. When we were told that nofollow is not used for link discovery, nor for crawling, it made sense, when you've got these really big sites and all these ways a bot can crawl your site but you're only interested in a small slice of it, to use nofollow. That's what John suggests in his reply, although other Google documentation says it maybe isn't the best approach; there isn't actually any clear guidance on that right now. The other thing to consider is what happens when you are using noindex on pages. Something Google told us a while ago is that if a page is noindexed it can still be crawled, but the links on that page will be treated as nofollow links. This stopped several kinds of black-hat tricks, where people would noindex pages, so they didn't show in search, but then put certain anchor text on them and get benefit from that. So I think that's one of the reasons Google treats links on noindexed pages as nofollow.
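To make the options for faceted pages concrete, here's a minimal markup sketch of the three mechanisms mentioned; the category and filter URLs are hypothetical examples, not Google-recommended patterns:

```html
<!-- Option 1: canonical, placed on a filtered URL such as /shoes?colour=red,
     hinting that it's a variant of the main category page.
     Remember this is only a hint, and may be ignored if the pages differ a lot. -->
<link rel="canonical" href="https://example.com/shoes">

<!-- Option 2: noindex, keeping the filtered page out of the index entirely.
     The page can still be crawled, but its links get treated as nofollow. -->
<meta name="robots" content="noindex">

<!-- Option 3: internal nofollow on the filter links themselves,
     a hint that the crawler shouldn't prioritise these variations -->
<a href="/shoes?colour=red" rel="nofollow">Red shoes</a>
```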
I assume that may change now with this announcement, or I guess they may keep ignoring links on pages marked as noindex. The blog post also goes on to say that webmasters need to make sure they're not relying on nofollow to keep pages from getting indexed, which, in fairness, is something Google never recommended you should do; you should be using the noindex tag. But it may well impact your site if you are using nofollow to craft where you want the bots to go. So I think it's interesting how we're going to see this roll out. I think the reason Google has done this, the reason we have these new attributes alongside nofollow, is actually more to do with Google getting tagged data for machine learning algorithms. They're putting it out to the community saying: hey, if you tag these links, machine learning systems will then be able to learn "okay, this is what a sponsored link looks like, this is what a UGC link looks like", and they can then apply those learnings to other sites, even sites that haven't used those tags, even sites that aren't using nofollow at all. Google gets a much better understanding of how these links appear and what they look like, and can come a lot closer to, and maybe even exceed, human perception of what those links are just by analysing the page.
As I said at the top, there's no need to change anything urgently yourself. Only a very small proportion of sites on the web are actively using nofollow correctly, so it would not make any sense for Google to try and penalise or demote sites that don't change in step with this. I've seen a few SEOs actually say: look, while there's no direct benefit to us in doing this, maybe it's something we should do because it will help make the web a better place if Google can understand these types of links. I'll leave you to think about that: whether we should be helping Google for free, and whether it will make the web a better place.
So we are already out of time, and it's been great! I will be back on the 23rd of September. Please check out the show notes links if you want any more information, at search.withcandour.co.uk. Otherwise, I'll catch you in one week's time. I hope you have a great week and that you all get the rankings that you deserve!