Episode 121: Search operators, 301 redirects, video title tag tests and Smart Bidding updates
In this episode, you will hear Mark Williams-Cook talking about:
301 redirects: Specific advice from Google on how long to keep redirects in place
Site operators: More documentation from Google highlighting the limitations of site operators
CTR tests: SearchPilot shows how logical changes sometimes result in negative results
Smart Bidding updates: Google Ads Smart Bidding rolls out changes to bid strategies.
Search operators update: https://developers.google.com/search/docs/advanced/debug/search-operators/overview
SearchPilot "with video" CTR study: https://www.searchpilot.com/resources/case-studies/seo-split-test-lessons-adding-with-video-to-title-tag/
Previous SEO split tests from SearchPilot (episode 70): https://withcandour.co.uk/blog/episode-70-tab-surprises-google-ctrs-and-all-in-one-security-flaws
Episode 98, "iOS14, ATT and Google passage ranking": https://withcandour.co.uk/blog/episode-98-ios14-att-and-google-passage-ranking
Google Smart Bidding update (episode 112): https://withcandour.co.uk/blog/episode-112-desktop-page-experience-new-google-ads-features-and-mu
MC: Welcome to episode 121 of the Search with Candour podcast, recorded on Friday, the 23rd of July, 2021. My name is Mark Williams-Cook and today I'm going to talk to you about Google search operators, how long we should be leaving 301 redirects in place, interesting SEO split tests around videos, and a little bit of news on the PPC side about the previously announced target CPA and target ROAS bidding strategies being sunset. Before we kick off, I would love to tell you this podcast is sponsored by the wonderful people at Sitebulb. Sitebulb, if you haven't heard of it, is a desktop-based SEO auditing tool for Windows and Mac. It's been around for a while now, and it's really had a big impact on the SEO community. It's an incredible tool for auditing your sites. They've recently released their newest version, which includes a really nice ability to check your Core Web Vitals at scale.
So, before this, there were a few other ways you could do this, including using kind of command line interfaces to go through all of your pages, but Sitebulb takes care of all of this for you. Of course, we've got the page experience updates still rolling out at the moment, so it's a really good time to do this. With Sitebulb, you can just set it crawling your site, and it will do these lab tests on every single page, group the results together for you and give you the feedback you need. If you're listening to this podcast, there's a special offer for Search with Candour listeners. If you go to sitebulb.com/swc, that's sitebulb.com/swc, you'll get an extended 60-day trial of the software, no need to put in your credit card details or anything like that. So, you're completely free to give it a go and see if you like it, which I'm sure you will. That's sitebulb.com/swc.
Search operators are something that has really interested me over the years of doing SEO. And for those of you who maybe haven't used them or don't know, search operators are the kind of commands you can put into Google to retrieve specific results. So, one of the most well-known ones for SEO is using site: (site, colon) and then you can put in a domain or a specific page or a specific subfolder. What you're doing then is limiting your Google search, for whatever search term comes after that, to the site, subfolder or subdomain that you've specified. So, it can be really useful. I've used it to do some real quick and dirty checks around whether I've got duplicate pages or cannibalisation in content, and you can use them in combination. So, there are other ones like intitle:, for instance, so you can search the whole web, or that specific site or domain, for a specific phrase in the title. There are loads of interesting search operators and lots you can use them for.
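As a quick illustration of combining operators, here is a small Python sketch that builds the kind of site: plus intitle: queries described above for duplicate-content checks. The helper names are my own, not anything from the episode:

```python
from urllib.parse import quote_plus

def operator_query(domain, phrase=None, intitle=None):
    """Build a Google query combining site: with optional operators.

    Handy for quick duplicate-content or cannibalisation checks:
    limit the search to one domain, then look for a phrase in page
    titles (intitle:) or anywhere in the copy (quoted phrase).
    """
    parts = [f"site:{domain}"]
    if intitle:
        parts.append(f'intitle:"{intitle}"')
    if phrase:
        parts.append(f'"{phrase}"')
    return " ".join(parts)

def search_url(query):
    """URL-encode the query into a Google search link."""
    return "https://www.google.com/search?q=" + quote_plus(query)

# Example: pages on example.com with "blue widgets" in the title
print(operator_query("example.com", intitle="blue widgets"))
# site:example.com intitle:"blue widgets"
```

Running the same query by hand in Google and eyeballing the results is usually all these quick checks need; the script just makes it repeatable across a list of phrases.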
And Gary did a tweet this week. So, Gary from Google, in his typical kind of fashion, saying, "If you don't have anything better to do: Lizzie Harvey and I just published a set of docs about how certain search operators (site:, cache:, related:, source:, imagesize:) work and also their limitations." And we'll put a link to this in the show notes at search.withcandour.co.uk, which will take you to this Google search operators guide. It's mainly really some technical information about how they work and the kinds of things they retrieve. It's not an exhaustive list of all the operators, but the reason I'm mentioning it on the podcast is there are a couple of bits in there that interested me. So, the thread that Gary posted on Twitter, which again I'll also link to: if you're an SEO that's kind of not on Twitter, it is a really good place to get information and you will learn a lot from the discussions that go on, which is why, if you've listened to this podcast a while, you'll see a lot of the news we talk about actually comes from various conversations.
Now, the conversations here that interested me particularly were around people who had gone off and read this guide that Gary had posted and were then coming up with problems. So, someone said, "Hey Gary, can you please help with the imagesize:1200x800 command? I tried, but it didn't return URLs with the specified image size." And he says, "Also tried this dah, dah, dah, am I doing it wrong?" And Gary's reply is, "No. You're probably not doing anything wrong, but all the search operators are heavily affected by our systems' retrieval limits, and on big sites this is less visible." For example, he does the same imagesize operator query on Wikipedia and obviously you get loads of results.
So, what Gary is saying there, firstly, is what's actually highlighted at the top of this document. I'll read that first. So, if you go and look at this overview of Google search operators, there's a little blue box with a star that's highlighted. And it says: because search operators are bound by indexing and retrieval limits, the URL Inspection tool in Search Console is more reliable for debugging purposes. So, what they're trying to say is, especially if you've got a smaller and maybe less popular website that isn't as well crawled and indexed by Google, these commands aren't always going to return all the results. So, if you're looking for specific things to debug, specific problems, specific things you're looking to optimise, it may be that they fall among some of the URLs that haven't been retrieved yet, so they're not going to come back via these operators.
These operators are still bound by what Google's crawling. So, that's an interesting thing, and Google's obviously recommending that you use Google Search Console there. It's actually a conversation that we've had a couple of times now, I think, on the podcast, because the most interesting example (and I can't say this enough, because every time I mention it, it seems to surprise some people) was when I've used the site: command. I know a lot of people use this, and I still use it for some things in SEO, but one thing I think lots of people have noticed over the years is that using that site: command, you can't get a good idea of Google's search coverage, sort of how many pages are indexed. We've seen wildly fluctuating numbers on the same site done in the same day or same week with that site command, and certainly they don't add up to the kind of numbers we're seeing in Search Console.
So, we know that if you've got access to Search Console, it's definitely the best place to look at that index coverage. And it can be helpful if you don't have access. If you're looking at competitors to do those kinds of commands, to get a rough idea of the size of the site or the size of the indexable site, at least, but in some cases it's way off. The most interesting thing though, was I've seen people trying to use the site operator to work out if pages are or are not indexed. And something that I stumbled upon and we got confirmed kindly by John Mueller at Google is that the site command will return pages that Google hasn't indexed, or at least by hasn't indexed, I mean, it's not going to return in regular searches.
So, I found this myself when we were having issues on a website where Google had decided to canonicalise pages: where pages were meant to be the canonical version, Google had said, "No, actually we're going to pick this version." And when this problem came about, because the pages were never appearing in searches, they weren't ranking, the visitors were never there. However, when I did site: commands, I could easily find those URLs, and we could then confirm, if we went into Search Console and did an inspection of that URL, that Google had said it's been crawled, but no, it's not indexed, because we've decided it's a non-canonical URL. And actually, when I looked into this even more, I found several examples of this in Search Console, where Google was saying a URL was not indexed, and then we could go and do a site operator search on the site and we could find that page.
So, you have to be really, really careful when you use these operators that you understand their limitations, and you understand what they're designed to be used for, and therefore, in the gaps, what you should and should not be using them for, or at least should only be using them for with caveats. As I said, I still use it for a bunch of stuff and it is helpful, but that's one particular example that I know catches a lot of people out. So, again, we'll link to the documentation; have a read through it and just make sure you're happy with all of that.
Definitely now deep into territory of news that I would define as very interesting to SEOs and possibly no other people on earth. Gary has been very busy this week on Twitter and has tweeted something else that I think is worth sharing on the podcast, which is, he said, "Hands up if you have asked us recently how long you should keep redirects in place. I have a concrete answer now: at least one year." And then he says in brackets, "But try to keep them indefinitely if you can for your users." Okay. So, I think this is really interesting and worth talking about for a few reasons. Previously, Google has not given us a concrete answer on this. The advice has always been: leave 301 redirects indefinitely if you can, forever, as long as possible.
And that makes sense, firstly, from a Google point of view: whenever we're moving a page, if we're doing a permanent redirect, the idea is normally that the new page will kind of inherit all of the signals, the link equity, whatever it is that made the other page rank, and take its place and hopefully rank at least as well. There is, of course, the thought as well for users, which is that if they click on old links that have not been updated (internal or, well, hopefully not internal, but probably internal, and certainly external) they will end up in the correct place. Now that, especially for small sites and in a short-term, medium-term way of thinking, certainly presents no problems. Where it potentially presents problems, and where conversations I've had come in, is when you're dealing with maybe a very large site in the hundreds of thousands of pages, where redirects have been set up because maybe they did a whole replatforming or a rebrand.
So, we've got all these redirects set up, and then a few years later they go through a similar process, maybe a merger or something, and were rebranded and everything's moved again. And over the years, you're starting to get to the point where this redirect list is becoming huge, right? It's becoming millions of URLs. And of course, whenever we do redirects, there are now ways we can optimise this that we couldn't previously. So, going from maybe database-driven redirects to redirects on the edge, to looking at the most efficient way to write the rules to reduce latency. But this is what it comes back to, which is that very large sets of redirects not only give you kind of some technical debt, really some technical overhead, to manage over the years, but they can also cause latency. Now, what's interesting about this kind of concrete answer from Google is they've been very specific that it applies if you've now had this redirect in place for at least one year, so URL A is redirecting to URL B.
Even if you then remove that redirect or point it elsewhere, it doesn't matter. Whatever happens, those signals have been transferred, migrated, whatever you'd like to call it, to URL B. So, nothing else from that point will change that, meaning you could of course delete the redirect. As Gary says at the end of his tweet, "But try keeping them indefinitely if you can, for your users." So, this just adds maybe an additional step for these larger, more complicated sites when it comes to redirects and how long to keep them, which is that it may be worth doing periodic scans of redirects that you know exist to see if they are still being used. So, for instance, say we had 100,000 redirects, and we scan them once a year, and we found that for 15,000 of those redirects we have not had a single human visitor come through them in the last 12 months.
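A periodic redirect-usage scan like the one just described could be sketched roughly as below. This is an illustration under stated assumptions (combined-format access logs, helper names of my own invention), not a prescribed method, and in practice you would also want to filter out bot traffic so you are only counting human visitors:

```python
import re
from collections import Counter

# Matches the request and status fields of a combined-format access log line.
LOG_LINE = re.compile(r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

def redirect_hits(log_lines, redirect_paths):
    """Count how often each known redirect source path returned a 301."""
    wanted = set(redirect_paths)
    hits = Counter()
    for line in log_lines:
        m = LOG_LINE.search(line)
        if m and m.group("status") == "301" and m.group("path") in wanted:
            hits[m.group("path")] += 1
    return hits

def unused_redirects(log_lines, redirect_paths):
    """Redirects with zero hits over the scanned period: deletion candidates."""
    hits = redirect_hits(log_lines, redirect_paths)
    return sorted(p for p in redirect_paths if hits[p] == 0)
```

Run once a year over the period's logs, anything `unused_redirects` returns is a candidate for removal, on the basis Gary describes: the one-year window has passed and no visitors are using it.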
I would say they're fairly good candidates then to just delete those redirects. You're not going to lose anything from doing that, visitor-wise, and we've been told now how the kind of algorithm is counting things: that benefit is transferred over. So, there's no point in having this overhead. I think it's quite an interesting technical point, especially as we can't take every single redirect with us over 5, 10, 15, 20 years. I think it's a really interesting point for those bigger sites to have in your kind of toolbox to think about strategically. And while we're at the midpoint of the show, I would like to introduce our sponsor Wix, who have this update for you: URL customisation on Wix is now available on product pages. You can now customise URL path prefixes or even create a flat URL structure if that floats your boat.
Plus, Wix automatically takes care of creating 301 redirects for all impacted URLs, with full rollout coming soon. Also fresh off the press: bot log reports. Get an easy understanding of how bots are crawling your site without any complicated setup, right inside of Wix. There's so much more you can do with Wix. You can now add dynamic structured data, upload redirects in bulk (including error notifications and warnings), and fully customise meta tags and the robots.txt file. You can get instant indexing of your homepage on Google, while a direct partnership with Google My Business lets you manage new and existing business listings right from the Wix dashboard. Visit wix.com/seo to learn more. SearchPilot have made it back onto the Search with Candour blog. We covered an interesting split test they did back in episode 70 about taking content out of tabs, and another one of their split tests caught my eye this week, which was around title tags.
And I like it because it's counter-intuitive; at least it was for me, and certainly based on their initial "what do you think will happen" poll, it looks like it was counter-intuitive to a lot of people. So, SearchPilot set up these SEO split tests, and they give a little bit of information about how they run these kinds of case studies. They specifically say they're trying to detect changes in performance of the various pages they're changing compared to a control, so that they know the measured effect was not caused by seasonality, site-wide changes, Google algorithm updates, competitive changes or other external impacts. They use statistical analysis to compare the actual outcome to a forecast, and this comes with a confidence level so they can be certain of how real the effect is and that it's not chance. And they also measure the impact on organic traffic in order to capture changes to rankings and/or the click-through rate.
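SearchPilot's actual analysis is their own, but the general idea of comparing an observed outcome to a forecast with a confidence level can be shown with a toy Python sketch. The function names and the simple normal approximation here are my assumptions, not their methodology:

```python
import math

def z_score(observed, forecast, forecast_std):
    """How many standard errors the observed traffic sits from the forecast."""
    return (observed - forecast) / forecast_std

def p_value(z):
    """Two-sided p-value under a normal approximation."""
    return math.erfc(abs(z) / math.sqrt(2))

def is_significant(z, threshold=1.96):
    """Roughly a 95% two-sided confidence threshold."""
    return abs(z) >= threshold

# Toy example: variant pages got 800 daily sessions against a forecast
# of 1,000 with a standard error of 50 -- a clearly negative result.
z = z_score(observed=800, forecast=1000, forecast_std=50)
print(round(z, 2), is_significant(z))
# -4.0 True
```

The point of the control and forecast is exactly what the episode describes: isolating the tested change from seasonality, algorithm updates and other site-wide noise before declaring a winner or loser.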
So, there's some more information on that in their blog post; we'll link to it, of course, at search.withcandour.co.uk. But this split test they did was around adding "with video" to title tags. So, they ran a Twitter poll saying, "On a publisher's website, we added 'with video' to all of the article titles that contain a video within the post, in the hopes of boosting the click-through rate. What impact do you think this change had on organic traffic?" And of the answers, they had 69% say positive, that they think it will have a positive impact, 12.5% said negative, and almost 19% said it wouldn't make any difference. So, obviously a vast majority there, almost 70%, thinking this would have a positive impact on click-through. And to be honest, I think that's what I would have said as well. And basically they found the opposite.
So, I'll just read through this case study for you. They said having video content as a complement to article copy could be a great way to engage users and enhance the overall experience, particularly in industries where video content helps illustrate things like how-tos. So, certainly again, something we've talked about: how Google, especially for those how-to terms, is surfacing video content from places like YouTube right at the top of the SERPs. "One of our customers, a media brand, wanted to advertise this unique selling point in the search results by including in the title tag that the article had an accompanying video. We hypothesised that including a reference in the title to the fact that the article had both written and video content would make users more inclined to click through. We ran two iterations of this title test: one adding 'video', and the next adding 'with video'." Here's the first test, adding "video", and what it looked like in the search results.
So, they just give an example saying "how to knit a hat", and then there's "how to knit a hat (video)", and the result of the test was basically significantly negative. So, they hypothesise that perhaps when users read just "(video)" in brackets, they took this as a sign that the page was solely video content, as opposed to a contextual video within other written copy on the page. I think it's a really good hypothesis to come up with, because we know if Google works out that people want video content, or only video content, then they'll just plop video results right at the top of the page anyway. So, I think it's a really clever hypothesis they've made here. So, with that in mind, they iterated upon the test and added "with video" to the title tag instead. The hypothesis in the second test was that indicating a video was included within the content, and was not the sole medium on the page, could encourage users to click through more often.
Here's how the second test appeared in the results. And of course, it's exactly the same "how to knit a hat", but the variant has "(with video)" in brackets. And as we opened with, it actually had, again, a significantly negative impact on organic traffic. So, indicating that the articles had an accompanying video directly in the search results was not impactful in the way they had anticipated. This test is yet another example of where testing a change we had a logical basis for led to a surprising result. In this case, it's possible that adding length to the middle of the title tag truncated the brand name or other more relevant keywords, which could negatively impact CTR. It could also be that users landing from organic search in this particular industry place less value on video content than we thought, which may mean seeing "video" in the title, even in the second iteration, deterred them from clicking through.
Test results like this can reveal gaps we may have otherwise missed and give hints as to what users are most drawn to. In this case, it raises a bigger strategic question: does the outcome indicate that we may be better off making more written content instead of investing as much in video content? Of course, there are multiple factors at play that influence this kind of decision, but with the information gained by running these experiments, we were given more direction than we previously had. I think that's a fantastic experiment, because yes, it absolutely goes against the very general best practice of "video good, people like video". And I think both those thoughts that came after that second test, which is that people might not want video, are quite interesting. They may prefer written content.
Again, I think that's maybe hinted at by the type of results Google is showing, but sometimes I especially try to avoid video if I know there's a specific answer I want and I just want to be able to read it, because I know it's going to be quicker than getting a video to load and then finding the correct place in that video, if I can just Ctrl+F on the page instead. So, super interesting tests by SearchPilot. And again, it just highlights the importance, if you are making or planning site-wide changes to your content, of testing them first. Back in episode 112, we started to cover some of the changes that Google was making to Smart Bidding, and I've got an update for you. So, for those that are involved with PPC, and especially Google Ads, yesterday, on July the 22nd, Google posted some updates on how Smart Bidding strategies are being organised. They have said, "We previously announced changes to how Smart Bidding strategies are organised to help you choose the right one for your business."
"You'll now see the following choices when you create new bid strategies for Search campaigns." So, these new choices are: one, maximize conversions, which will have an optional target CPA; and two, maximize conversion value, which will have an optional target ROAS. "In the next few weeks, you'll no longer have the option of using the old target CPA or target ROAS bid strategies for standard campaigns. Instead, use the updated bid strategies by setting optional targets. This update only applies to campaign-level strategies. Portfolio bid strategies will be updated next year. There'll be no impact to bidding behavior due to this update." That's bolded. "Using maximize conversions with a target CPA will have the same bidding behavior as target CPA. Likewise, using maximize conversion value with a target ROAS will have the same bidding behavior as target ROAS." And interestingly, they've tailed off this post with, "All existing campaigns using target CPA or target ROAS will continue to run as usual.
We'll give advance notice before automatically switching these old bid strategies to the new format in 2022." So, we've got at least six months or so before this happens. "This switch will not have any impact on bidding behavior." So, not a huge change, but they've provided this quite useful table, which I would review if you're managing these campaigns, which lists all of the performance goals. So, something like "maximize conversions within a set budget" is the performance goal; they name what the old bidding strategy would have been for that, in this case maximize conversions, and then they list what the updated equivalent bid strategy is, in this case maximize conversions but with no target CPA specified. So, I'd go through that list, just to make sure you're aware of those tweaks. Again, we'll post the link at search.withcandour.co.uk.
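To summarise the mapping described in that table, here is a small Python sketch of the equivalences as quoted in the announcement. The data structure and function are my own illustration, and this only covers the strategies mentioned in the episode; check Google's table for the full list:

```python
# Old campaign-level bid strategy -> updated equivalent, per the
# Google Ads announcement discussed above (structure is illustrative).
UPDATED_EQUIVALENT = {
    "Target CPA": ("Maximize conversions", "target CPA set"),
    "Target ROAS": ("Maximize conversion value", "target ROAS set"),
    "Maximize conversions": ("Maximize conversions", "no target CPA specified"),
    "Maximize conversion value": ("Maximize conversion value", "no target ROAS specified"),
}

def updated_strategy(old_name):
    """Look up the updated strategy and its optional-target setting."""
    strategy, target = UPDATED_EQUIVALENT[old_name]
    return f"{strategy} ({target})"

print(updated_strategy("Target CPA"))
# Maximize conversions (target CPA set)
```

As the announcement stresses, each updated row is stated to bid identically to its legacy counterpart, so this is a renaming and consolidation exercise rather than a behavioural change.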
And that's everything we've got time for in this episode. Thank you for listening. I'll be back, of course, in one week's time as always, with episode 122 on Monday, the 2nd of August, racing through the year. I hope you've been enjoying the podcast. Please do tell a friend about it, subscribe, share it if you like it, and I hope you all have a lovely week.