Candour

Episode 47: Removing content from Google, Google Ads delivery and ISP data

What's in this episode?

In this episode, you'll hear Mark Williams-Cook and Rob Lewis talking about removing content from Google, including the new tools within Google Search Console for handling content removal, as well as Google Ads delivery. They will also discuss what ISP data is and why it's been removed from Google Analytics.

Show notes

New removals report in Search Console https://webmasters.googleblog.com/2020/01/new-removals-report-in-search-console.html

DMCA request tool https://www.google.com/webmasters/tools/dmca-dashboard?pli=1

Transcription

MC: Welcome to episode 47 of the Search with Candour podcast! Recorded on Friday the 7th of November 2020 (Mark means February, he’s quite ill! - Beckie) My name is Mark Williams-Cook and today I am lucky enough again to be joined by Mr. Rob Lewis.

RL: Hello.

MC: And today we're gonna be talking about removing content from Google search results, Google Ads delivery changes and ISP data.

Before we get going, Rob, firstly, I'm really pleased you're here - I think this is the first one we've done together in 2020. As you might be able to hear if you're listening, I'm at the tail end of the flu and my voice isn't quite working correctly yet, so I'm really pleased you're here, Rob, to talk about Google Ads.

RL: I think you might need to gargle some TCP.

MC: Ok, so let's start talking about... there is a new removals report in Google Search Console, which is pretty interesting. There's a post that I'll link to in the show notes, which are at search.withcandour.co.uk and will take you to the official Google Webmaster blog. On Tuesday the 28th of January they published a post called 'New removals report in Search Console': “We're happy to announce that we're launching a new version of the removals report in Search Console, which enables site owners to temporarily hide a page from appearing in Google search results. The new report also provides information on pages on your site that have been reported via other Google public tools.”

Two important things to pick out there before moving on: Google is very specific, and always has been, that this removal tool within Google Search Console allows you to temporarily hide a page or set of pages from appearing in Google search results.

So firstly, this means you are only hiding the result from appearing, you're not actually removing it from the Google index. There are specific ways, which I'll mention in a moment, that you need to use to actually remove a page from the index. The difference with this tool is that it works pretty much instantly - even if you set pages as noindex, for instance, that wouldn't have an immediate effect until the pages are crawled again, whereas this tool lets you quickly remove results. It says temporarily because the idea is that you might want to remove a page and then actually do something like set that page to noindex to stop it appearing; if you've got pages you don't want appearing and you use this tool to “remove them”, all that will happen is they'll be temporarily hidden, and they'll come back unless you do something about it.

They go on to say, “there are different tools available for you to report and remove information from Google. In this post we'll focus on three areas that will be part of the new Search Console report: temporary removals, outdated content, and SafeSearch filtering requests.” So a temporary removal request is a way to remove specific content on your site from Google search results; for example, if you have a URL you need to take off Google search quickly, you should use this tool. To give an example of this, probably the most common one I've seen is websites having their development or staging sites indexed in Google. That's when a company is building a new website and has a version online that they haven't password-protected but is in a different area, so it might be on something like dev. and then their domain name. Google is pretty good at discovering these and actually indexing them, and it can lead to a couple of problems: firstly, you may not actually want people poking around your unfinished development site - competitors, or customers landing on it and getting confused - and secondly, you may start creating duplicate content problems, where you've got multiple versions of the same page, the old and new version, in Google. So this is an example of when you might want to quickly get these pages out of Google.

The post goes on to say, “A successful request lasts about six months, which should be enough for you to find a permanent solution. You have two types of requests available: one, ‘temporarily remove URL’, will hide the URL from Google search results for about six months and clear the cached copy of the page; two, ‘clear cached URL’ clears the cached page and wipes out the page description snippet in search results until the page is crawled again.”

So the first one there, temporarily remove, I think should really be called 'temporarily hide URL' to describe what it does better, but that's for the case we just mentioned, where maybe you had your development site indexed and you don't want it indexed. The tool says it will last about six months, so it'll be hidden for about six months, which gives you a few months to sort out a proper solution. The proper solution, if you want things not included in Google's index, is the noindex tag, which can be set on a page or through the HTTP header, and tells search engines when they crawl that page not to include it in their index. One of the most common mistakes we see is developers using robots.txt to exclude parts of the site, thinking robots.txt is going to stop them from getting indexed, and that's not actually true. Robots.txt controls crawling, so those pages shouldn't then be crawled by search engines, but if they're discovered through other means it is still possible that they will be indexed. You may have seen results in Google at times where it says the meta description isn't available - that's normally when a page blocked in robots.txt has been indexed.

The other thing to bear in mind is that your robots.txt is publicly viewable, so if you're trying to hide sensitive things like development sites behind it that can otherwise be accessed, someone can just look at that file, read it and go to those pages. So you definitely want to use the noindex tag, and you can use this temporary removal tool to get the results out of Google quickly.
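To make that distinction concrete, here is a rough illustration of the directives involved; the /dev/ path is just a placeholder for a staging area.

In the <head> of a page you want kept out of the index:

  <meta name="robots" content="noindex">

Or the equivalent HTTP response header, useful for non-HTML files such as PDFs:

  X-Robots-Tag: noindex

By contrast, a robots.txt rule like the one below only blocks crawling; the URL can still be indexed if Google discovers it through links, typically appearing in results with no meta description, and the file itself is publicly readable:

  User-agent: *
  Disallow: /dev/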

The clear cached URL option might be useful, for instance, if you're like me and you're always plagued by spelling mistakes: if you've got a spelling mistake in the Google cache that's showing prominently in the search results, you can use it to flush that out, but the page will then need to be crawled again before it appears in the results. Really, really handy tools there.

The second one is outdated content. These next two actually give really good insight into activity around your site that's been triggered through public tools. The outdated content section provides information on removal requests made through the public remove outdated content tool, which can be used by anyone, not just site owners, to update search results that show information which is no longer present on a page. So this is really helpful where you might be ranking for a search result because Google thinks something is on your page that isn't there any more, and somebody uses this tool to flag it - you can now see those URLs. Similarly, there is also now a tab for SafeSearch filtering: the SafeSearch filtering section in Search Console shows a history of pages on your site that were reported by Google users as adult content using the SafeSearch suggestion tool. URLs submitted to this tool are reviewed, and if Google feels the content should be filtered from SafeSearch results, those URLs are tagged as adult content. So now you can actually see which of your URLs have been tagged as adult content, because they won't be appearing in results when SafeSearch is on.

So those are the three sections now in the removals tool. The removals tool, if you log in to Google Search Console, should be under Index on the left-hand side, just below Sitemaps, and you'll have those three tabs at the top. As the post suggests, there are lots of different ways to get content removed from Google, lots of public tools, and one point of interest I wanted to add on to the end of this: I've recommended to lots of people, websites, friends and clients before to use the DMCA removal tool in Google, which again I'll link to in the show notes. The DMCA is the Digital Millennium Copyright Act, and it's a tool Google is legally obliged to review when you report URLs that are copying your content. So if you have copyrighted content - you've written some content, you've got images, for instance - and it's appearing in search results, you can fill out this form that says: here's my original content, here's the URL that's copying me, and I'm signing this legal process to say that I am the copyright holder. I've always recommended people do that because in my experience Google has responded quite quickly in removing those results from search, and they'll remove them unless there's a counterclaim where someone says it actually is their content. But the other thing I found out last week is that Google has said that if a site has multiple upheld DMCA requests against it - where multiple people have flagged that the same site is copying content, Google has investigated, and the requests have been upheld - it's actually used as part of the ranking algorithm. I assume that means the site is going to rank worse, not better, obviously, if it is copying content.

I did have someone, when I was talking about this tool earlier in the week on LinkedIn, ask whether this is going to cause people to flood sites with DMCA requests to try and cause some trouble, and one thing I did point out to them is that it's actually illegal to knowingly submit a false DMCA claim, so if you're getting lots of these claims and having to counter them, it might be worth exploring legal options there. So I think that's all being handled quite well; it's another good addition to Google Search Console that will hopefully help us.

So, changes to ad delivery in Google Ads is something we're going to talk about now. I know from looking at your notes, Rob, that last year Google announced it was phasing out its accelerated budget delivery.

RL: Yeah.

MC: And you raised this with me earlier in the week, actually, about these changes. I'm really pleased we're having this podcast because Rob did actually send me a note about this change, and I've had a particularly busy couple of weeks and haven't had time to read through what he said to me yet, so this is my first time hearing the explanation of these changes.

RL: You make it sound like it was a polite note but in reality it was a, 'oh wow, it's me, Mark, you won't believe what Google has changed now'. Yeah, so I think it was back in August last year that Google announced it would phase out accelerated budget delivery. In case anyone doesn't know the difference between the two delivery types: there's standard ad delivery, which is where Google staggers the ad delivery throughout the day - Google is in control of when those ads show and decides when it thinks it's best to show them.

MC: So this is basically, if you don't have a high enough daily budget for your ad to show all the time, Google would spread your ads out during the day. So if you had a £50 a day budget for something with thousands of searches a day, it spreads delivery out so that all your ads aren't shown between midnight and quarter past midnight.

RL: That's correct. Although in reality, even if you have a low budget, Google may decide to spend it within the first few hours of the morning. It's just whatever Google decides, essentially, so you have no control over it.

MC: So is that the most common answer to Google Ads clients asking, “why can't I see my ad when I type such-and-such?”

RL: ‘Because Google’. And the other option, which was my favourite and which I used to use in nearly every single campaign I ran, was called accelerated delivery, which is where, I guess you could say, the floodgates are constantly open: ads show all the time, as often as possible, until the budget is depleted - that's assuming you have a budget cap in place. It might be that you have a really high budget and you don't reach it, but regardless, the ads will always show whenever someone's searching for your keywords, assuming you're bidding high enough.

MC: So can I ask, out of curiosity, before you carry on, why would you pick accelerated delivery for almost all your campaigns over standard? Because if I remember correctly, standard delivery was the default from Google, and I know we've discussed before there are lots of Google defaults we're not fans of, but why would you choose accelerated over standard?

RL: Okay, I'll try to answer this as quickly as possible because the answer is complicated. Basically, I want the ads to show as often as possible, and even if I had a limited budget I would want that floodgate to be constantly open, as wide as possible. My experience with standard delivery was that Google would sometimes choose to stop or cap ad impressions for no reason whatsoever, and I would always get more traffic - I'd always be more likely to reach the budget I'd set - if I set it to accelerated.

So let's say I had a campaign budget of £100 per day: if I set it to standard, on average it would maybe reach £70 or £80 for some campaigns, whereas with accelerated I could guarantee it would nearly always reach my target. So that was initially why I chose accelerated over standard. But there are other reasons as well, and the main one was that it was so much easier to control the flow of traffic. One of my favourite things used to be to bid really, really low - as low as possible - for the keywords I'd chosen, but ensure I had accelerated delivery, so that despite having low ad visibility, the sheer volume of traffic potential would ensure a constant stream of clicks at a low cost but with high volumes of traffic. So I could generate a really low cost per sale and cost per lead and generate decent ROI, and if I wanted to increase the flow of traffic even more, I could just slowly increase bids. That was how you would control traffic flow, essentially, and how you would maintain and control your cost per lead.

MC: That makes sense. So accelerated is guaranteeing you are filling in the cracks in the inventory, even on more expensive search phrases.

RL: Yeah. However, recently - and I'd be really interested to hear from other people if they've noticed this as well - I've noticed there's been a change in what I can only describe as the ad delivery algorithm: when Google decides to show adverts and generate impressions. What I've found is that if you set a low budget, or any budget that's not within what Google may deem an acceptable level, ads will just not show, or they will show but the impressions being generated are so low as to make the campaign completely pointless to run. And what I've found is that if you increase the budget - if you double the budget - then suddenly the flow seems to reach a threshold where ad delivery starts to take place and kicks in how I would expect it to. So basically, the more you're prepared to increase your budget by, the more traffic potential there is.

So what I've been finding recently is that where I've got campaigns with a low budget - let's say I have a particular campaign where I only want to spend £10 per day - if I set it to £10 per day, using Google's new “improved standard delivery”, I won't generate any clicks, sometimes for 24 hours. But then if I double the budget, I get what I would deem to be standard, normal delivery taking place and I suddenly start getting traffic - and interestingly, I will quite often meet my full budget, even with the doubled budget set. So the more I increase the budget by, the more traffic I'm generating, and the lower I set the budget, sometimes I don't get any traffic at all. So where I've got campaigns where I've set a low budget for a reason, I'm having to come up with workarounds, such as doubling the budget on some campaigns to get them running in the first instance and then, once they've accrued a certain amount of spend, having it send me an email notification or automatically pause the campaign. So I'm having to find workarounds just to get certain ads to show and certain campaigns to run now.
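As an illustration of the kind of workaround Rob describes, one option is a Google Ads script along these lines, which pauses a campaign and emails you once the day's spend passes the amount you actually want to spend. This is a rough sketch rather than a tested solution - the campaign name, spend limit and email address are placeholders - and you would schedule it to run hourly.

// Sketch of a Google Ads script: pause a campaign and send an email
// once today's spend passes a chosen threshold.
function main() {
  var CAMPAIGN_NAME = 'Low budget campaign'; // placeholder
  var SPEND_LIMIT = 10;                      // in account currency, placeholder
  var EMAIL = 'you@example.com';             // placeholder

  var campaigns = AdsApp.campaigns()
      .withCondition("Name = '" + CAMPAIGN_NAME + "'")
      .get();

  while (campaigns.hasNext()) {
    var campaign = campaigns.next();
    var costToday = campaign.getStatsFor('TODAY').getCost();
    if (costToday >= SPEND_LIMIT && !campaign.isPaused()) {
      campaign.pause();
      MailApp.sendEmail(EMAIL,
          'Campaign paused: ' + campaign.getName(),
          'Spend today reached ' + costToday + ', so the campaign was paused.');
    }
  }
}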

MC: So it's like the threshold to join in the delivery inventory has been raised, basically.

RL: Yes, absolutely.

MC: And that previous tactic of using accelerated delivery to get in between those cracks in, I guess, Google's ad scheduling - that inventory is closing.

RL: It's really interesting as well, because in one of the previous podcasts we discussed optimisation score and how one of the biggest impacts on optimisation score is the amount of budget that you set. Now, I think it was just the other day I was looking at a campaign of mine that had a 50% optimisation score, and I went into the recommendations tab for that campaign and it said to increase your budget - and if I was to do that, it would go up to 96%.

MC: That is a high percentile improvement there!

RL: So while optimisation score doesn't have a direct bearing on campaign delivery, in this instance, in an indirect way, it kind of does - Google is saying, look, if you increase your budget you're going to start getting traffic. But that shouldn't really be the case; the amount of budget you set shouldn't, in my opinion, dictate that initial flow of traffic that you get. So this is just an observation I've made over the last couple of weeks, and as a result of it I've had to re-optimise a lot of my clients' accounts to factor it in. So I think if anyone has noticed a lack of traffic in some of their campaigns, or just erratic traffic generation, maybe one of the things they should look at is whether it could be attributed to this delivery algorithm change, if indeed that's what it is.

MC: And one way to test that would be spiking your budget in chunks, rather than setting the lower daily amount.

RL: Yeah, trial increasing the budget. Another thing maybe worth doing is, if you're using smart bidding and you've got a bidding cap in place - a max cost per click bidding cap - maybe as a test remove that and see what Google does with it, or maybe even trial manual bidding for a while; and vice versa, if you're using manual bidding, maybe try automated bidding. Just test and see what you can get away with. But if you do increase your budget, obviously bear in mind that Google may suddenly decide it wants to spend all of that budget. So don't assume, just because you've had a really slow-traffic campaign for the last month or so, that doubling your budget isn't going to make any difference - it may very well spend that budget now.

MC: ISP data - internet service provider data - in Google Analytics is something we're going to talk about now. We don't normally touch on analytics here, but I guess it's very related to search marketing, and internet service provider data has been in Google Analytics for quite a long time now. I got a message from you again, the other day…

RL: “I’m not happy, mum, there’s been a change!”

MC: ... saying that it's gone right?

RL: Yeah. Well, that seems to be the case!

MC: And actually, about half an hour after you said this, I showed you something I came across just randomly on my Twitter feed - a chap called Gildner who said, ‘we're no longer seeing service provider (ISP) within Google Analytics, it says no longer supported, does anyone else know about this?’ So the kind of message he's getting is ‘no longer supported’ against the Service Provider dimension - the names of the internet service providers (ISPs) used by visitors on your site - and this data has been quite useful, especially for B2B campaigns, hasn't it?

RL: Yeah, it's transformed how I approach B2B analysis actually.

MC: Do you wanna talk through what I guess people, in general, have been using this data for in terms of Google Ads and the impact?

RL: Sure. So a lot of large businesses will have their own registered broadband line - they'll have a business leased line - and that line will normally be registered in the name of the business. Essentially, for every visitor driving traffic to your website, you can view the name of their service provider in Google Analytics. So for the small percentage of businesses that have a registered leased line, you can actually see which businesses have visited you.

Now there are all sorts of services online that offer you this - they're all paid-for services - but for years Google Analytics has offered this data for free. And there are so many valuable things from a pay-per-click perspective: I like creating remarketing audiences for people in certain sectors, so I'd always have a segment created for people who work in councils or government, or people who work in healthcare, and then I can show tailored ads to those people for remarketing purposes. It's also been really valuable for our lead generation clients who want more of an understanding of how certain big leads they've received discovered them, because it's not just tracking through pay-per-click - it's through organic channels, email marketing, and how those big corporations discovered them. So it's been a really valuable report in Google Analytics.

MC: I couldn't find any more information about this - as in, I did some searches and I can't find any statement from Google saying they've removed this data, or why, or any plans to do so. I wouldn't assume it is a bug though; I'm guessing it has now been removed?

RL: It's interesting though, because I noticed it when I started getting a lot of ‘not set’ data flowing through into my service provider reports.

MC: Which is the same as we got when they removed the organic keywords, so they went to ‘not set’, didn't they? Oh, ‘not provided’, sorry.

RL: Yes. So ‘not set’ normally comes through as an error of some description, or when the information you're pulling through into the report isn't compatible with what you're requesting. But the message that you shared with me the other day from Twitter had an error message, and I haven't got that error message on my Google Analytics account.

MC: So I think that actually came from them hovering over the dimension named Service Provider.

RL: Interesting.

MC: So if you hover over the little exclamation mark.

RL: Yeah, that's interesting. I’ll try that.

MC: You carry on, I’ll load up analytics and let you know.

RL: If that is the case, to me it sounds like a given. I wonder if there have been complaints, because - I hate the word GDPR, it is so dangerous, we've got to be careful what we say - but I know in the GDPR world there are lots of questions as to whether or not that kind of data fits in with the whole GDPR thing.

MC: Yeah, I was thinking about this because - you'll have to excuse me, I'm trying to browse Analytics while speaking here - the thing that interested me was always that IP addresses were classed, or are classed I should say, as PII, personally identifiable information, under GDPR, which is why we have to anonymise IP addresses with GA and why we're not allowed to store them in server logs, etc. And that interested me because when there have been court cases, they have thrown out the idea of using an IP address to identify an individual, because obviously IP addresses, firstly, are shared - so my IP address in six months' time might be someone else's - and obviously other people can use the same network, because your IP address normally comes from the box that's connected to the internet, whether that's a router or the networking infrastructure, so it might be one, two, ten, fifty, a hundred people sharing that same IP address.

So to take that a step further, to ISP data, surprises me, and like you I'm not a GDPR solicitor - it's not my area of expertise - but it would surprise me if that's the law behind this.

So in our Analytics we've got the same notice - if we hover over it, it says no longer supported - so we're guessing (shrug emoji) that it's gone for good.

RL: Yeah… going back to GDPR, I have seen instances of people who have a leased broadband line registered in their own name, and obviously you shouldn't use Google Analytics to find personally identifiable information, but in those cases that person was effectively giving you that information because they'd registered their broadband line under their personal name, which I just thought was interesting.

MC: Yeah, that's interesting. Obviously the terms and conditions of Google Analytics are that you're not allowed to store personally identifiable information in Analytics, so I've seen cases where email marketing has used a person's email address, for instance, in the UTM campaign tracking and then got into hot water with Google because they're storing people's email addresses.
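As a point of reference on the IP anonymisation Mark mentioned, in Universal Analytics it is a single flag in the tracking snippet; below is a minimal gtag.js sketch, with the UA property ID as a placeholder.

<!-- Google Analytics (Universal Analytics) via gtag.js with IP anonymisation -->
<script async src="https://www.googletagmanager.com/gtag/js?id=UA-XXXXXXX-1"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){ dataLayer.push(arguments); }
  gtag('js', new Date());
  // anonymize_ip drops the last octet of the visitor's IP before it is stored
  gtag('config', 'UA-XXXXXXX-1', { 'anonymize_ip': true });
</script>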

RL: So I don't know what's gonna happen now, but presumably there'll be other service providers that offer this service. I know there's some big companies out there that offer this, it all depends on whether Google have taken this step because they've had some legal consultancy about it.

MC: They're just fed up getting fined every ten minutes.

RL: Yeah, there’s that.

MC: Yeah, I guess we will update you on this podcast if we do find out any more about that. So don't worry if you use ISP data and you've started noticing it's no longer available - or if you use it and haven't checked recently, if it's coming up as ‘not set’, hover over the dimension and see if it says no longer supported. It's not just you. We'll see if we can shake some Google trees, get some answers and bring them to you later.

I think that's everything.

RL: I think so, you can have a Lemsip now.

MC: Well, thanks so much for joining myself and Rob. We will be back again in one week's time, which will be February 17th 2020 - going through the year very quickly. Please do subscribe if you enjoy the podcast; whether you're listening through our website or through an online player, it's available on pretty much every podcasting platform I could find to get it on, so look us up, do subscribe and we'll see you in a week.

RL: Bye.
