
Episode 29b: September core update, Rich snippet controls and GSC data


What's in this episode?

Mark Williams-Cook and Rob Lewis will be talking about:

September Core Update: Looking at the first ranking change data that is coming in and discussing what you can do about core updates.

Rich snippet controls: Google's new tags that allow you to control how your rich snippet results appear for text, images and video.

GSC data: Google Search Console improving data refresh times.

Show note links:

Google announcement on rich snippet control: https://webmasters.googleblog.com/2019/09/more-controls-on-search.html

Developer docs for rich snippet control: https://developers.google.com/search/reference/robots_meta_tag

Google announcement of Search Console data refresh: https://webmasters.googleblog.com/2019/09/search-performance-fresh-data.html

Google tweet for September Core Update: https://twitter.com/searchliaison/status/1176473923833225221

Microsoft announcement of new targeting options: https://about.ads.microsoft.com/en-gb/blog/post/september-2019/audience-targeting-solutions

SISTRIX initial winners/losers: https://www.sistrix.com/blog/google-core-update-september-2019-first-results-visible/

Transcription

MC: Welcome to episode 29 of the Search with Candour podcast, recorded on Friday the 27th of September 2019. My name is Mark Williams-Cook and I am joined again by Mr Rob Lewis.

RL: Hello.

MC: And in this episode we are, of course, going to be talking about the Google September core update. We're also gonna be talking about the new rich snippet controls Google has released, as well as yet another update to Google Search Console. We were actually going to do some listener Q&A this episode - we asked for some of your questions at the beginning of the week and we got some - but then Google went and dropped a core update, so we're gonna cover that first and most likely do the Q&A next episode.

Well, here we are again Rob!

RL: Yes episode 29b.

MC: 29b. So we've already recorded this episode and there was a catastrophic failure of Audacity being able to save, so here we are again on take two, and we really know what we're gonna say this time.

So to kick off, I just want to say, firstly, a big thank you to everyone who has actually decided to subscribe and listen to this podcast. I mentioned on our social media the other day, we had an inquiry come through to us from someone looking for some PPC and they actually mentioned 'hey, we listen to the podcast and it was one of the factors that pushed us over the line to come speak to you', which was great because I didn't actually expect it to have that kind of impact. I'm well aware that, because it was PPC, Rob, it was probably things you've been talking about and not me.

RL: Happy to help!

MC: Before, yeah, before I let that go to my head - someone else did kind of comment and say they tested out the podcast; they would do like a session at work where they were getting people to listen to podcasts together over lunch, like an education thing, and Search with Candour was the only one that everyone, you know, anonymously liked, which was really cool to hear! I'm well aware there are probably people that don't like it as well, but either way I would, you know, love to have your feedback. You can get to us on social media or just email me directly. If you think of something you want to hear more of, let us know, or if you think something could do with improving, i.e. you think it's rubbish, let us know as well, because both sides are really, really helpful.

Cool, so the first thing I want to cover today is the Google rich snippet control update, and this was a blog post that they put live, again, on their Google Webmaster blog, which I'll link to in the show notes - the show notes are at search.withcandour.co.uk. That will get you to basically all of the episodes and you can go into any of them to get the show note links and transcriptions!

So, a word on transcriptions as well: the person who helped me with the transcripts - well, it's unfair to say helped, he was just doing them - Ayush, has followed the yellow brick road to London, so he's no longer with us doing these transcriptions, which makes me very sad. I haven't had time to do them myself. I know there are other options available - the way we were actually doing them was using some of YouTube's auto captioning to get it started and then rewriting.

That was a bit more helpful and included images and stuff like that. I don't have time to do that at the moment; it will happen again soon. I'm aware there's a little bit of a backlog - I didn't actually think that many people were using the transcriptions, and I've had three people contact me about the last episode saying 'yo, where are the transcriptions?', so that really surprised me! So the transcriptions will come back. I'm aware there are services like Rev that are super cheap - you can pay, I think it's like a dollar a minute, for human-done transcriptions - but they're not the same because they're not organised in the way we did it, and I just think if we're gonna do it, we should do it well.

Anyway, I went off on a tangent there. So the first thing we'll talk about is the Google rich snippet control update, which they posted an announcement about on their blog on Tuesday, so that's the 24th - it'll be last week when this podcast goes out. And this feature isn't live yet; Google said this is going live in mid to late October, so we've actually got about a month to implement this if you so wish. So rather than going through the Google announcement with you and picking it apart - it's quite a long post, in my opinion, that doesn't say too much - I'm just gonna hopefully give you a helpful summary of it.

So what this feature does is it allows you to control how your rich snippet data is displayed in Google search. Rich snippet data is when Google is kindly taking apart somebody's website - their website content - and putting it directly into the search result. So this is part of their multifaceted quest to become an answers engine, where they're just answering your question, your intent, immediately in search. Now, we've actually always had the option, for quite a long while now, to just opt out of having your website give rich snippet results, using the nosnippet tag. But what this update does is give an extra, more specific use to the nosnippet tag, and there are some extra tags you can use as well, so you get this more granular control.

So I'll go through them. Google's given us max-snippet, where you specify a number; this is a new meta tag that lets you specify a maximum text length, in characters, of a snippet for your page, so you can limit how much Google can show in your rich snippet result. You have to be quite careful, I think, on that, because there is a minimum below which Google won't show the rich snippet and I don't think that minimum is defined - feel free to correct me on that if I'm wrong - so keep it sensible. They've also given us max-video-preview; again you specify a number for this, and that number is the maximum duration, in seconds, of an animated video preview. So if you've got a video you can say: okay, I've got this really great, you know, one minute 20 video, but I only want you to show up to 30 seconds of it. And lastly there is max-image-preview, which takes a setting, not a number; that's a meta tag that lets you specify the maximum size of an image preview for images on the page, and you can say either none, standard, or large.
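
For reference, here's a minimal sketch of what those directives can look like once combined into a single robots meta tag. The directive names and values (max-snippet, max-image-preview, max-video-preview) come from Google's developer docs linked in the show notes; the little TypeScript helper and the example numbers are just an illustration, not anything Google prescribes:

function robotsMetaTag(
  maxSnippetChars: number,
  maxVideoSeconds: number,
  maxImagePreview: 'none' | 'standard' | 'large'
): string {
  // Combine the three snippet controls into a single robots meta tag.
  const directives = [
    `max-snippet:${maxSnippetChars}`,
    `max-image-preview:${maxImagePreview}`,
    `max-video-preview:${maxVideoSeconds}`,
  ].join(', ');
  return `<meta name="robots" content="${directives}">`;
}

// Prints: <meta name="robots" content="max-snippet:160, max-image-preview:large, max-video-preview:30">
console.log(robotsMetaTag(160, 30, 'large'));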

Being slightly cynical, I'm not sure this is, kind of, something webmasters have been champing at the bit to get hold of; I feel this might be a move to respond to the various kinds of antitrust cases and whatnot Google's facing at the moment, in regard to them essentially taking people's web content - which they've normally had to pay to get, or pay someone to write for them - and essentially circumventing their site, their platform, and just providing that for free on Google search. And actually monetizing it as well, because it's that content that people are trying to get hold of, and they go to Google to get that content and then Google's showing them ads, which is essentially monetizing other people's content without them getting a cut, obviously.

So they have positioned themselves in a win-win, in that they can say, well, you know, if you don't want us taking your content, you can opt out and actually you can control it at quite a granular level, but for those people who do want to do that the option is there. And of course the argument that always kind of circles around with this is that it's what users want, in that, you know, people want their answer as quickly as possible, as frictionless as possible. So, you know, users are never gonna be like 'oh, I wish I had to click on that site to get the answer', and if you don't want to provide that answer in the search, someone else will.

There is another change they've given us, with nosnippet becoming an HTML attribute now. So before, you could opt out completely with nosnippet, but they've got a new data-nosnippet HTML attribute, which means you can actually specify specific parts of your page or content that you don't want to appear in a rich snippet - which is quite useful - and they've given an example in the docs where they just want one particular part of a sentence not to appear in the snippet. Now, I was trying to think of some use cases for this before I came and recorded the podcast, and one that occurred to me, that I think we'll probably see happen just because I know how SEO people's minds operate, is I would guess that some people are going to use this kind of inline data-nosnippet to maybe prevent people from getting the precise answer they need in the rich snippet. So when Google shows the result you can kind of snip off the end bit, the actual bit of information that completes the puzzle, so they have to click on the rich snippet result to get to the page to read that.
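
As a rough illustration of that inline control: the data-nosnippet attribute is the documented part, while the helper function, the span wrapper and the example copy below are purely hypothetical:

// Wraps a fragment of page copy so it is excluded from snippets, by adding
// the data-nosnippet attribute to an inline element around it.
function noSnippet(text: string): string {
  return `<span data-nosnippet>${text}</span>`;
}

// Prints: Opening hours: <span data-nosnippet>call us to confirm bank holidays</span>
console.log(`Opening hours: ${noSnippet('call us to confirm bank holidays')}`);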

Now firstly, I'm not suggesting that's anything more than speculation - though I'm sure someone will do it - and secondly, I'm not suggesting it's a good idea. I think what will happen is, Google obviously has a way to measure what they think is a useful rich snippet or not, so they'll be looking at things like whether people actually need to click on it, whether they're clicking on other results because they're not getting the answer they need, or whether they're doing the search, reading the answer, and then from their next searches you can ascertain that they were probably satisfied by the information they got in that snippet. If you believe that to be true, which I think makes a lot of sense and there's evidence for, the logical step is that if you purposely make your snippet less useful, it's highly likely Google will replace it with someone else's snippet.

RL: Are you suggesting that people sometimes manipulate Google for their own ends, Mark? Is that what you’re suggesting?

MC: Something like that. So I would hope, so, I actually have a golden rule for SEO which I discuss with many clients, which is that, in my opinion, if you're doing SEO correctly, there is absolutely nothing you should do that detracts from the user experience. So that strategy - I would never recommend it, because to me, that detracts from the user experience and yes, while, as I kind of just mentioned, you may get more clicks in the short term, my view would be that in the long term you would lose that snippet and then you would be in a worse position than when you started, because you may have been getting some benefit before: people that want it in more detail, people being exposed to your brand providing that answer; the rich snippet likely would have been used, can be used, by other, you know, devices like intelligent personal assistants, so they can say, you know, this is the answer and it came from here. So if you make it less useful you're going to lose all of that. So, you know, 'manipulating' is an interesting choice of words - I mean, that's why it's called SEO, optimising, and not SEM, manipulating, I guess.

Another interesting thing about this new feature is that Google is treating it as a directive, not a hint. So Google tends to classify things like this - the ways you can, as a webmaster, specify things - as directives or hints; directives are generally things they just obey and hints are things they take under consideration but may ignore. So for instance, the canonical tag, where you can specify to Google that this page is pretty much the same - you know, page B's pretty much the same as page A, so kind of ignore it - Google says: fine, thanks for letting us know, but we're going to analyse both those pages, we're going to look at our own signals and we're going to come to our own conclusion as to whether we take that on board and action it or not. So this is actually a directive, which is kind of as you'd expect - it's your content, and if you don't want it shown in rich snippets, that's your prerogative.

Bear in mind as well, when we talk about rich snippets, they are very separate from what we call normal organic results. So rich snippets we'd generally refer to as position zero - they're the featured boxes at the top of the result most of the time - and the organic results are kind of the fast-fading one to ten blue links. They are quite separate systems, it seems, in that if you did decide to opt out of having featured snippets, you know, this isn't going to affect your normal organic ranking - although obviously you may get less traffic because you're not featured at the top - they're not things that seem to be connected in any particular way.

As far as we know as well, and as far as I know - again, happy to be corrected if anyone knows different - this new feature isn't currently supported by any other search engine, so Bing isn't supporting it at the minute; it seems to be something Google has come up with and is the only one offering support for these tags. In terms of implementation, I will link to the developer docs in the show notes at search.withcandour.co.uk, but for the tags we mentioned - so the max-image-preview, max-video-preview and max-snippet tags - you can actually deliver them through the X-Robots-Tag header, although that may not mean anything to you.

You may be more familiar with the on-page meta tags which can be used, so the meta robots tag, and that means we can inject them in the normal ways we do that, whether it's hard-coded into the page or whether you want to use something like Google Tag Manager to put them on with JavaScript. Although, again, as we've talked about before, if you do start adding these directives in with JavaScript, it means that Google needs to process the JavaScript before they'll be seen and therefore obeyed, and we've done experiments before where we have seen it can take over three weeks from when a page gets indexed until Google finally gets around to processing the JavaScript, because that's resource intensive for them. So if possible, if you're going to use them as meta tags, just put them directly on the page.
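
If the header route suits you better - say your pages come out of an application server and you'd rather not touch the templates - here's a minimal sketch using Node with Express. The X-Robots-Tag header name and directive syntax are from Google's developer docs; the framework choice, port and example values are just assumptions for illustration:

import express from 'express';

const app = express();

// Send the snippet controls as an HTTP response header on every page -
// equivalent to the meta robots tag, but set server-side so there is no
// reliance on Google processing any JavaScript first.
app.use((_req, res, next) => {
  res.setHeader(
    'X-Robots-Tag',
    'max-snippet:160, max-image-preview:large, max-video-preview:30'
  );
  next();
});

app.get('/', (_req, res) => {
  res.send('An example page');
});

app.listen(3000);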

So that's going live, as we said, mid to late October, so you've got three, four, maybe five weeks to prepare for that now.

We've also got a short and sweet update from Google Search Console - we've had a couple of these recently. So last episode we were talking about the breadcrumb update that's in Search Console, so you can diagnose any issues with breadcrumbs, and they did a blog post, again on the Google Webmaster blog, that announced fresher data in your site's search performance report. Unsurprisingly, Google said that this was users' number one feature request: improved data freshness.

So this update basically just means that we can now see data as recent as less than a day old, whereas previously, up until now, we've had to wait a few days - like two, three, four days - before data within Google Search Console was refreshed, which kind of made it difficult to use in some situations as a diagnostic tool. If, for instance, you're doing a site migration, you put the new site live and you're desperately trying to check, to the best of your ability, that everything's working correctly and you don't have any errors, having to wait another few days for a direct line from Google as to whether it's having issues is difficult, so it's really great we've got that. It's been combined with some other updates we've had over the last few months to Google Search Console - one of the big ones being longer data retention - so all of these things make it more useful as a tool to use daily.

We actually spoke in version A of episode 29 about shadow IT, which was a term I don't think you'd come across before.

RL: I haven't heard of it, no.

MC: And in fairness, I'm very new to it. Someone introduced it to me only a few months ago, and you know I've worked with computers pretty much all my life, but I wouldn't say I kind of work in IT. The way they described it to me - if you haven't heard of the term shadow IT as well - is basically when you get a problem within an organization that can basically be fixed by a technology, IT, software solution, but there isn't like an off-the-shelf package or SaaS or whatever to do it. Then people cobble together their own solutions or sign up to their own kind of third-party arbitrary tools to fix that problem, and the company as a whole kind of isn't aware that thing exists. And I certainly think there have been a lot of shadow IT solutions around Google Search Console in the past. I've seen all kinds of scripts and tools people use to archive the data, because to begin with you'd only get like 30 days' data and then it was gone forever, and this has been extended and extended. So having all these updates with fresher data, storing it for longer, certainly is gonna make it more helpful as a standalone tool, which is great.

We did see the sunsetting of the old Google Search Console interface a few weeks ago now, much to the upset of many SEOs; we haven't quite seen all of the features from the old Search Console migrated into the new one, and there are certainly some concerns about some of the data in the new Search Console - it's a little bit more opaque in cases. There are some interesting examples where figures seem to be very different, wildly different or just kind of wrong, so again that's probably, to be honest, a trend I think I've seen with Google over the years of taking a little bit of power away from webmasters but giving them different types of tool. But generally, I think it's a, you know, got-to-take-the-rough-with-the-smooth situation, and it's a useful update - it's really good we can get that data now.

And of course we are going to talk about the Google September core update. I'm kind of disappointed now that we've got this naming convention; the earliest kind of Google updates I remember were called stuff like Florida and Big Daddy and things like this, and we've been through some animal names - you obviously had the era when everything was pretty much about Penguin and Panda and Hummingbird and stuff like this - and we seem to have settled now on these very dry names. So there was the official announcement of the September core update, and I do appreciate Google giving us these updates - you know, they don't have to do this, and the kind of tradition before has been for Google to drop an update and not really say anything, and then lots of things change and people come together in collective panic and realise that there has been an update because everyone else is panicking or celebrating. So I do appreciate that they give us this heads-up, even if it isn't very long - it doesn't need to be long because basically there's nothing you can do, because they're not telling you exactly what the update is.

So the Search Liaison account, on the 24th of September, said: 'Later today, we are releasing a broad core algorithm update, as we do several times per year. It is called the September 2019 Core Update. Our guidance about such updates remains as we've covered before. Please see this blog for more about that.'

It does irritate me, as I've said before, that Google call it a blog and not a blog post, but hey. That tweet itself is even a copy and paste of the one we saw in June, where they said exactly the same thing but with the word June swapped for September. They've obviously had a change in policy where they are going to make webmasters aware ahead of time. There is something to take from this again, though, which is if you do read the blog post they link to, they talk about the fact that this is a core algorithm update, meaning if you find your site goes down in the rankings after this, it doesn't mean that there's a specific problem with your site you need to fix. There's this blurred line that I see danced across a lot, where people get confused between Google penalties and core updates, so I think it's worth just going over this once more. We did a whole episode as well all about Google penalties - I think it was number 16 - I'll link to it in the show notes if you want to check that out.

But there is an important difference here that's worth reflecting on. So if we have one of these core updates and you find that, oh no, you've lost loads of rankings, that's not because you are suddenly, specifically doing anything wrong. What it is, is that Google has tweaked their algorithm, obviously, to get closer to their end goal of the type of sites they want to rank for certain queries, and it's been deemed, basically, that other sites now fit those criteria better than you do. So you haven't moved down; everyone else, all those competitors, have moved up, and the result of that is the waters have risen around you, so you are now lower.

Whereas a manual action, or Google penalty, is when just your site is affected. From the inside, the effect looks the same - 'oh no, we've lost our rankings' - but the kind of paradigm is completely different, because no other sites were affected; if you get a manual penalty, that's specifically targeted at your site and you have been demoted, while everyone else has stayed the same. Whereas a core update applies to everyone, so it's everyone else going up, or at least more people going up around you, and you going down - the impact still feels the same. So I really think it's worth thinking about that, because I've seen lots of people after these updates be like 'oh no, we've lost rankings, what is it we need to fix, we need to do an audit, what can we fix?' and generally the answer is there isn't one specific thing you need to fix. You need to look at what your competitors are doing differently, why they're ranking better, look at things like the Google search quality guidelines and just think about what they're doing, because the likelihood is there's going to be a whole roadmap of stuff you need to improve over time.

So in our version A recording, Rob and I had a weird discussion about this - but what was it you'd asked me?

RL: I think I said, playing devil's advocate, that even though there isn't technically anything wrong that the advertiser is doing, they're still gonna want to react to it to get traffic up and to compensate.

MC: Yes if they've lost they're like…

RL: Yeah, yeah, like if they lost a search - for example, they have a search query that was generating revenue for them and that suddenly stops - they're gonna want to get that revenue stream back. How can they revert that?

MC: Yeah, so that's what we spoke about, wasn't it? So we were actually talking about - look, if that has happened to you, you need to kind of get it out of your head that you can just reverse this, because the algorithm has been updated; it's not something you've changed and it's going to be a combination of many things. Now, in terms of how you can approach this: we spoke to Tommy Chesky from The User Story a couple of episodes ago about how SEOs can improve sites by taking a user experience approach, because you'll hit a lot of the kind of things Google are looking for without even realising it, rather than trying to take this bottom-up approach of check-boxing things that you know at least impact it. As more information does come out about the update, it may be that we know it's looking more closely at certain factors, so it might help in prioritization, but the general approach needs to be 'look, it's an algorithm update, you can't reverse that, there's not going to be a sudden fix.'

You've essentially got two roads to go down. One is the kind of analysis of why your competitors are doing better than you - what's their user experience like, what's their content like, etc. - and building a roadmap for that, prioritizing it maybe around what we know about the algorithm update. The other is that there is some short-term stuff you can do: all the basics about, you know, marketing, digital PR, producing great content, doing outreach, getting links - upping the amount you're doing of that. You know, links still play a part in Google's algorithm; we still, day in, day out, see websites that win decent links get better rankings. So there are things you can do in the short term, but it's not this view of 'what's the problem to fix', it's more 'what can we improve and what order should we do it in'.

So we actually have got some data about the core update, which is why we kind of elected to speak about it - we tend not to talk about these things until we've seen something, and SISTRIX did. So I spoke to SISTRIX yesterday and they said they were putting a blog post out this morning, which they have, about the update. I really like what SISTRIX are doing at the minute; they've been really on the ball with interacting with the community and sharing their data on these updates. So, as usual, they've published a winners and losers list, so we can see, you know, which sites have really gained from this update and which sites have lost out.

Interestingly, overall, like we saw back in June, it looks like primarily health and news sites have been affected, which to me says it's likely to be a tweak of the core update they did in June. Another reason to think that is that in June we spoke about the Daily Mail site, much to some people's glee, having lost I think it was like 50% of their organic traffic overnight and about 90% of their Discover traffic, and it turned out that it was actually their SEO director on the Google help forums trying to get to the bottom of it; there was no help available and, you know, all the tools made it look like they'd just kind of suffered this traffic loss. Well, they have actually been a big winner in this update - it looks like their traffic has bounced back, doubled, so they seem to be back to where, you know, they believe they should be.

Other sites, as I said, have been affected too - so that's one reason I actually think it might be a tweak of the old core update; well, I say old, it was only June. But beyond the previous core update, the other winners and losers on the SISTRIX lists were quite interesting. So they listed three health sites, and two on the losers list were verywellhealth.com and organicfacts.net.

Now, Rob smiled when I read out these domains, because probably, like me, when you see domains like that and you're a heavy web user, you kind of get a sixth sense for the quality of the site you're gonna go to, based sometimes on the domains…

RL: I’m surprised that there wasn’t a .biz on the end to be honest.

MC: So I had a look at these sites and actually, you know, to be blunt - very sorry - Verywell Health and Organic Facts didn't look great to me. You know, they're health sites, and health sites fall under what Google call the Your Money or Your Life category of sites, which is a categorization of sites where it's very important that the information is correct or actual harm can be caused. So there's kind of a different level of vetting, if you like - or scrutiny, I should say - on those types of sites, because it is obviously important: if you're googling something and it gives you a terrible result, you know, telling you to do something medical that can cause you harm, Google needs to keep an eye on that.

So when I looked at these sites, you know, they didn't look terrible by any means, but they certainly didn't look like what I would expect to see from a high quality site in the medical area. So I was like, okay, maybe they've done a good job here, because certainly in the June update I was looking at the lists of medical sites that had big losses, and when I was going through them I was just like yeah, good, good, good, because some of these are really bad sites and, you know, what's in some of them essentially just isn't true, so I was like, they're doing a good job. So then I looked at one of the winners, which was patient.info - and obviously a .info TLD makes the hair on your neck stand up! So I looked at it and, to be fair, it looks really similar to the loser organicfacts.net, in that to me it looks like a pretty off-the-shelf, kind of template blog site; the categories and everything look like they've been designed primarily with search in mind rather than a user. Like, it's not a bad site, again, but it didn't look, you know, massively different at first glance, at least, to one of the losers.

One thing I made as an observation - and I'm not suggesting this is correlated, it was just an observation; obviously I can't say much with a sample size of three - is that the two loser sites, Verywell Health and Organic Facts, when I looked at them, both had affiliate links on them, and they were very upfront about it in fairness: they had a page explaining that they receive a commission for some of the links and here's how it works. But patient.info, which was the site that saw the big increase, didn't have affiliate links; they did have advertising, which is arguably the same thing, arguably not as well, but they didn't seem, at least on the look I took, to have affiliate links. And that just gave me some food for thought around, again, the category of Your Money or Your Life sites, health sites: users being able to trust content, trust recommendations, when the webmaster or the person producing the content is obviously getting commission for selling products or services.

The other thing of note in the SISTRIX summary of their blog post - and again I'll link to the SISTRIX blog post in the show notes at search.withcandour.co.uk - was that they tailed it off with: in Germany and Spain, we've also seen some examples from the travel industry, so travel-type websites. So again, this will be interesting as it unfolds, because we're still really early on. We tend, with these algorithm updates, to see big sites get impacted first, because they have much bigger footprints in the kind of link graph - normally there's some overlap with how link profiles are analyzed - so we tend to see the effect on big sites first and then it trickles its way down across the web. It's only been three days, so it'd be really interesting maybe next week to catch up and see what new data has come out, but again, like the last podcast we did on this, there's not a massive amount you can do right now, so just hold still.

When I was actually thinking about and researching the data on this update, I was thinking, oh, you're quite lucky, Rob, because you don't kind of have to deal with algorithm updates, but then I remembered we did have a conversation a few weeks ago about you dealing with kind of, like, behind-the-scenes Google Ads updates - algorithm updates to how Google Ads is working.

RL: Yeah, it's very difficult to explain, but when you've been working with Google Ads/AdWords for so long, you see things that used to work in the background no longer working, and I'm not talking about changes to the interface or changes to the behavior of website traffic; I'm talking about things that you used to do within the interface itself, in order to generate impressions or clicks, no longer working at all, so you have to stop doing that entirely because it literally doesn't generate a single impression.

So for example, about seven or eight years ago, it used to be good practice to add really long-tail, niche keywords to target, because they would be low traffic but high conversion, and all of a sudden those types of keywords you'd no longer be able to add to the account, because Google would put a block on showing ads unless there's a significant volume of people searching for them. So things like that suddenly, overnight, would stop working - all of these long-tail niche campaigns stopped generating impressions entirely, so you'd have to try a different tactic to capture the long-tail searches; that's just one example. But there are lots of things that would literally stop working, or not work as well, not drive as much traffic, so you'd have to adapt, and I guess you could call them changes to the Google Ads algorithm that happen in the background, which aren't announced, but if you work in pay-per-click day in, day out, then as a manager you'll notice it - you'll notice that that doesn't work anymore, 'I have to change that', and you have to react to it.

MC: The attitude on the organic updates tends to be Google will say, okay, we're making this update, whatever, and the chips are going to fall where they may, basically, with your websites, and it doesn't affect Google immediately and directly in the revenue that they generate. The quality of the organic results is paramount to Google - they did over a hundred billion dollars last year in advertising revenue, primarily because Google - sorry, because people - choose to use Google because it gives good results, so if it gave bad results obviously they wouldn't have the audience for the platform.

But in the short term, if, you know, the Daily Mail loses or gains traffic, it kind of doesn't affect how much revenue Google's generating, whereas I guess if they were to be very specific about the updates they were making, maybe to the Google Ads platform, that's gonna change account managers' behaviour - so if they now say, oh, we're looking at this, or this works like this, you're obviously going to optimize towards that to the best of your ability - which could then have some interesting effects, I guess, on how much revenue they generate very immediately.

RL: Absolutely, and I know in one of the previous podcasts we discussed one of the - I'm going to call it a shadow update - when the, the…

MC: Like our shadow IT?

RL: Exactly. They made the update to the location targeting where they basically broadened out the location targeting options, so you can no longer just target people in a specific area; it broadens it up to target people who have also regularly or recently been in that area. That is what I would call a brazen update - one that, you know, is quite an obvious change that eventually will get announced - but there are these shadow updates, I guess, that Google probably make behind the scenes that don't really impact advertisers a great deal, or maybe don't even impact the pay-per-click manager a great deal, but will influence revenue for Google. I guess they probably have a team of people there whose sole job is to incrementally make these minor changes to the Google Ads algorithm that will maybe increase revenue by 0.025%, but when you're talking about billions…

MC: Which is still more money than I’ll see in my life!

RL: Exactly, yeah, so, so yeah, these are things to consider.

MC: Yeah, so, I mean, that makes perfect sense that they'd have people essentially looking at that - you know, if it's not negatively impacting advertisers, there's no reason why they wouldn't want to optimize their platform to generate them as much money as possible, because that's the business, you know, that they're in, isn't it? That's the reason why that platform exists: to generate the money.

So, as expected, this has taken quite a while, so we're gonna end the show here. The plan is, unless Google releases some great new features or decides to do a beginning-of-October core update, we'll do a Q&A episode next Monday, so that will be the 7th of October - right, yeah, 7th of October. So if you do have any questions about SEO or PPC that you would like to hear discussed, or you have some specific SEO questions and you're struggling to get a decent answer to them, we can do our best to try and give you a decent answer.

Feel free to email me at [email protected] or, you know, find me on Google - it's very easy to do - so please give us those questions. Otherwise, as usual, the show notes are online at search.withcandour.co.uk

Did I talk about the transcriptions in this episode, in this version B? I did already, yeah. I'm having trouble remembering what was in version A and what was in version B, what we've covered, what we haven't.

RL: It's like we're in a parallel universe or something.

MC: Okay, so we're gonna wrap it up and I hope that you will subscribe if you're not already, and I hope you will continue listening, please give us feedback and I hope you have an absolutely brilliant week. I'll see you later, goodbye!
