Episode 35: Big site SEO with Andrew Smith

What's in this episode?

Mark Williams-Cook speaks to SEO expert Andrew Smith about large site SEO: what are the differences between small site and large site SEO? What tactics and strategies are particularly effective for large sites? How do you prioritise? Is there much cheating going on between the big brands?


MC: Welcome to Episode 35 of the Search with Candour podcast! Recorded on Thursday the 7th of November 2019. My name is Mark Williams-Cook and today I'm joined by Andrew Smith.

AS: Hello.

MC: Andrew is an SEO consultant, founder of Olivepods and ex-Head of SEO for Expedia, CheapFlights and eDreams - some of the biggest names on the web! We're going to be talking to Andrew today all about big site SEO, the challenges and the things he's learned over, well, over a decade in SEO.

Andrew thank you so much for coming from Barcelona, to a slightly uncharacteristically cold rainy Norwich. It's 8 degrees, how does that make you feel?

AS: Well thank you for having me, it's good to be here. I'm not really sure it's uncharacteristic though is it?

MC: I'm trying to sell it, that it’s uncharacteristic! Normally, it's almost, you know you couldn't tell the difference between here and Barcelona!

AS: Yeah on Monday it was about 25 degrees in Barcelona. It was nice, here it's freezing cold and raining but it's all good.

MC: You're gonna do a talk tonight at Search Norwich which is brilliant, all about big site SEO and I would say big site SEO is something you particularly have a lot of experience in. Do you want to just define what you mean when you say big site SEO? Whether that means literally, like a site with lots of pages or lots of traffic or both and what kind of scale are we talking about?

AS: Sure, so yeah, I suppose my definition of a big site is probably anything over like a hundred thousand pages, and there's quite different challenges that you deal with. In terms of traffic, I don't know, you're talking like millions and millions of visits per month, so it varies from market to market. But I think it's also if you work at a company that maybe has a site in every market - perhaps it's got like hundreds of markets - it's not only one site that has a million pages, it's maybe 40, 50, 100 sites that have a million pages, so there's different challenges that come with that. For me that's online travel, that's where most of my experience is; generally it's websites that are around about the 1 to 2 million pages mark.

MC: That is big. So what would you say are the main differences between big site SEO and small site SEO?

AS: It's the scale of problems that you have to deal with. So for example, how do you figure out content? If you've got a small site with less than a thousand pages, you can figure out the content that's going to go on most of that site - maybe it's going to be a lot of blog content. But when you've got maybe 50,000 routes, you know, flights from X to Y, how do you figure out content for that? How do you figure out exactly what's going to go on those pages? What's going to be good for users? How are you going to do internal linking? If you've got a thousand page website it's quite easy to do internal linking and maybe get everything like four clicks away from the home page, but how do you do that when you've got a million pages to deal with, or even more than that, you know?

So the fundamental difference is, it's the scale of the problem that you're dealing with and generally it just means that it's more challenging, it's kind of heavier lifting that has to be done, you need to come up with much more scalable solutions to different problems.

MC: Okay, so I think what we're circling around here is that with some of the technical challenges on big sites, the size of the problem scales with the size of the site. So if you've got a thousand page site and your canonical tags aren't quite right, or your internal linking is not spot-on, it's not going to make a huge difference, but if you've got half a million pages, that problem scales with all of those pages and becomes a very big problem.

AS: It does, yeah. I think the other thing to consider as well is that quite often a large company might have lots of old technology - a good example is when I was at eDreams. eDreams as a company have bought lots of different brands, and each of those brands has different technology, so trying to harmonize that is a challenge in itself.

So maybe, yeah, you might have a whole bunch of old pages which have got canonicalization issues or internal link problems or really poor content - it could be a whole host of different things - but the challenge is also: okay, do you as a company actually have the development resource to properly work on that technology? Chances are maybe you don't, right? So it could be that you've got some old tech that's kind of sitting around, it's still capturing traffic, and it's like, how do you deal with that as well? So yes, it's the scale of that problem across many, many pages and how do you fix that, but then some of it is also an actual technical issue, in terms of the platform it sits on, what it's written in, and the level of understanding of that. Also, some of those technologies that may be a bit older are a lot less flexible, so it's physically harder to actually do things. Some companies are great - they have a CMS for everything, it's all on one platform, it's super streamlined and easy to work with - and as an SEO in a big company, that's what you want, that's what you're aiming for.

MC: It's rarely the reality.

AS: Yeah exactly, I don't think it is the reality for most. It's a lot more hard graft - fixing stuff, lots of smaller tickets to get things done, because the tech is not as flexible as you'd like it to be.

MC: I think that's definitely something agencies miss sometimes when they try and engage with or help these larger sites, coming from the outside, not knowing the technical debt or the history. I've seen people kind of scoff and be like 'oh well, they haven't even done this' and 'they haven't even done that', but actually it's not because there are incompetent people or idiots working there, it's because there's this multi-layer of challenges; it might be internal politics, it might be technology, it might be resource. They know it needs to be done, it's just that there are other, bigger fires that need putting out first.

AS: Yeah, exactly that, and I think some of it is that you also wrap up some of these fixes - like, the SEOs within my teams, we know about the issues that are there, it's just that we've got some other, bigger challenges that we're dealing with. For example, I think about a set of pages that we had on Opodo in Germany. There was a set of holidays pages - holidays to a particular destination - and a lot of that content was coming from a third party, I think it was TravelTainment, coming in on a set of pages which had been around for ages, sat there, with fundamental problems in how we dealt with them.

But we know that in parallel we're working on our own set of pages, right? We're working on a set of landing pages to replace these. So it's like, okay, once we've migrated and got those pages ready, connected to a CMS that's more scalable, we're going to solve all those problems. So a lot of it is identified and understood and it just becomes a big juggling act in terms of what you prioritize and what you focus on, and that could be both at a technology level and at a market level. So we try to tackle problems in a scalable way - we try to solve an issue in a particular way that we know will then solve it across 40 markets, rather than just trying to solve it on one or two.

MC: You talked about identifying some of these issues with large sites, and there are the kind of run-of-the-mill tools that every SEO will use, like Google Search Console, obviously Analytics, desktop tools like Screaming Frog and Sitebulb, and then you've got platforms like SISTRIX and SEMrush. Are there any particular tools that you go to, or that have been particularly helpful, with big site SEO? Because desktop tools obviously aren't suitable for crawling millions of pages - your laptop will melt trying to do it - and even with Google Search Console, the interface itself isn't really up to getting to the bottom of issues when you've got that many pages. So how do you tackle that and what tools have been helpful?

AS: Yeah, so I think you can use some desktop tools if you think about it - don't try to identify every single possible issue, that's kind of my advice. Separate the site out into sections and think, okay, look, we know that we've got issues with a particular section - maybe it's your blue widgets section. I'm going to take that section of the site and crawl it, and I know that roughly there's 20,000 pages there. In travel it can be quite easy, because usually there's some sort of geography data, so you take the geography data and you can estimate how many pages are going to exist based on that.

So you can use tools like Screaming Frog if you restrict the part of the site that you're looking at and limit that based on the URL structure. Then there's a couple of other things. One, I would say, is to look for a decent log file analyzer - something like Botify, which I've used in the past and it's good; it's expensive but quite often you're going to get a lot of value out of it. Then I'd say it's custom tools, right? So, like you talked about Search Console: we would build a little tool that would sit on top of that to extract data out of it, so that we can log it and keep it over a longer period of time, and we can then do year-on-year analysis and assessment.
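The log-file idea Andrew describes can be sketched in a few lines: filter server access logs down to search engine bot hits and aggregate them per site section, so crawl activity on a million-page site becomes a handful of numbers. This is a minimal illustration, not how Botify or Andrew's tools actually work; the log lines, paths, and section names are hypothetical, and it assumes the common combined log format.

```python
import re
from collections import Counter

# Matches the request, status and user-agent fields of a combined-format log line.
LOG_LINE = re.compile(
    r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3}) .*"(?P<agent>[^"]*)"$'
)

def crawl_stats_by_section(log_lines):
    """Count Googlebot hits per top-level site section and status code."""
    hits = Counter()
    for line in log_lines:
        m = LOG_LINE.search(line)
        if not m or "Googlebot" not in m.group("agent"):
            continue  # ignore non-matching lines and ordinary user traffic
        section = "/" + m.group("path").lstrip("/").split("/", 1)[0]
        hits[(section, m.group("status"))] += 1
    return hits

sample = [
    '66.249.66.1 - - [07/Nov/2019] "GET /flights/london-to-nyc HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '66.249.66.1 - - [07/Nov/2019] "GET /hotels/old-page HTTP/1.1" 404 0 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '10.0.0.2 - - [07/Nov/2019] "GET /flights/london-to-nyc HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]
print(crawl_stats_by_section(sample))
```

Run over real logs, a table like this quickly shows which sections (and which old, broken pages) Googlebot is spending its crawl budget on.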

I think another one that's key is your ability to do A/B testing. Traditional A/B testing splits user behaviour - you've got one page, or maybe a set of pages, and you send the user to a different version of that. SEO A/B testing is quite different; I call it horizontal A/B testing, and you can only really do it well if you've got a very big website. Let's say, for example, you want to test out a new piece of content or some new functionality on your 'flights to city' landing pages. You can split those up - if you know you've got a hundred thousand of those pages, you can say, okay, look, I'm going to randomly assign this version of the meta description, or the title tag, or the boilerplate content, or whatever else it is - and then you need to be able to compare that. Ultimately what you're looking for is an increase in traffic on the version that you're testing, but the only way you can really do that is across a very large set of pages.

So if you've got a hundred thousand pages, maybe you're going to run it on twenty thousand of them, or fifty thousand, whatever - but you're going to do it at the template level, and then you're going to measure the performance at the template level. That, I think, is probably the most important one, because then you can approach SEO in a much more scientific way. You can properly assess: what is this feature on this set of landing pages? What does it actually mean? What does it actually do? TripAdvisor were probably, I would say, the most advanced at that - they were doing A/B testing in SEO over ten years ago - and so there are some features you'll find on TripAdvisor that look a bit unusual and a bit weird, and it's because they've tested them and found that they work better for SEO.
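The mechanics of horizontal A/B testing can be sketched simply: bucket pages of one template into control and variant groups deterministically (so the split is stable between runs), ship the change only to the variant pages, then compare organic traffic between buckets. This is a hypothetical setup, not Expedia's or TripAdvisor's actual tooling; the test name and URLs are made up.

```python
import hashlib
from statistics import mean

def assign_group(url, test_name="title-tag-test", variant_share=0.5):
    """Deterministically bucket a URL into 'variant' or 'control'.

    Hashing url+test_name keeps the assignment stable between runs
    and independent between concurrent tests.
    """
    digest = hashlib.sha256(f"{test_name}:{url}".encode()).digest()
    return "variant" if digest[0] / 255 < variant_share else "control"

def uplift(traffic_by_url, groups):
    """Difference in mean organic sessions per page between the buckets."""
    control = [v for u, v in traffic_by_url.items() if groups[u] == "control"]
    variant = [v for u, v in traffic_by_url.items() if groups[u] == "variant"]
    return mean(variant) - mean(control)
```

In practice you would compare the buckets over time and check the difference for statistical significance rather than reading a single mean, but the core idea - pages, not users, are randomised - is as above.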

MC: So it's maybe actually using some of these tools but using them in a different way, in a more targeted way. But it's interesting you should mention templates - I don't know if you've had a chance to look at the new Speed report that came out in Google Search Console a few days ago?

AS: Not properly.

MC: No, I've just briefly looked at it, and one of the things I noticed is that they're rating pages by slow, medium and fast, but they're actually grouping them by page template - they're trying to identify it - which I think is kind of the approach you're suggesting there: if you have a fundamental technical issue, it probably applies to most of the pages with that template. So that was a nice step forward, because I know the new version of Search Console hasn't been as popular as the old version, as some of the data is a bit more opaque than it used to be, but that was a good step forward, I thought.

So, while we're still talking about differences between big site SEO and small site SEO, one thing I know we've spoken about a few times, and I'm a big proponent of, is the importance of internal linking on these sites. How much more important is internal linking for big sites? Link acquisition is always at the top of the list of what people want from SEO and it's a really difficult thing to do, whereas internal linking is sometimes overlooked, I think, even on small or medium sized sites - what are your thoughts about internal linking?

AS: Yeah, it's hugely important! I would go as far as to say that if you get the internal linking right on a really well-established brand, it can greatly reduce the amount of link acquisition you actually need to do, because what it really means is you're taking that domain authority that's there and using it way more effectively.

So at Expedia - and actually I've done this at CheapFlights and, to an extent, at eDreams as well - we approached internal linking in a different way. The way we looked at it, we said: there's 1.2 million pages here, and we first of all want to make sure they're linked to correctly, and that might be through the main navigation - as long as you've got a logical navigation that goes through the different facets of pages, that's great. But what we actually did at Expedia is, we imagined that we had a sitemap, cut that sitemap up into pieces and placed it across the pages, so we had a batch of internal links that would sit across every single page. And we did it in quite a smart way: we came up with an algorithm that would match demand to internal links. We would effectively say, okay, what is the search volume of 'flights to New York' compared to 'hotels in Timbuktu'? We had an algorithm and a system that would effectively increase the number of internal links based on the demand for that particular keyword.

Now, we're talking years ago that we did this, but I still think the logic stands to reason. You're telling Google: hey, this page is more important - it's more important because there's greater demand for it, therefore there should be more internal links pointing to it.
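The demand-matching idea can be illustrated with a toy allocation: give each page a share of a fixed sitewide internal-link budget in proportion to its search volume, with a floor so every page gets at least one link. This is only a sketch of the logic Andrew describes, not the Expedia algorithm; the pages, volumes, and budget below are invented.

```python
def allocate_internal_links(search_volume, total_links):
    """Split a sitewide internal-link budget across pages in proportion
    to search demand, with a floor of one link per page."""
    total_demand = sum(search_volume.values())
    return {
        page: max(1, round(total_links * vol / total_demand))
        for page, vol in search_volume.items()
    }

demand = {
    "/flights/new-york": 90_000,   # head term: gets most of the links
    "/flights/paris": 9_000,       # torso
    "/hotels/timbuktu": 1_000,     # long tail: still gets a couple
}
print(allocate_internal_links(demand, total_links=200))
```

A production version would also decide *which* pages host each link and cap links per page, but the proportional split is the core of "more demand, more internal links".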

There are other sites that do this quite well - I think HomeToGo is a pretty good example of the way they approach internal linking; they've got lots of internal linking happening on their destination pages - and TripAdvisor built a technology for this after we got it right at Expedia. So yeah, it's hugely important, but I think you can think of it this way: if you've got a page in your information architecture, don't necessarily think to yourself that this is a crap page, a page that is quite weak in content, quite poor. We tried to think of it as: okay, what does that page already do? Does it already get some traffic? Yes, it does. Okay, well then, that can contribute to something else on the site, so what else can it link to? So for us, rather than cut down pages and cut down the content that sat on the site, it was: okay, how do we leverage it better? There are always going to be weak pages on a big site, you're always going to have some thin content, so it's thinking about how you improve the pages themselves, higher up in the hierarchy. How do you improve the content? How do you improve the template?

So it's like the minimum viable page: make sure it's decent enough to be in the index, so it's justifiable - so that if it ranks and you get traffic to it, you're not looking at it thinking, my god, why is this page ranking? And then that gives you the opportunity to take that page and link elsewhere with it.

MC: You touched on something very briefly there, and I think I picked up on it - the current trend of pruning pages. What are your thoughts on that? For those who don't know: with some of the recent updates - I think you could put them under the umbrella of Google quality updates - there's been a lot of advice going around about essentially just binning pages that aren't getting traffic or aren't performing. I've personally seen some very bad examples of this, where all that needed to happen, in my opinion, was that they could have just improved the pages. But you kind of touched on that - you were saying that rather than cutting the number of pages, you were looking at changing them?

AS: Yeah, I'm not saying that you shouldn't do that - I think there are some examples where it absolutely makes sense - however I'm more of the school of thought where you're thinking: there's an opportunity out there, and how do I get a slice of that opportunity? Potentially I'm going to cut off that opportunity if I remove a bunch of pages which may be capturing some of it.

So I'd much rather think about how to actually improve these pages. Once you've assessed it, you say: okay, look, does it make more sense to invest my time in improving the content for these pages over here, which have already got lots more demand, versus investing a bit in these pages over here, which maybe have smaller demand? So it's longer tail stuff versus torso or head, right? Now, you're going to come to a decision based on analysis and sound judgment of what the opportunity is and whether it's worth doing. Maybe you will come to the decision to cut, but I think you can also ask yourself: is this content so bad, are these pages so bad, that they're going to have a negative overall effect on the site?

My personal opinion is that a page needs to be really bad for that to be the case. It needs to have really bad loading time and very, very poor content. If you're in that situation, fine, maybe it doesn't make sense to keep them, but I think you'd be surprised at what you still see in the index. If you go a little bit deeper than the first couple of pages, you still start to see some quite poor stuff, and so the question is: are those pages really completely damaging the site? If you're still getting traffic to them and you can find examples of them still ranking on page one, especially in the top five positions, then for me I don't challenge that, and maybe it's not really holding the site back.

MC: So that was a much better answer than 'it depends'. So if we back up a little bit and talk again about big sites versus small sites, a general question: SEO has obviously changed a lot over the last several years - how much do you think it's changed for big sites compared to small sites? I'm asking specifically because I've seen a lot of people bemoaning that 'Google only ranks brands now', that it highly favours brands, and I think part of that has to do with the focus we've seen on quality over quantity in terms of links, which are, by definition I think, easier for big brands to get than smaller brands. I'd be interested in your thoughts on how much of this entity recognition stuff comes in with Google, where they start to understand 'okay, this name is a brand and it's mentioned here', how much those factors weigh, and how it impacts what you do strategically now compared to what you used to do?

AS: Yeah, okay - for me, I think it's Google's understanding of content that's probably changed the most, and yes, for the domains which are already really strong, it's very hard in an index which is so heavily dependent on links to shift that, right? Yes, you can reward content, and absolutely I think Google is rewarding content, but quite often what you see as well is that the biggest companies have good enough content. So are they rewarding the content or rewarding the domain? Well, it's a bit of both.

I think one thing that's changed is the importance of the quality of that content, and actually what you find is that a lot of the sites that perform better are the ones that have really good UGC - they're really leveraging that UGC, and that in itself is completely helping the amount of content that you've got. But I will say - and this is kind of similar to, and relates to, the last question about culling pages - there might be a situation where you think it's just going to be too hard to actually put meaningful content around a particular subject on your website, so maybe it makes sense to cull it. The difference is that generating useful, meaningful content across a very large set of pages is much, much harder than it used to be. It used to be that you could throw up a set of pages and probably just pull in some generic content, stuffed with keywords, that kind of made sense - if your user read it, they wouldn't take any offence to it, but it would be a really similar bit of content everywhere.

MC: That was the bar for content - that they didn't take offence.

AS: Exactly, but that's kind of how it used to be. So you could take a big website, especially one that had a half-decent link profile, get that content indexed and get it ranking. Now it's very different - it's the depth of content that's required. It's effectively like asking yourself: how can I do skyscraper content across 5,000 pages? Even if you've got a website which is a million pages, there are still going to be 5,000 pages which are super, super important. It's like, okay, how am I really going to get very, very good content here? And that, for me, is what's changed the most. Before, those five thousand pages would have had decent content, but now the bar for that content is just getting higher and higher and higher. With the larger websites, I think you'll see this in the work they're doing around trying to trigger featured snippets and building better content around that sort of questions-and-answers focus.

So those bigger websites are putting that in place and you see them triggering more and more featured snippets, and I think that generation of content is still happening, it's just happening in a smarter way. It's like: how can I leverage data to generate content about all of these different things on my website? I'll use travel as an example again - how can I take flight information data and generate useful content from it that users are actually going to find valuable? How do you do that, and then how do you add in the more curated, really deep content as well?
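The data-to-text idea Andrew describes can be shown with a toy renderer: take a structured record about a route and turn it into a readable snippet. Real systems vary the wording and aggregate far more signals; the field names and figures below are entirely made up for illustration.

```python
def describe_route(route):
    """Render one flight-data record as a short, readable content snippet.

    Optional fields (like cheapest_month) only appear when the data exists,
    so pages with richer data get richer content.
    """
    parts = [
        f"There are around {route['weekly_flights']} flights a week "
        f"from {route['origin']} to {route['destination']}."
    ]
    if route.get("cheapest_month"):
        parts.append(f"Fares are typically lowest in {route['cheapest_month']}.")
    parts.append(f"The average flight time is {route['duration_hours']} hours.")
    return " ".join(parts)

route = {
    "origin": "London",
    "destination": "New York",
    "weekly_flights": 112,
    "cheapest_month": "February",
    "duration_hours": 8,
}
print(describe_route(route))
```

Run across every route record, a template like this generates a unique, factually grounded paragraph per page - the scalable base layer on top of which the hand-curated, deep content sits.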

MC: Okay, so I have a question on this for you which, to stay fashionable, I can relate to BERT - I have to get at least one question in about BERT or it wouldn't be an SEO conversation right now. We spoke about this the episode before last, so if you missed that, the canned version - and feel free to chip in if you think I've gone off-piste with this - the canned version of BERT is basically that Google will understand the intent behind user searches better, by understanding the order of the words.

So it maybe opens up another avenue for sites. The question I have for you is this - we spoke before this podcast very briefly, bit naughty of me, I should have started recording really if we were going to talk about it - I spoke to you about BERT and I was saying, as I did on the episode before last, that maybe if Google is understanding the user query better, it presents an opportunity for smaller sites to rank. The big sites may have ranked a single, more generic page for multiple terms through their link profiles and their Domain Authority, for want of a better word, but maybe now, if Google decides that a less important page on another site matches the query better, that page would rank better. The discussion we had on that was - you made the very quick and correct counterpoint that these big sites normally have that spread of content as well, because, like you said, there's a huge amount of time, effort and money going into producing it. But it does strike me that one of the issues here is that if there's this greater focus on content and on quality of content, that's an incredibly hard thing to scale, because at the moment it's still basically a human thing to do really well. Unless it's very data-driven stuff, it's very hard to generate really good content.

So I guess my question is: do you think BERT and similar updates like RankBrain maybe give smaller sites an opportunity? If I see a big site has got a whole set of pages that are generated, put together and pretty useful, and I think, okay, I'm going to get these experts in and really smash it and just completely trounce them on content quality - picking my battles with certain pages - do you think the new algorithms might give them a fighting chance to rank, or do you think we're still going to see more brands dominating search?

AS: I really hope it's that way, but I don't think it will be, because I think you'll find that the larger websites - take Amazon as an example - Amazon have got so many pages which are effectively going to answer a query in a slightly different way. It's like, why wouldn't Google just rank another Amazon page over the first Amazon page that they were ranking anyway?

I think the other thing to consider, when we're talking about human content, is that the bigger companies have the money to invest in a huge amount of content. We were using lots of different companies for content at the different places I've worked at, and we had the resources - well, the money - to buy that content. If we want at least, I don't know, 1,500 words of decent content across those five thousand pages, then as long as we've got a really good brief for it - we know exactly what content it's going to be, we've done the research, we've really thought about it - we can farm it out to a crowdsourcing provider and get that content. We can also supplement it with lots of different datasets. And I think this is the challenge: the online world has sort of shifted from being people that were experts in a sector who had a website, to now being more like tech companies who service a particular sector. If you're in travel, you're not just competing with other travel companies, you're competing with tech companies that do travel, and I think that's the challenge.

So around automation of content - I think we're going to see more and more advancement there. We know that Google is understanding content better, so yes, we're always a few steps behind Google, but we can leverage the same technology to start generating content. I think there are lots of news providers doing a pretty good job of this - Reuters generate a huge amount of content, and is it relevant? Well, yeah, it is.

MC: That always comes as a huge shock to a lot of people, I think, when you actually show them that some of the breaking news stories - things like sports results or earthquakes, where there are these APIs where they can get the information instantly - are written and published instantly by robots, because we know the timing of news articles is really important; it's the first one to publish that wins. I don't think many people are aware that a not insignificant proportion of breaking news stories, especially, has had basically no human interaction - it's just been put together from data and published.

AS: Yeah, exactly - and why won't this advance more? I don't see any reason why it won't. I think we'll get to the point where it becomes a much more commonplace technology, so more and more companies will be using it. And yes, I'm sure Google is going to get smarter about differentiating between real content and generated content, but if you're doing a good enough job of mashing it together, why does it matter?

MC: I don't think we should go into it too far, but the thread interests me, in that if we're getting machines to generate this content then they're all going to be using similar or the same algorithms and training sets, and it comes to the point where they'll all be writing basically the same content anyway, so there will only need to be one source for that kind of content. But we'll shelve that now, because that's a long conversation.

I think we're over half an hour now already, so let's wrap up with something a bit more juicy. In your opinion - and I'm not asking you to name names, just your general opinion from experience in 2019 - how much black hat SEO and negative SEO is still taking place with big sites, and what form does it take? Is it still link acquisition? Is it PBNs? Is it technical stuff that's happening?

AS: Yeah I think it is.

MC: Andrew is sweating, you can't see this but he's broken out in sweats.

AS: Oh come on, you know more about black hat than I do, some of the things you were doing back in the day! So I would say yes, there is still a lot of black hat stuff happening. Look, you know, people can - I wouldn't even really class PBNs as black hat.

MC: Well, Google would disagree.

AS: Yeah I'm sure Google would disagree.

MC: Do you think - and I think this is an interesting question - if we define black hat stuff for now as 'Google doesn't want you to do that and would penalise you for it if you explained it to them', do you think a lot of it is happening through intention or through ignorance? Through people just saying, 'well, that's going to get us results, just do it', without understanding the implications? Or, when it is happening, is it people saying, 'we're going to take that risk and minimise it, because the gains are there'?

AS: No, I think it's intentional. I think maybe there's a couple of things happening. One, I would say some of the focus has shifted away from SEO towards social, really trying to exploit social media in a different way - a lot of great minds that were working on black hat SEO have focused a bit more on social media - but I would say there are still some things happening that are weird.

One thing that I've spotted, and I haven't really spent a lot of time digging into it, is that a lot of websites, when you start looking at them in Ahrefs, have a huge number of links where just a full stop is the anchor text, and it seems to be related to image search as far as I can tell. Basically what you'll find is that their images are being ripped off and placed on a page that has mashed up a whole bunch of images, and the way of linking back to the source of the image is the full stop at the end of a sentence below the image. I see this happening on more and more websites: you look at the link profile and Ahrefs shows you that, say, 20% of the anchor text is just a full stop. It's super strange.
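The full-stop-anchor pattern described here is easy to check for by counting anchor-text frequency in a backlink export. A minimal sketch, assuming a hypothetical CSV export with an `anchor` column; the column names and sample data are illustrative, not Ahrefs' actual export format:

```python
from collections import Counter
import csv
import io

# Hypothetical backlink export: one row per link, with an "anchor"
# column. Real exports will have more columns and different names.
sample_export = """anchor,url_from
.,https://example-scraper.com/page1
best flights,https://blog.example.com/post
.,https://another-scraper.net/mashup
.,https://images-mashup.org/a
cheap hotels,https://travel.example.org/deals
"""

def anchor_distribution(csv_text):
    """Return each anchor text's share of total backlinks."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    counts = Counter(row["anchor"] for row in rows)
    total = len(rows)
    return {anchor: count / total for anchor, count in counts.items()}

dist = anchor_distribution(sample_export)
# A suspiciously high share of "." anchors can flag the
# image-mashup linking pattern described above.
print(f"full-stop anchors: {dist.get('.', 0):.0%}")  # → 60%
```

In practice you would run this over a full export and eyeball any anchor text that accounts for an implausibly large share of the profile, not just the full stop.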

So I think there's something going on there that I'd like to spend a bit more time looking at. In terms of negative SEO, well, there are some things that - I don't know if I should talk about this, Mark - but there's something we did that was kind of negative SEO, a reputation management thing we did with 301 redirects ages ago. That was, I think, for quite a legitimate reason, because there was a very old news story that started popping up in the top five search results for a brand. We generated a bunch of redirects, turned them on and off, created a weird-looking link profile to the page, and the page then dropped and tanked. Why wouldn't that work now? I think maybe these things are still happening.

In my experience, I've had a couple of things happen to me that I'd be surprised didn't still work. One: when I was at Cheapflights, a whole bunch of paid links started to be pointed at us that had absolutely nothing to do with us, and in the end, to resolve it, I actually contacted Fili and Kaspar and they got in contact with Google on our behalf, because there was no manual penalty for it, but we lost I think 10 to 15% of traffic after these links came about.

MC: Was this pre disavow?

AS: It was around about the time of disavow, and this kind of paid-link footprint got created through these links. Because it had nothing to do with us, it was obviously a bit harder to get them removed. And yeah, after following this up through the back doors into Google, it got resolved, and then afterwards we had to disavow a whole bunch of stuff.

Another example: when I worked for a different company, in retail, they had widgets, and this goes back to your question of whether it's through no fault of their own or intentional. This one was unintentional. They had lots of review content that was being hosted externally on other websites, so it was their content being shown next to products on other websites, and one particular website decided to hide the links that referred back to that brand - they just made them hidden links - and the brand lost 30% of their traffic overnight. It took a couple of weeks to figure out what the heck had happened, and once we asked them to put those links back in place and make them not hidden, the traffic came back immediately. But there was no penalty, there was nothing in Google Search Console to say you'd been penalised; it was just a kind of weird algorithmic change that happened as a result of those links.

The other thing is that these services are for sale, and I don't think there would be websites dedicated to negative SEO if it didn't do something, right? So if you want to go and buy negative SEO, you can find a whole bunch of websites that will do that for you. Maybe they're all just con artists and it doesn't work, but you know.

MC: Yeah you're not exactly gonna complain are you? Who are you going to file your complaint to?

AS: Exactly, but I think it does still work. And when we look at PBNs, for me, yes, okay, it's against Google's guidelines, but the trouble is that it represents a really good return on investment. I think the problem is that because it works, it's going to continue to be an issue, and we're probably not going to see this kind of level playing field where content is much more rewarded until Google's understanding of content becomes even better. So maybe in that kind of utopian world we might get to, it will just be down to Google's understanding of content, meaning they don't have to rely on link signals quite so much, and then who knows, we'll be doing some sort of clever black hat content things.

MC: Andrew, thank you so much for your thoughts, opinions and experience. I look forward to hearing your talk this evening at Search Norwich, and we will probably put the video of Andrew's talk up with the podcast. You can get all the notes from this episode at search.with That's everything for this episode. We will be back next Monday, which will be the 18th of November, and I hope you have a great week!
