Candour

Episode 42: Avoiding SEO disasters with Steven van Vessum

What's in this episode?

In this episode you will hear a SearchNorwich recording of Steven van Vessum giving his talk "SEO disasters: the good, the bad and the taboo". Steven explores some real-life SEO disasters and talks about how to put in place processes to both monitor and deal with issues before they become big problems.

Full video:

Slides

Transcription

MC: Welcome to Episode 42 of the Search with Candour podcast, recorded on Sunday the 5th of January 2020. My name is Mark Williams-Cook and it's our first episode of the new year, so Happy New Year and welcome back to those of you returning to work today.

In this episode, we are going to be talking about SEO disasters and how to prevent them. To do this, you'll be hearing a SearchNorwich talk by Steven van Vessum, who is the co-founder of ContentKing.

I actually hadn't met Steven before he came to SearchNorwich, and I'm really glad I did; apart from being a really nice guy, I'm actually now a ContentKing customer. I won't give you any spiel about it, but for those who've never heard of it, it's a real-time website monitoring tool with a focus on SEO, and it's definitely one of the slickest tools I've seen. As I said, I've signed up, and I'm pretty picky nowadays with the tools I use. His talk will take you through some real-world examples of SEO disasters he's encountered. Before Steven co-founded ContentKing he was working at an SEO agency, so he's seen a lot, and he goes through how to put processes in place to monitor for and prevent these disasters.

In fairness, it's an area a lot of SEOs are weak in. By that I mean people will do competent audits and get changes implemented, but they fail to have a system in place for when a client moves a page, or the development agency pushes an update without informing you.

It's a great talk, and we have another SearchNorwich event lined up already for next Wednesday, the 15th of January. I'm hugely excited about this: we have Aleyda Solis coming to talk about marketplace SEO, that's getting your products to rank in all the other places apart from Google. Hannah Rampton is also coming; if you haven't heard of Hannah before, you're missing out. Hannah has produced some incredible SEO tools based on Google Sheets and she'll be talking you through those. It's a free event on Wednesday the 15th and you can register at searchnorwich.org. Here's Steven.

SV: We're going to be talking about SEO disasters. Basically all of the things that can go wrong when you're doing SEO.

First of all, when I go to meetups or conferences and sit in a talk, I always ask myself, why should I listen to this person? So I'll try to convince you. But to get started, you may notice I have a bit of a weird accent; it's a blend of Dutch, British English and Canadian English from a dark past, so humour me. Who am I? I'm the VP of Community at ContentKing. That basically means I'm involved in everything that has to do with the community: I do a lot of writing, promoting and talks like these. I used to run an agency in the Netherlands. I did that for six years, and at some point we got fed up with it, because we saw that websites kept on breaking, clients were deleting content, stuff happens. We would find out too late, but since we were the ones responsible for SEO, at the end of the month we'd make the monthly report and the client would ask, hey guys, what happened? And we couldn't really get away with, oh yeah, but you deleted these important pages a couple of weeks ago. They would counter with, okay, but why are you only telling me now? You should have told me right away. And that's the kind of discussion you're never gonna win, because even if you win the discussion, the client isn't gonna be happy.

So I vented to my business partner Vincent, and he said, Steve, I know how to fix this. Then I was sick for a couple of days, and when I came back to the office he had built a prototype of what ContentKing is right now. I'll tell you a little bit more about that. I love to read and write and talk about SEO. I do that at Search Engine Journal, a site you may know, Content Marketing Institute and CMSWire. I spend the majority of my time writing for our own Academy; it's basically where I write about a bunch of SEO topics.

So two weeks ago I published a guide on Magento 2 - how to get it in shape. That's basically how I spend most of my time: headphones on, doing research, trying to break stuff and basically describing how not to do it. I'm involved in organising SEO meetups as well. I organise the biggest SEO meetup in the Netherlands, which is a lot of fun. We're gonna have one in a couple of weeks, so if you like the Netherlands and you like SEO, come down and hit me up for a ticket.

So without further ado, ContentKing: it's a real-time SEO auditing and change tracking platform, which is a lot of difficult words. What we basically do is keep track of websites 24/7; we look for issues, we look for changes. We know from our experience what impacts your visibility in organic search, and when important stuff changes or issues get introduced, we alert you. So it's basically a watchdog.

So why SEO disasters? I have an interest - some people call it a fetish, I like to call it an interest - in SEO disasters, because I think it's fascinating to see why certain things go wrong, how they happen, how they could have been prevented, and so on. So I love to hear and talk about them. It helps me to become a better SEO, but it also enables me to teach others about the things I've seen and heard, and to basically share all of that, so hopefully other people won't have to make the same mistakes. I'll settle for at least a couple.

So when something really goes wrong, like an SEO disaster - a drop in traffic, rankings dropping - I always look at what happened and what the results were, why it happened and how it could have been prevented, because if you don't ask yourself that last question, you're gonna run into the same issue over and over again. What's really important to mention is who's to blame for stuff like this - it's always the damn developers.

Laughter.

No, I'm kidding, that's actually not true, because SEOs love to blame developers for things, but often it comes down to a mix of responsibility. The developer could say, hey, but you should have trained me; why didn't you have a team meeting before the website migration where you explained all of the important stuff we had to look out for when doing a migration? So when it comes to SEO disasters, there are always multiple people involved; there's rarely just one person or team that's to blame, so that's really important.

Take responsibility. And a note about SEO disasters: I'm sure you've seen it, you go on social media and people are posting their SISTRIX charts where traffic is going up, and it's always going up. No one's posting screenshots of their own data where traffic is going down; no one's really talking about all of the things that went wrong. So I'm trying to break the taboo.

So the first example, the first SEO disaster I want to discuss with you, is one I initiated myself. When we were running our agency, it was Friday afternoon and a client wanted to do a launch. We knew it wasn't that smart, but it wasn't such a big deal in our opinion, so we went ahead with it anyway. We were dealing with a multilingual site - an English, Dutch and German site. The Dutch site was the most important, and they were migrating from HTTP to HTTPS, and we did that on a Friday afternoon. I see people shaking their heads because it's a silly idea, but the client was pressuring us and we were like, we've done this a bunch of times, what could go wrong? That's usually how these stories start.

So we accidentally - I actually asked our intern to handle this, it's not that hard, just do it - and he did his job and said, hey guys, it's 4pm, let's open up a couple of beers and celebrate the weekend, so we did that. But we had accidentally redirected the German and the English sites to the Dutch site, so the English and German sites weren't accessible and users couldn't reach them, and for search engines it looked like these three sites had been merged into one and the English and German sites were just gone. From an SEO point of view, this is really, really bad. And as we were celebrating the start of the weekend, our client rang us: hey guys, I'm trying to look up the English site and it keeps sending me to the Dutch one, what happened? Those are always the dreadful calls that you wish never happened, because, as we say in Dutch, it looks like you're standing there in your singlet - you just look like a fool, because this is the kind of stuff you are responsible for. So we fixed it within three hours and luckily it didn't have noticeable effects on rankings and traffic.

But nonetheless it scared us, and we had to rethink the way we were doing things. One of the things we obviously had to re-evaluate was where we would cave under pressure from clients saying, hey, I want to launch on Friday afternoon. So how could this have been prevented? Having a better test process would have definitely made a difference; it would have been very easy to see that the English and German sites were redirected to the Dutch one, so testing for sure. Monitoring for stuff like this as well - this is one of the examples we always talk about when we're discussing why we created ContentKing. If you have software running in the background that looks after your sites, you don't really have to worry about this kind of thing. Sure, you need to test, you need to have proper processes, you need to talk to your colleagues and train them well, but at the end of the day there's more focus for actual work that can be done by humans, and less repetitive, boring work. Not rushing important updates, too: what we did was say that from Monday to Wednesday afternoon we're going to be doing releases, and we're not going to release anything on Thursday or Friday, because if we release on Wednesday and something goes wrong, we have Thursday and Friday to fix it, and that's usually fine. So this was our SEO disaster.
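To make the testing point concrete, here's a minimal sketch of the kind of post-release redirect check that would have caught this, assuming hypothetical hostnames (example.nl, example.com, example.de) and using Python's requests library:

```python
# Post-release redirect check: every language site should keep its own
# hostname after an HTTP -> HTTPS migration; only the scheme should change.
# The hostnames are hypothetical placeholders.
import requests
from urllib.parse import urlparse

EXPECTED = {
    "http://example.nl/": "example.nl",
    "http://example.com/": "example.com",
    "http://example.de/": "example.de",
}

for start_url, expected_host in EXPECTED.items():
    # Follow the redirect chain the way a browser or crawler would.
    response = requests.get(start_url, allow_redirects=True, timeout=10)
    final = urlparse(response.url)
    if final.hostname != expected_host:
        print(f"FAIL: {start_url} ended up on {response.url}")
    elif final.scheme != "https":
        print(f"FAIL: {start_url} did not end up on HTTPS")
    else:
        print(f"OK:   {start_url} -> {response.url}")
```

Running something like this right after flipping the switch would have flagged the English and German homepages landing on the Dutch host within seconds.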

The next one up was brought to my attention by Jenny Halasz. She was called in by a client who had launched a press section on their website - basically adding a new section to the site. When you're developing a new section of a site or new functionality, you do that in a staging environment, and once it's approved you push it to the production environment; it's basically the default release cycle.

So they launched the section, sent out a press release, and it was actually very, very successful - they got a lot of links to it, so that was great. But the press section wasn't getting indexed and they were scratching their heads, like, hey, what's going on? So they called Jenny and said, hey, could you take a look at this? It turned out that the production environment had a canonical pointing to the staging environment, meaning that from a search engine's point of view, the production pages were being canonicalised to the staging pages. But the staging pages weren't accessible to anyone - not to users, not to search engines - so all of that authority and those potential rankings were just flushed down the toilet, because search engines couldn't reach it, they couldn't understand it, and that was the end of it.

So how could this have been avoided? One of the things that could really have saved this was education on best practices - and this is more of a best practice from a development point of view - you shouldn't hard-code your canonical URLs; you should always make them depend on the environment the site is on. That would have solved this; it would never have been an issue. But testing would have easily caught this as well - even just manual testing, or using some tool, it would have definitely stood out like a sore thumb. On-page SEO monitoring too: there are tools out there that will pick up new pages and evaluate them to say, hey, are these pages in good shape or do they require your attention? Oftentimes when new content is published there's something wrong with it - it could be a dead link, or it could be a canonical to a staging environment.
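As a concrete illustration of that kind of check, here's a minimal sketch that fetches production pages and verifies their canonical tags point at the production host; the URLs and hostnames are made up, and it uses requests and BeautifulSoup:

```python
# Canonical-tag check: flag any production page whose canonical points
# somewhere other than the production hostname (e.g. a staging server).
# URLs and hostnames are hypothetical placeholders.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urlparse

PRODUCTION_HOST = "www.example.com"
PAGES_TO_CHECK = [
    "https://www.example.com/press/",
    "https://www.example.com/press/launch-announcement/",
]

for url in PAGES_TO_CHECK:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    canonical = soup.find("link", rel="canonical")
    if canonical is None:
        print(f"WARN: {url} has no canonical tag")
    elif urlparse(canonical["href"]).hostname != PRODUCTION_HOST:
        # This is exactly the staging-canonical case from the talk.
        print(f"FAIL: {url} canonicalises to {canonical['href']}")
    else:
        print(f"OK:   {url}")
```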

The next one up is pagination gone wrong, and this person didn't want to go on the record, so that's why there's an anonymous face. It's a good one. The client wanted the pagination system on an e-commerce site to be more user-friendly and shiny, so what they did was introduce a very user-friendly JS-driven pagination system - it was relying on client-side JavaScript to work. It worked fine for users, but search engines still have issues with JavaScript. What happens is, a search engine crawler comes to the site and sees that a big portion of the site is relying on the execution of JavaScript to be able to function, so it stops what it's doing and sends the page to the rendering queue, which is really long, because rendering JavaScript is extremely expensive for search engines. If I had to guess, I would say that compared to going through a regular HTML page, rendering a JavaScript page is a hundred times more expensive in terms of server resources. So resources are limited. Google is getting pretty good at this, but in this particular case all of that JavaScript needed to be executed, and only after executing it and getting the full page with all of the links to the products could they follow those links, so that slowed down the whole crawling and indexing process. The result was that the rollout of this pagination system was a massive change to the internal link structure - the internal link structure is basically the sum of all the internal links within your site - and for an e-commerce site, pagination is oftentimes one of the few ways to reach products. So a lot of these product pages didn't really get a lot of attention anymore; they were getting crawled less and indexed less, and they were dropping from the index. And if you're not indexed, you're sure as sh#t not going to rank, so that's not a good thing.
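One cheap way to catch this class of problem is to look at what a crawler sees before any JavaScript runs. Here's a minimal sketch, assuming a hypothetical category URL and hypothetical link patterns ('/product/' and 'page='), that counts crawlable links in the raw HTML:

```python
# Fetch the raw HTML (no JavaScript execution, like a crawler's first
# pass) and count product and pagination links. The URL and the href
# patterns are hypothetical placeholders.
import requests
from bs4 import BeautifulSoup

CATEGORY_URL = "https://www.example.com/category/shoes/"

html = requests.get(CATEGORY_URL, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

# Plain <a href> links are all a crawler sees before rendering.
product_links = soup.select("a[href*='/product/']")
pagination_links = soup.select("a[href*='page=']")

print(f"Product links in raw HTML:    {len(product_links)}")
print(f"Pagination links in raw HTML: {len(pagination_links)}")

if not product_links and not pagination_links:
    # The rendered page may look fine, but the links only exist after
    # client-side JavaScript runs - the situation described in the talk.
    print("FAIL: no crawlable links found; pagination may be JS-only")
```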

So that's Googlebot, and that's the traffic going down over time. Again, how could this have been avoided? Education on best practices: it is widely known that search engines don't yet deal well with JavaScript, and it just takes more time for them to eat through all of that JS content and try to make sense of it. Search engines claim that they're really good at it, but the fact of the matter is that they're a long way from being really good at it. John Mueller from Google mentioned a couple of weeks ago that in some cases it could take up to three or four weeks for a new page that heavily relies on JavaScript to be crawled and indexed. And if you're pushing new content - say I publish an article and I want it to be indexed as soon as possible, ideally the same day or the next day - can you imagine having to wait a couple of weeks for your content to be indexed? That means you're not able to compete with your competition. So it's a matter of picking your battles, and this is definitely not one you want to pick.

There are very good ways to deal with this. You could, for instance, use server-side rendering or pre-rendering; there are all kinds of services out there. You can do a hybrid approach, where you give Google HTML and your users JavaScript, so there's no way for this to really go wrong - plenty of options. Proper testing would have definitely picked this up as well.
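To illustrate the hybrid idea, here's a minimal sketch in Python's Flask, where known crawlers get plain HTML with ordinary pagination links and everyone else gets the JS-driven version. The template names and the user-agent list are assumptions, and real bot detection would need to be more careful than this:

```python
# Hybrid rendering sketch: serve pre-rendered HTML to known crawlers
# and the JS-driven page to regular users. The templates and the
# crawler token list are hypothetical placeholders.
from flask import Flask, request, render_template

app = Flask(__name__)

CRAWLER_TOKENS = ("googlebot", "bingbot", "duckduckbot")

def is_crawler(user_agent: str) -> bool:
    ua = (user_agent or "").lower()
    return any(token in ua for token in CRAWLER_TOKENS)

@app.route("/category/<slug>/")
def category(slug):
    if is_crawler(request.headers.get("User-Agent", "")):
        # Full HTML with plain <a href> pagination links, no JS required.
        return render_template("category_static.html", slug=slug)
    # Regular users get the client-side, JS-paginated experience.
    return render_template("category_js.html", slug=slug)
```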

The fourth example was brought to my attention by Patrick Stox. He was working on an e-commerce site, and they had some JavaScript magic going on in the head section of the page. If you look at an HTML page, you have the head section, and it has stuff like the page title and meta description - basically the things you see in a search engine result page - plus a canonical, a meta robots tag (your preferred guidelines around crawling and indexing) and a bunch of other stuff, and what's in there is really, really important. Search engines look for these tags within the head and not in the body. What happened here was that because of the JavaScript code in the head, the head section was prematurely closed and some of its content was pushed down into the body, so the canonical ended up in the body. When Google comes across this, they'll just ignore it, because it's not according to the guidelines. For an e-commerce website this can have a really big impact, because there are so many ways to reach a certain product. For instance, if you're selling shoes, a shoe can be available in, I don't know, four or five colours, ten different sizes, maybe even male and female versions, so a product could have, I don't know, 30 or 40 product variants, and oftentimes they're canonicalised to the main product, because at the end of the day that's the product that's gonna get all of your internal links and that's the one you're ranking with.

In this case they basically created 30 or 40 duplicates per product, so from a duplicate content point of view this was a big disaster, because you just create more confusion. There's no such thing as a penalty that Google gives you for having duplicate content - that's a thing of the past - but when you're talking about giving search engines the right signals on how to crawl, index and understand your site, you want there to be no confusion, because SEO is hard enough as it is. So yeah, this wasn't really helping their case.

So the result was that Google ignored the canonical URL and they created a lot of duplicate content. Again, education on best practices would have definitely helped prevent this from happening, but testing as well, because if you were to manually check this you could miss it, but if you were to switch on any SEO tool that does some form of crawling, it would have definitely been picked up.
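Here's a minimal sketch of such a check, assuming a hypothetical product URL. It deliberately looks at the raw HTML by string position rather than using a forgiving parser, because parsers tend to silently repair a prematurely closed head:

```python
# Check that the canonical tag sits before </head> in the raw HTML.
# The URL is a hypothetical placeholder, and the regex is simplified
# (it assumes rel="canonical" appears before other attributes).
import re
import requests

URL = "https://www.example.com/product/blue-shoe/"
html = requests.get(URL, timeout=10).text

head_close = html.lower().find("</head>")
canonical = re.search(r"<link[^>]+rel=[\"']canonical[\"'][^>]*>", html, re.I)

if canonical is None:
    print("FAIL: no canonical tag found at all")
elif head_close != -1 and canonical.start() > head_close:
    # The tag exists, but it sits in the <body>, where search engines
    # ignore it - the situation described in the talk.
    print("FAIL: canonical tag appears after </head>")
else:
    print("OK: canonical tag is inside the <head>")
```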

The fifth and last SEO disaster is actually one that people bring on themselves. Who knows what the disavow tool is that Google has? Okay, a couple of hands, alright, cool. So Google allows you to basically say, I want to devalue these links, I don't trust them, don't count them towards my website's authority. So if you have some dodgy Russian links and you're not really happy with them and don't want to risk a potential penalty, you can say, okay, I want to disavow these. 99% of sites will never have to use the disavow tool, because Google has, for the last two years off the top of my head, been devaluing dodgy links for you. That's just how the internet works: there are lots of automated scripts that generate pages and generate links, and there's nothing you can really do about it, it just happens. So Google said, okay, we're going to devalue these links, and only in cases of deliberate link manipulation will we take action.
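For context, a disavow file is just a plain text file uploaded through Google Search Console, with one entry per line: full URLs disavow individual pages, lines starting with domain: disavow a whole domain, and lines starting with # are comments. A short illustrative example with made-up domains:

```
# Individual pages to disavow
http://spammy-directory.example/links/page-42.html
http://article-farm.example/cheap-links.html

# Entire domains to disavow
domain:dodgy-link-network.example
domain:spam-blog.example
```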

So there was a lot of SEO news going on about the disavow tool, and some very prominent SEO figures were promoting using it if a link looked even remotely dodgy. So this person went at it with a hatchet, and he or she cut away much more than they really should have. In reality, a lot of these dodgy links actually still add value, and if you cut away everything that looks remotely dodgy, you're often cutting away way too much. When it comes to disavowing, it's really tricky; you can easily tell Google to discredit much more than you really wanted to.

So how could this have been avoided? Well, critical thinking for one, but also going to meetups like these. When we've done our talks and you grab a drink, you can basically tell someone, hey, I'm dealing with something, can I get your opinion on this? I have some dodgy links, what do you think I should do? Are there any good articles about this topic you can refer me to? And it's not just in real life. There are plenty of Facebook groups and Slack groups. Hell, Mark here is running one - the SearchNorwich Slack workspace. It's really good; there are a lot of smart people in there discussing SEO, and it only takes you a minute to explain what's going on, and you'll often get good feedback.

If we're talking about minimising SEO disasters, what does it really come down to? Education on best practices, for sure. Involving SEO specialists early on, not when the sh#t has already hit the fan. For instance, yesterday I was talking to another SEO and he said, Steve, it happened to me again. I was like, what? Well, a client came to him and said, hey, we want to launch a website redesign, and we want to do it in December - and it's an e-commerce site and the holidays are really important to them. He said, so that's December next year, right? And they said, no, no, no, December 2019, and he was like, how do you think this is going to work out? That's the thing clients really need to work on, and even though you're really experienced at SEO, it takes some client education as well to say, hey, if you're thinking about rebranding, redesigning, changing domain, whatever, just let me know in advance and I can already advise whether it makes sense from an SEO or even a business point of view.

Proper testing: having test processes and people actually following them. A lot of people think, oh, I can do this, hold my beer, and just put it live, and that's it. And having monitoring systems in place to make sure that all of the stuff you didn't check, or that you missed, is caught and you get alerted about it.

To keep on learning, I wrote a couple of articles about SEO disasters, so if you're interested in this and want to see what funny stuff happened, check them out. As Mark said, this is being recorded and we'll share the slides on this website as well, so there's no need to write anything down and memorise it; you can just check them out later.

Don't let SEO disasters surprise you, and to help you with that I've put together an exclusive offer for SearchNorwich: if you want to take ContentKing for a spin, we can get you an extended trial. The default trial is two weeks; I can make it six weeks if you're interested. And that's it, thank you very much.

MC: I hope you enjoyed that talk. You can find the video of Steven's presentation, his slides, a transcription and links in the show notes at search.withcandour.co.uk. If you're near Norwich, don't forget to come down and meet us at the next SearchNorwich meetup on the 15th of January; you can register at searchnorwich.org. Search with Candour will be back next Monday, the 13th of January. Have a great week.
