Episode 68 - SEO site audits and Sitebulb with Patrick Hathaway

Play this episode:

Or get it on:

What's in this episode?

In this episode, you will hear Mark Williams-Cook talking to Patrick Hathaway, director at Sitebulb, about SEO site audits, technical SEO and what is to come in Sitebulb. Together they discuss: how Sitebulb came to be, Patrick's thoughts on technical SEO and Google as a source of truth, the benefits of cloud vs desktop tools, internal link optimisation, trends in technical SEO, and why the f*** Patrick swears so much.

Show notes

Sitebulb:

Will Critchlow talk on robots.txt parser:

Onely blog:

URL Profiler:


MC: Welcome to episode 68 of the Search with Candour podcast! Recorded on Friday the 3rd of July 2020. My name is Mark Williams-Cook and today I'm going to be joined by Patrick Hathaway, Director of Sitebulb and we're going to be talking about SEO audits and Sitebulb. Patrick, welcome.

PH: Hello.

MC: Thank you very much for joining us. I always like to start these podcasts where we have different guests on with the assumption that nobody has a clue who you are or what Sitebulb is, so do you want to give us a 30-second or so background of yourself and one line about what Sitebulb is?

PH: Sure thing. So I'm Patrick Hathaway, co-founder of Sitebulb, which is a product created by Gareth Brown and myself. Before Sitebulb we co-created a product called URL Profiler, which we actually sold last year, and before that I worked both agency side and in-house. In terms of the elevator pitch for Sitebulb: we had a new member join our team early this year, Geoff Kennedy, and almost the first thing he did - in his interview, in fact - was point out that our positioning was all over the place, and he was right. So he and I carried out a positioning exercise, pretty much the first thing he did; we spent a few weeks doing that, and we needed to write out elevator pitches of all sorts of different lengths.

So I'll read one out to you now. Sitebulb is a website auditing tool that provides comprehensive crawl analysis and presents recommendations that guide your audit workflow with intuitive next steps, from investigation to reporting. In-depth explanations of every issue help you fully understand the results and have confidence in your recommendations, which makes Sitebulb a great fit for SEOs who regularly need to do website audits. That's our elevator pitch right there, and the positioning process we did was actually super helpful, both from a messaging and a product perspective.

I think we'll talk later on about some of the new features we've got coming soon, but we've also had an overhaul of various areas of the website to make our positioning clearer and, in general, make it easier to find answers to your questions. I think that's now ready and should launch next week, so some of that stuff will actually be going live soon.

MC: Wow, that was really well prepared. I was expecting some off-the-cuff thing but you have a prepared positioning statement, so that's perfect.

PH: Well, yeah, I figured I needed to actually use it. We'd literally just made it and I'd never used it, so I was like, right, this is my opportunity.

MC: Your time to shine. So you know that I use Sitebulb, I'm a fan, and I use lots of different SEO tools, and I guess I'm interested in your thoughts on how Sitebulb came to be? Because obviously we've had many tools over the years, ranging from very old stuff like, you know, Xenu's Link Sleuth, through to Screaming Frog, which is obviously very popular. What made you go, ah, there's a gap there, there's something that's not being done?

PH: Well, it actually started as a feature for our first product, URL Profiler, which, for anybody that doesn't know that product, is a list-based data grabber. So in that tool you'd load a list of URLs in and then you'd say, right, I want to collect this bit of data, that bit of data, the other bit of data about the URLs, and it would go off and get it and kind of dump it into a spreadsheet. Essentially this meant we had to constantly say to our users, go and grab a list of links from Ahrefs or Majestic, or go and grab some crawl data from Screaming Frog. So our tool was only ever supplementary - you had to have something else in order to use it properly.

So with that, we figured, right, we can't exactly create our own link index - you know, a little two-man, self-funded team, we're not going to do that, we're not going to build Ahrefs - but we could theoretically build a crawler. So we started off on that route. Essentially it began with, we'll just go and get the URL data - that URL discovery - plus some basic data. Then as we were developing it we kept adding in things that we thought the other crawlers should be doing anyway. So all the stuff we wanted other crawler tools to do, we just added to our tool, and the more we did this, the more it dawned on us that to do this thing justice, it really needed to be its own product. That's how Sitebulb was born.

MC: Was that a challenge then? Actually building a crawler? Like I mean I've dabbled with various tools, you know building tools I've needed and I've always found these things sound simple, but actually when you start getting to the nuts and bolts of it, there's all kinds of edge cases and stuff that basically breaks it. Was it a fairly simple process to get the basic crawler stuff up and running or was that more challenging?

PH: Yeah, I mean essentially, as you say, getting the core 'will it crawl a website' part is not super difficult. I mean, Gareth is the one that had to do all of this, so it's easy for me to say, but getting it to do the basics, you know, it's not that difficult. Getting it to handle the edge cases is really where you need to do a lot of work, and that's where we needed a lot of help from beta testers, you know, finding these sites which it just wouldn't crawl. If you had asked Gareth - particularly back in the day, not so much anymore - his motto when we were building it was 'the internet is broken'. You have to do so much work fixing HTML, understanding what things are supposed to mean, because the HTML on most websites is just awful, and a lot of the work to make it robust and reliable is allowing it to figure out the things that people, or CMSs, are typically doing wrong, and fixing the things you need to fix so that you can still parse the HTML and grab the elements you want to grab. And that's the traditional crawling method, which is, you know, grab the HTML and parse it.
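Patrick's point about broken HTML can be illustrated with a small sketch (a hypothetical example, not Sitebulb's actual code): an event-based parser such as Python's stdlib `html.parser` can still pull links out of badly malformed markup, because it reacts to tags as it encounters them rather than requiring a well-formed document.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags, even in broken HTML."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Tag and attribute names arrive lowercased, so <A HREF=...> still matches
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Unclosed <p>, unclosed <a> tags, inconsistent casing - "the internet is broken"
broken = '<p>Unclosed paragraph <a href="/about">About<br><A HREF="/contact">Contact'
parser = LinkExtractor()
parser.feed(broken)
parser.close()
print(parser.links)  # ['/about', '/contact']
```

An XML-strict parser would reject this input outright; a crawler has to extract what it can from markup like this because so many real sites ship it.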

The other side of things was the JavaScript rendering. As we were building Sitebulb, initially we used a headless browser called PhantomJS, and right towards the end of our development process - I can't remember exactly the day, but it must have been a couple of months before we were planning to launch, which was September 2017 - Google actually released headless Chrome, and we were - Gareth in particular was - really, really keen to put that in. So I had to put the brakes on and say, we've had months and months of beta testing with PhantomJS, let's not just rip that out and replace it with something which hasn't been tested. So we had to pull that back, and version 2 basically was when we brought headless Chrome in, and that was massive, because it meant essentially that any website you have, Sitebulb can crawl. A lot of the work was making the tool work with headless Chrome, and work for all these different websites and all these different configurations and, you know, various different frameworks.

MC: I could see Gareth listening to this, fuming, as you've just said, ‘oh yeah it was fairly simple to get’.

PH: Yeah, dead easy.


MC: So one of the things that interests me, that I like about Sitebulb, is that you give these explanations - so you've found an issue, and you'll actually explain why it's an issue and attempt to prioritise it. Obviously, when it comes to actually doing audits - and I think my views are quite well known on this - I'm not a fan of the purely automated SEO audit tools, because they've got huge limitations, and even if you're prioritising stuff from a technical point of view, you need to know the internal capability and things before you can prioritise. But I'm interested: you've got all of these really detailed notes explaining an issue - this is why it's an issue. Why did you feel that was something you wanted to add? Because a lot of these other tools obviously just give you the data. They're just like, right, that's broken, this canonical's like this. But you've gone that extra step and said, okay, this is why this is a problem, and this is why it's a big problem or not such a big problem.

PH: I mean, it kind of starts from where we began. When we launched Sitebulb - you mentioned it earlier - we hadn't initially planned to build a crawler; we kind of pivoted in that direction. And when we were looking at it, we thought, okay, essentially we're now going to be a competitor to Screaming Frog, DeepCrawl, you know, OnCrawl, Botify - all those guys - how are we going to make ourselves any different to any of those? Around that time, probably in the few years previous to that, there'd been numerous copycat tools which, to all intents and purposes, were Screaming Frog again - the same basic Screaming Frog functionality, without all the bells and whistles, but the basic functionality and the same sort of UI. We literally had a philosophy of: we are not building another f#cking crawler, because nobody wants that. We didn't want to make something which does exactly the same as everything else, and we had this assumption that basically everybody's got Screaming Frog. I don't know if that's literally the situation, but it was an assumption we made.

MC: It feels like that sometimes.

PH: Yes, at least it's not far off, and we felt we needed to create a product that offered something fundamentally different to that. So we doubled down on the reporting elements, the data visualisation, and things like the hints, which allowed us to take our SEO knowledge and build that in as a layer of the product. Even though a lot of the explanations for the hints live on the website, we feel that's part of the product - it hooks into the product and it all fits together. We really were trying to build a website auditing tool and not a crawler. With all the tools in the crawler market, the core product is going to crawl the website and spit out data; we thought, we're going to crawl the website, grab the data, do a whole analysis on the data, and then spit that out instead. There's still an inbuilt assumption in the product that there's an SEO at the other end using it, and that's still a really important factor. Like you say, a tool can't do all the work for you - it still needs an SEO to interpret it, to understand what's important in the context of the specific website they're working on. So what we've built is something that tries to give the SEO as many tools as possible to help make those decisions.

MC: You mentioned building your SEO knowledge into Sitebulb, and, like you rightly say, having an SEO at the other end looking at what the tool is spitting out. My question, I guess, is: how much have you learnt on top of what you already knew about SEO while building this tool? I ask this because there was a question posted yesterday on Twitter by Gary from Google, asking what would happen in a given situation when he posted a robots.txt rule. By the time I checked it, it had a few hundred replies, and although it was a kind of binary yes-or-no type answer, the replies had fallen about 50/50 on each side of the fence as to what Googlebot would do given these rule sets. To me, I guess it's understandable in a way, because it's a really specific question, but on the other hand, that documentation is out there and it's just there for you to read. So if you're putting in this advice - and I know I've spoken to your team before about when we've hit edge cases with robots.txt - how much of it has been like, oh, that's a new thing? When you've had beta testers, and Google just drops an announcement on you.

PH: I mean, it happens. I wouldn't say it happens regularly, but it definitely happens where we've built something with our understanding of the guidelines. So for instance, with robots.txt in particular, we've essentially gone through all the guidelines and built our robots parser to validate against those guidelines. Then every so often a customer will come to you and say, look, this URL isn't crawling, and it should be crawling based on this rule, and then you'll look at it and go, well, okay, based on the guidelines Google shouldn't really be crawling that - but at some point they've made the decision that actually, in this case, we will allow it. So what actually happens in real life seems to not necessarily always match the guidelines. If you see Will Critchlow's SearchLove talk last year - whenever it would have been, I think October - he did a whole load of work and analysis on the differences between what happens in real life, what the guidelines say, and what the open-source robots.txt parser Google put out there does, and again, there are all sorts of inconsistencies between the tools. So at the moment I don't think there's an ultimate source of truth on this sort of stuff.

So I think the question posed by Gary is an interesting one, because for most people, essentially, you only need to know what you need to know for each specific website you come across, and if you come across a rule that you don't understand, the idea is that you go off and learn it. I think so much of SEO is learning on the job, and we've absolutely had to do that. What we get the benefit of, as a crawling tool, is that our users come to us with all these weird edge cases - you know, some of the sites that you've shown me - and we're just like, okay, that's not doing what we expected it to do, or what you think it should be doing. Then we have to go and look at it, and sometimes it's a case that we've put a rule in slightly wrong, and sometimes we just have to make a decision: if these URLs are being allowed by Google, then we'll also allow them. It's not hard and fast, there are lots of grey areas in it, and I think that's reflected in the community's responses to Gary's question.
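The kind of parser disagreement Patrick describes is easy to reproduce. Google's documented behaviour is that the most specific (longest) matching path wins, with Allow winning ties, whereas Python's stdlib `urllib.robotparser`, for example, applies the first matching rule in file order - so the two can give different answers for the same file. A sketch with a hypothetical robots.txt:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: a blanket Disallow followed by a more specific Allow
robots_txt = """\
User-agent: *
Disallow: /private/
Allow: /private/public-page.html
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Under Google's documented longest-match rule, /private/public-page.html would
# be crawlable (the Allow path is longer). CPython's parser instead applies the
# first rule that matches in file order, so it reports the page as blocked.
print(rp.can_fetch("*", "https://example.com/private/"))                  # False
print(rp.can_fetch("*", "https://example.com/private/public-page.html"))  # False
print(rp.can_fetch("*", "https://example.com/blog/"))                     # True
```

The same file, fed to two reasonable parsers, yields opposite answers for the middle URL - which is exactly why the community split roughly 50/50 on Gary's question.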

MC: Do you know if that talk from Will Critchlow from SearchLove is online at all?

PH: The presentation definitely is. He did a write-up as a blog post on the Distilled site sometime last year - you'll definitely be able to find that link, and I can send it to you afterwards - so there's definitely a write-up, and possibly the SlideShare link in there as well. I think most of the time the SearchLove videos go behind a paywall, so the video probably isn't available, but the talk I'm sure is.

MC: Cool, so we'll get the link up in the show notes if anyone wants to check out those slides. You mentioned some of the cloud crawlers there, like DeepCrawl. How do you see the market going? Because we've got various desktop-based tools - and a complaint I've seen about some of these tools is that they don't work on very large websites, because they use up memory and storage - and then you've got the cloud-based ones. Where do you see this going in the future? Do you think there's always going to be space for desktop tools? Do they do something fundamentally different in your mind?

PH: Yeah, I mean, I'm naturally biased, producing a desktop tool, but I personally think that most SEOs, for most of the work they need to do, will be totally fine with a desktop product. Most people, most of the time, aren't crawling websites bigger than ten thousand pages, and even when you've got a site bigger than 10,000 pages, you don't necessarily need a cloud product. Where the cloud products come into their own is when you have millions and millions of pages and you don't necessarily want your crawls running on your local machine.

So that is the big and obvious use case for the SaaS product, the cloud product: it can run in the background and isn't taking up your machine's resources at the time. That said, we have recently built some solutions to those problems within the software itself. So now you can do scheduled or recurring audits in Sitebulb, and some of our higher-end, more 'Enterprisey' type customers - we've got some with websites of, say, a million pages - are creating their own server in their own office, a dedicated computer which they spend a few grand on, and that essentially just sits there and runs all of their crawls. They load their projects on there, they set recurring ones - so if they've got a client that they need to crawl every week, or every month, or every day, they just set it on recurring and it bashes through them when it needs to - and then when they want the data, they grab it from there and do the analysis.

So that sort of solution handles a lot of the problems that the cloud products are solving. When we built Sitebulb, we were really aiming to offer something which was a bit more than what the likes of Screaming Frog can give you, at a cheaper price than what the OnCrawls and DeepCrawls and all those guys will give you. So we were aiming for that middle ground in between, where we felt that, at the time at least, there wasn't a scalable solution, and we in particular were keen to keep the price point accessible to essentially everybody. We always wanted our software to be accessible to anybody on the market, and in terms of where we've priced it, we believe we managed to do that. Again, it was a little bit of back and forth with the industry, trying to understand where that price point should be, because we are more expensive than Screaming Frog, but we're way, way cheaper than most of the cloud products.

The big advantage of a desktop tool over a cloud tool is that there are no limits whatsoever. You don't need to have a five-project limit or a three-domain limit or anything like that. With one of those cloud tools you get a set amount of credits - say, a hundred thousand URLs - and you can set a site crawling, have no idea how big it is, it explodes out of nowhere because of pagination or something, and you can burn through all your month's credits in one go and have nothing usable for your client. That's where the desktop tool is just far better, because you can do all of that exploratory stuff without being penalised for it. You can crawl it, end up with 100,000 pages, look at it the next day and go, oh sh#t, okay, I'm going to put some exclusions in, I'm not going to crawl the query parameters, all that sort of stuff, and then suddenly you've crawled it again - you know, maybe it's 10,000 pages this time - and you've actually got data you can use and do your audit with.

So I can't see a world in which desktop doesn't fit in the market at all. In particular, now that we're able to do things like implement headless Chrome, in terms of the features we can offer in the desktop software, it's at parity with, if not better than, all of the cloud products anyway.

MC: I think I'm biased anyway, probably because I'm a bit old, so I like desktop software. I use both, and I've actually had some of the problems you've described. On maybe some of the larger sites, what I'll do is portion out only a cross-section of that site to various cloud providers - I'm looking for kind of platform-wide issues, so I'm just taking a slice of different pages - and they can say, okay, you've potentially got an issue here, and that's what alerts me to maybe use a tool like Sitebulb to then go and crawl the whole site and see what the depth of that issue is - and again, do it instantly, on demand, and I've got the data there. So yeah, I completely understand that. And like you say, with lots of smaller sites as well, the cost of the bigger platforms doesn't always work.

PH: Yeah, I mean, the bigger guys, you see a lot of the time they end up aiming for enterprise, because they can get the big contracts, they can give account management and priority support and all that sort of stuff, and that's the sort of thing that we can't offer and don't want to offer - we want to offer a self-service product that's accessible to everybody. So all the real enterprise stuff, we leave to one side and go, we're not interested in that, and yeah, that's an area where the desktop product doesn't fit at all.

MC: So one of my favourite features in Sitebulb - I think you added it fairly recently - is this internal link equity flow, and I'm pleased you didn't call it internal link juice flow, I don't think that would've helped anyone. But do you want to just tell us a little bit about that feature, why you think it's important and why you added it?

PH: Yeah, I think it was just before lockdown, actually, that we released two changes to the way we do the link analysis stuff. One was bringing in this metric to measure the value of each page in terms of internal links, and the second thing we introduced was the Link Explorer. The reason these things are important is because internal link optimisation is a really powerful technique, and it's also one that you completely control. The website owner is in complete control of internal links, because you can't decide which external sites link to you - unless you're doing PBNs and that sort of stuff, but in general you can't decide where external links come from, which pages they link to, or which anchor texts they use - whereas you can control all of this with internal links.

So internal links themselves provide an opportunity to boost certain pages up, to consolidate link signals, or even to try to affect how search engines understand what a particular page is about. Even though it's a really good opportunity, the problem is that it's not straightforward to do, and it's not always that easy to see, for instance, if you've got, say, five pages that are all linked to using similar anchor text. So when we built the Link Explorer earlier this year, we were trying to make this process easier: it allows you to explore and filter the link relationships on the site. You can dig into anchor text and investigate how well the site is optimised from an internal linking perspective. Theoretically, you could do all this in Excel - just a dump of all the inlink relationships - but it doesn't scale very well, and doing the actual exploration is a real pain in the ass. That was the problem we tried to solve with all of the internal link stuff we added. Does that answer your question? I don't know if it does.

MC: Yeah it does, I just had my mic muted because my dog was barking. So yeah, I really like internal link optimisation - I think it gets missed on a lot of sites. And again, if you're listening, check out the show notes: we will put a link to an Authoritas webinar that we took part in, where we actually gave a little bit of a demo on how we used Sitebulb to help with internal link optimisation. It's a really, really nice easy win.
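Sitebulb's exact link equity formula isn't public, but the idea discussed above - scoring each page by the internal links pointing at it - is commonly modelled as an internal PageRank. A minimal sketch, illustrative only, with a made-up four-page site:

```python
def internal_pagerank(links, damping=0.85, iterations=50):
    """Power-iteration PageRank over an internal link graph.

    links: dict mapping each page URL to the list of pages it links to.
    Returns a dict of page -> score; scores sum to 1.
    """
    pages = set(links) | {t for targets in links.values() for t in targets}
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}
        # Pages with no outlinks ("dangling") share their rank evenly
        dangling = sum(rank[p] for p in pages if not links.get(p))
        for p in pages:
            new[p] += damping * dangling / n
        # Each page passes its damped rank equally along its outlinks
        for page, targets in links.items():
            if targets:
                share = damping * rank[page] / len(targets)
                for t in targets:
                    new[t] += share
        rank = new
    return rank

# Made-up site: the home page links out to, and is linked from, every other page
site = {
    "/": ["/about", "/blog", "/contact"],
    "/about": ["/"],
    "/blog": ["/"],
    "/contact": ["/"],
}
ranks = internal_pagerank(site)
# "/" accumulates the most internal link equity of the four pages
```

In practice you would feed in the full inlink graph from a crawl; pages that attract many internal links, like the home page here, surface with the highest scores, which is the signal an internal-link audit looks for.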

So you added that just before lockdown, and I found out about it because of your release notes, which is something I want to talk to you about, Patrick. You're, I think it's fair to say, infamous now for your release notes, which I personally find highly entertaining - I've seen on Twitter that some people don't share that opinion. So if you don't know what I'm talking about here, it's worth subscribing to the Sitebulb newsletters to get the release notes, because Patrick's quite colourful with his language and descriptions of what's been happening. I just want to know why you choose to do this, because I understand nowadays brands are a bit more accessible online, a bit more friendly, touchy-feely, but you know, you're straight out there dropping f-bombs in release notes. Is this kind of an intentional, let's make something different, or were you just like, look, I don't care, I'm just going to be me, and this is what this is?

PH: I guess it's a bit of both. I can't be credited with the idea - it was Gareth's idea to make them more than just release notes. I'd been writing release notes for years with URL Profiler, and we'd write the description of what a change was and why it was important, and the biggest thing we noticed is that most people don't read anything. So when we started off in beta - again, this is mid 2017 - we were doing beta release notes, and I would write them and explain what we'd added and why and all this stuff, and I showed them to Gareth: here, what do you think about this? And he literally just said, they're a bit boring, aren't they? Can't you just jazz them up a bit? So I thought, alright, let's try. I don't do it with every single thing that we ship, but I try to make sure that each set has got various stories in there, or pop-culture references, or, like you say, swears, jokes - often we can be self-deprecating and make a joke of something that we've screwed up, or I can, in jest, attack our customers and sort of blame them for their foolishness, that sort of thing.

So I started off doing this, and learnt that it's a lot easier to do for bugs, by the way - it's a lot easier if a customer points out a bug and we have to fix it. It's harder to make a joke out of explaining a new feature. But really, the core idea behind the whole thing was: you can look at almost every single product across the world, and their release notes will follow the same sort of formula. A lot of the time they'll just say things like 'bug fixes and improvements', and that'll be it, and you're expected to upgrade to use this thing. For a start, we want people to actually understand what it is we've changed and added, so we want to give people a reason to read them; and secondly, we don't think that's a transparent way to communicate. We think it's important that people understand that we've fixed a particular bug that might be affecting all these people on websites like this. So we think it's actually important that people do read them, understand them, and understand what's happening, and this was a way to get people to do that. If we put jokes in there and make them a bit sweary, a bit more fun, then maybe people will read them - even if they're not necessarily reading to find out what the features are, they'll read them because they're fun, and as a side effect they'll find out what the features are, so we win.

In terms of the swearing, I very much believe that people should use all the language; they shouldn't restrict themselves to a small subset of the language which doesn't include swears. I think swears have their place.

MC: Hang on a minute… this just sounds a little bit like episode 3 of Star Wars now, and he's telling you he needs to look at all the aspects of the Force.

Star Wars clip: If one is to understand the great mystery, one must study all its aspects, not just the dogmatic, narrow view of the Jedi.

PH: Exactly. I actually think I stole that off Tim Minchin. I've since tried to find a link or a song - wherever or whatever it's from - where he actually says, I like to use all the language, because he's very sweary as well, and I have not been able to find it since. But that is basically the thesis behind it: we get to use all the language, including the swears, and they have their place.

When I was doing this in the beta, we would email out these release notes, and I got a reply from someone saying, you shouldn't be emailing me with an f-bomb in it. I was like, fair enough, and took a step back and thought, maybe that's unprofessional, maybe I shouldn't do it. So for the next set I said, unfortunately someone complained, so I'm not going to do it - and then I got a ton of complaints that I hadn't sworn. So then I was like, well, I have to put them back in, so I just reversed that decision and started swearing again, and we are where we are now. Now I try to make sure that they are entertaining even if they aren't always sweary - they don't always have swears in. But there are all sorts of Easter eggs and stuff in there which I put in, and I'm sure most people don't even notice, and then every so often someone will reply, or email or message me something about it, and they'll mention one of my Easter eggs, and I'll be like, 'yes! you actually spotted them' - I spend ages putting these things in, and half the time nobody notices them.

MC: Well, I think from that point of view you can look at it as a success, because if I reflect back, I think Sitebulb is probably one of the only bits of software I use where I do read the release notes - I can't actually think of another bit of software I do that for. So yeah, I guess that's a really good reason to keep doing it.

So I have a double-barreled question here for you, which would be really interesting to get your insight on, because obviously you're having lots of conversations with SEOs as you're building this tool, and I'm wondering what your insight is here to do with audits. A couple of people we've had on the podcast, such as Aleyda, have talked about audits with us, and we've always mused about how, several years ago, our feeling was that there might be less of a need for technical SEO, because we thought, hey, Google's going to get smarter and, like you said with the HTML, it will just kind of round off the rough corners and understand things. But the trend we've actually found is that we're having to do more technical SEO, what with various JavaScript frameworks now and client- and server-side rendering. So I'm interested: do you see any most common issues that people are talking to you about, in terms of auditing and technical SEO? And what are your thoughts on how technical SEO is going to change over the next few years? Do you think we have almost peaked and it's going to get less complicated, or are we just going to march onwards?

PH: In general, I think a lot of the core things are still just as important as they ever were - like making sure that the right pages are being crawled and indexed, making sure that pages are properly targeted for keywords. All of that stuff is still important, and it's still the most common thing people are doing. What I think is changing, and is going to change further, is that the breadth of what we need to know about is increasing. It's no longer okay - if it ever was - to just report on 404s, or 301s, and a few images missing alt text, which, you know, could easily have constituted an audit a couple of years ago. Now we as SEOs need to be aware of performance, mobile usability, hreflang, schema - all sorts of different things. A lot of that stuff, even five years ago, most SEOs weren't taking into account, and I think as the technology advances and we go more and more mobile, we will be required to know it.

I think the thing is, it's not going to be every client, and it's not going to be every website you look at, where you need to know this stuff. But if you're in an agency, for instance, then there are going to be more and more clients that come along who are using a JavaScript framework, and you do need to think about things like server-side rendering, and you do need to understand how rendering affects what is getting indexed. Onely are doing tons of brilliant research into this area - if anyone cares about technical SEO, they should be following everything that Onely are putting out there, because that is one of the directions technical SEO is going, and you are going to be required to know how this stuff works, even if you don't need to do it yourself. I've never been one of these believers that you need to be a coder. I have never felt that SEOs should learn how to build websites.

I think what you need to do as an SEO is understand, and be the bridge between the developer and the client - I think that's the role of the SEO, and I think it's going to become more important. As the technology gets more complicated, the client gets further removed from understanding what it means. It's relatively straightforward to explain 404s and 301s and those sorts of things to a client, but it's much more difficult to explain rendering or performance issues, and I think that's really where the SEO who sits in the middle is going to have to upskill and make sure they do understand this stuff. I think that's where it's going - technical SEO is going to get more important. I can't see the things we've seen in the last couple of years suddenly going away, and there are other things that are, again, going to become more important. Semantic understanding is clearly on the up, for example, so there's going to be greater importance placed on structured data, data consistency and those sorts of things, where the search engines are moving towards understanding what the data means, and we have to ask how we can give them better signals to support that understanding.

MC: I think that's a really interesting answer, especially when you talk about the debate, which I've seen a lot, about what level of development skill a technical SEO should have. When we've covered training, I'm similar to you: I think being able to understand what needs to be done and why it needs to be done is the useful amount for an SEO. So take canonical tags - I know what they are and I know why they need to be changed, and I can explain that to the client; then you have a developer who knows how. As long as you understand the first two things, you should be able to communicate to them exactly what needs doing, and the how is their job.
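To make the canonical tag example concrete: the "what" is a `<link rel="canonical">` element in the page's head, and checking for it can be sketched with nothing but the standard library. This is an illustrative check, not how any particular audit tool implements it:

```python
# Sketch: extract the canonical URL from a page's HTML using the stdlib
# html.parser module, so an audit can flag pages with a missing canonical.
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        # attrs arrives as a list of (name, value) pairs.
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

def find_canonical(html):
    finder = CanonicalFinder()
    finder.feed(html)
    return finder.canonical  # None if no canonical tag was found

page = '<head><link rel="canonical" href="https://example.com/page"></head>'
print(find_canonical(page))  # https://example.com/page
```

The SEO's job, as Mark says, is explaining what this tag is and why it matters; implementing or fixing it sits with the developer.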

I've found the same personally. I've got a bit of a background in development, but when it comes to things nowadays like performance, I've definitely had to pass that over to people who are way better than me at knowing all the back-end configurations that get better performance, because you just can't know the how of everything now, in my opinion, with technical SEO, even if you are a developer.

PH: Yeah, absolutely, and that's how our industry has developed, I think. When it was a fledgling industry, you could have one SEO that basically did the technical, did the on-page, did the keyword research, did the keyword targeting, did all the link building - they could handle all of that. The more it develops as a specialty and a niche, the more specialised we all get. So it's right that you shouldn't be implementing the canonical or doing the server improvements to speed things up - in my opinion, that's not what the SEO should be doing. They should be advising, and then helping the client to understand which recommendations will make a difference for them and where it is important that they focus their development budget. You can't just go, "here's a whole load of stuff to do, it's all equally important". It's about figuring out which ones are actually going to make a difference, or which ones should make a difference, and how easy it is for you or your development team to get them done; then we make the decision about what should be done first and help make sure it's implemented properly.

MC: Perfect. So Patrick, we're already at almost forty minutes, so let's round off with: what's next for Sitebulb? We've talked about what it is and where you're positioning it now - what's happening next with it?

PH: So we've got some things in the pipeline which I think a lot of folks have been waiting a long time for, and we're hoping to release them at the start of August, all things being well. We've got schema validation and Google search feature validation, and we're doing the validation a bit differently to how everybody else does it: we're validating against the docs themselves. Essentially, with search feature validation, you have things like the Rich Results Test, which will give completely different results to the Structured Data Testing Tool, which will give completely different results to what you see reported in Google Search Console, and then again what you see in the SERPs - all these different sources of truth. And then we've seen Google come out and say, well, we want you to validate against the docs themselves.

So that's exactly what we're doing, and this thing is basically built - it's about to go into testing. It has been a massive job because, again, this landscape is changing all the time, so we keep having to change it and make sure we're up to date with everything. Getting all of that validation in there was a big job, but so was presenting it in a way that makes sense for people, so they can actually make sense of the data. We've done a lot of work on that in our UI, which will hopefully be tweaked during the beta process, but hopefully it's not too far from finished. So that's coming.
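The idea of "validating against the docs" can be illustrated in miniature: take a JSON-LD block and check it against the required properties the documentation lists for its type. The required-property table below is a hand-picked example for illustration, not an exhaustive or official list, and this is not Sitebulb's implementation:

```python
# Simplified sketch of doc-based structured data validation: check a
# JSON-LD snippet for the properties its schema type is documented to require.
import json

# Hypothetical excerpt of a required-properties table built from the docs.
REQUIRED = {
    "Product": {"name"},
    "Recipe": {"name", "image"},
}

def validate_jsonld(raw):
    """Return the sorted list of documented required properties that are missing."""
    data = json.loads(raw)
    required = REQUIRED.get(data.get("@type"), set())
    return sorted(required - data.keys())

snippet = '{"@context": "https://schema.org", "@type": "Recipe", "name": "Flapjack"}'
print(validate_jsonld(snippet))  # ['image']
```

A real validator has to track required versus recommended properties per search feature and keep pace as the docs change, which is the "massive job" Patrick describes.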

At the same time, we've got content extraction, and again we've looked to do this in a way that's easier to use and makes more sense than the other implementations out there. When we release features that are common across other tools, we try our best not to do almost exactly the same as what they've got. We try to see what's already out there, ask how we can make this thing better and how we can understand the use case better, and do it in a way that's more usable and allows people to do the things they're ultimately trying to do in an easier way. So there's some really exciting and interesting stuff coming on the content extraction.

Then equally we've done the same with content search. This is something which lots of other tools have had for a long time, where you essentially put a search term in and, as you're crawling the site, you pick out pages that have that term on them. We've built that in, and again we've tried to build it in a way that allows you to do a few more interesting things than simply looking for a word on a page. So there are those three things, plus a couple of other bits and pieces which I'm not going to mention yet, because I don't know for sure if they're going to get released in this version - but again, some more interesting things that people have been asking us for. Those three things are definitely on the horizon, and they are all basically built and about to go into beta.
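The basic form of content search that Patrick describes - flag the URLs whose text contains a given term as you crawl - can be sketched like this, with the crawling itself stubbed out as a simple mapping of URL to page text:

```python
# Minimal sketch of content search: given already-fetched page text,
# return the URLs whose content contains the search term.
import re

def content_search(pages, term):
    """pages: mapping of URL -> page text. Case-insensitive substring match."""
    pattern = re.compile(re.escape(term), re.IGNORECASE)
    return [url for url, text in pages.items() if pattern.search(text)]

pages = {
    "/pricing": "Plans start at £10 per month.",
    "/about": "We build crawling software.",
    "/blog/audits": "Technical SEO is getting broader every year.",
}
print(content_search(pages, "seo"))  # ['/blog/audits']
```

The "more interesting things" a real tool adds on top - pattern matching, extraction, reporting - go well beyond this single-term lookup.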

MC: Wow, I can't wait to read the release notes about them.

PH: Those haven't been written yet.

MC: So if you do want to check out Sitebulb, go to your favourite search engine and type in Sitebulb - I'm sure you'll find their website. And you've got a free trial at the moment?

PH: We've got a free trial all the time - a 14-day free trial. You can just go on the website and download it, and it's completely fully featured; you don't put your credit card in or any of that nonsense. You go on and you're getting essentially the paid Pro version for the 14 days, so you get a proper chance to try it without any trial limitations. You actually get to try the full thing before you decide if you want to pay for it or not. And then we have monthly plans and yearly plans, so loads of options for how you want to use it.

MC: Cool. Patrick, thank you so much for taking the time to join us, to give us your thoughts about technical SEO and to tell us a bit more about Sitebulb. Really appreciate it.

PH: Thanks very much Mark.

MC: So we are going to be back on Monday the 13th of July with episode 69. In the meantime, if you did enjoy this, please take a few seconds to subscribe and all that, and I look forward to hearing from you next week.