Candour

SEO Testing with Ryan Jones


Show notes

In this week's episode, Jack Chambers-Ward is joined by Ryan Jones, Marketing Manager at SEOTesting, to discuss all things SEO testing, including:

  • What are the different types of SEO tests?
  • What is A/B testing?
  • Should you test every change to your site?
  • How do you know if testing is right for you?
  • How long you should run a test for
  • How to track results when testing

Follow Ryan

Transcript

Jack: Welcome to episode 77 of season two of the Search With Candour podcast. I am your host, Jack Chambers-Ward. Coming up on this week's show, I am talking to Ryan Jones, the Marketing Manager over at SEOTesting.com, all about SEO testing and different types of SEO testing. It's a very interesting conversation. You will learn a lot about SEO testing if you don't already know about A/B testing, time-based testing, and all the different things. I dive into a lot of detail with Ryan, and Ryan is a fantastic speaker and a fantastic guest. So look forward to that conversation coming up in a couple of minutes.

TrendWatch June 2023

Before I get to my conversation with Ryan, I'm going to have a little sneak peek, a little glimpse at SISTRIX's new TrendWatch. You know I love to talk about TrendWatch on the show if you've tuned in over the last 18 months. It's one of my favorite things to talk about because there's some really weird, interesting data in there sometimes and you can pull some very interesting insights out of it as well. And of course, if you want to go and check out SISTRIX, you should go to sistrix.com/swc. You can go and check out their fantastic free tools such as their SERP Snippet Generator and the hreflang Validator. And if you want to check the Google updates and keep an eye on those, there is also the Google Update Radar there as well. And you can get TrendWatch by going to sistrix.com/trends. You can subscribe to the newsletter there and you will get 10 brand new trends delivered to your inbox every single month.

I'm going to pick one to talk about this week because it caught my eye, because I was literally talking with my wife Emma about this yesterday. This is: is Temu legit? I don't even know if I'm pronouncing that word correctly. It's T-E-M-U. I'm assuming like Emu, Temu kind of thing. The most downloaded free app on the App Store and Google Play Store in the last quarter of 2022 was Temu, not TikTok, not YouTube, not Instagram. It's essentially a 12-month-old shopping app, and it is a platform for a lot of stuff that is drop-shipped from Chinese factories and warehouses. I got big AliExpress vibes from it personally, and I think that's probably why "is Temu legit?" is such a big search in the US recently, because it seems too good to be true. There seem to be quite a few complaints around it as well, sort of like undelivered packages, incorrect orders, terrible customer service, all that kind of stuff. It does seem to be as legit as any one of these other international drop shipping services that you see around, working from Chinese warehouses and Chinese factories and things like that. It's gotten into a fair bit of hot water with the Better Business Bureau, with more than 30 complaints and a current customer rating of about 1.3 stars, which is not good, by the way. But the amount of stuff you can get on there, like I said, if you're familiar with AliExpress, it is an almost identical interface and a very similar product range. You can get all kinds of weird, cool, interesting stuff very, very cheap, a lot of home devices and appliances and things like that.

I've recently personally gotten into retro handhelds and emulators for playing old video games and stuff like that. AliExpress and Temu are some of the best places to find these kinds of devices and, for want of a better phrase, actually get them from the legit sources as well, even though it may not seem legit. But essentially what they're doing is subsidizing sales to be a loss leader in order to gain that market share and notoriety and stuff like that. They've even got loyalty programs where you can get people to sign up and earn coins and stuff, again, like you can do on AliExpress. You can then use those coins to purchase things. So you don't even necessarily need to give them your card information; if you're able to share and be part of that loyalty scheme, you can earn enough coins to actually buy stuff without even paying for it.

So it's an interesting proposition. This is a really interesting case study for e-commerce stuff. I think it's maybe not something to replicate with your e-commerce clients necessarily, but something to think about and something to have a look at in terms of how the interface is working and how their approach to marketing and growing their brand has happened over the last 6 to 12 months or so. So like I said, that was a little glimpse, a tiny little teaser for TrendWatch for this month. Is Temu legit? It seems kind of yes would be my answer. I think that's the consensus from the data journalism team over at SISTRIX as well. Like I said, go to sistrix.com/trends and sign up for the TrendWatch newsletter to get 10 fantastic brand new trends delivered to your inbox every single month.

SEO testing with Ryan Jones

My guest this week is one of the fastest rising stars in SEO, a BrightonSEO speaker who has over six years of digital marketing experience despite being about 10 years younger than me somehow. He is the Marketing Manager at SEOTesting, the one and only, Ryan Jones. Welcome to Search With Candour.

Ryan: Thank you, Jack. Thank you for having me on. It's been a long time coming.

Jack: It has, yeah. We've been chatting on Twitter for a while now. I know we've had some battles with some of the darker sides of the SEO world.

Ryan: Yeah, it's the dark side of history there.

Jack: Isn't it just? Isn't it just? Yeah, and we actually met in Brighton in April as well. It was lovely to finally meet you in person and realise just how tall and young you are compared to the rest of the SEO community.

Ryan: Okay, yeah. It's always an interesting one when age is brought up. Purely as an example, I was on a video call with Nick and Tiago, who's also in our marketing team, and the topic came up because I just happened to mention that I was 23 and I turn 24 next month, and Nick said, "I thought you were in your early 30s." There's a second where I'm like, "I know that was meant as a compliment, like a genuine compliment."

Jack: Yeah, that's like a maturity thing, right?

Ryan: Yeah, and that is part of it, I thought. It's like, "I don't look 34, surely."

Jack: Well, I'm going to be 33 in a few months, so I know the feeling of being around 30. It's not all bad, but your knees and your back do start to go, mate. So especially as a tall person, I'm 6'2" and you're even taller than me, so get prepared for some dodgy knees in 10 years' time.

Ryan: I have those already. Well, 16 years of playing football will do that.

Jack: Oh. Yeah, you're off to a good start.

Ryan: Absolutely.

Jack: So some people may know you from your BrightonSEO talk. Some people may know you from your work at Land of Rugs, where you worked in-house for quite a few years, and I think one of the highlights of a lot of in-house stuff is being able to move around and experience different companies and stuff like that. And you've recently moved to SEOTesting, which is a company I was already familiar with, and from what I understand you were already familiar with before too; you were already a customer of theirs before you joined the team. Do you want to talk about your little journey towards SEOTesting?

Ryan: Yeah, well, like you perfectly mentioned there, when I first joined Land of Rugs, I was given a bit of budget to just use freely as I wanted, and I think SEOTesting was pretty much the first tool I suggested. It was still running at $19 a month for just one site, and I was like, just yes, for the price. This is absolutely what we need, because I was the first pure SEO hire at Land of Rugs. They had a marketing executive before, but I was the first hire who was brought on who really specialised in SEO. So it was one of those: I'm going to sit down and make a whole load of changes to the site, and I don't want to sit there tracking the results manually or just rely on a rank tracker. And I'd seen this tool, SEOTesting, that everyone was talking about on Twitter, and I was like, yeah, we need to sign up. And then, I think the turning point was finding that they had a 60-day trial if you're a member of Traffic Think Tank, which I was. So I absolutely signed up for that trial and started using it. 60 days came around and I was like, yeah, absolutely, we'll put our card details in right now.

Jack: Nice, nice. So how did you come around to getting to know Nick and actually becoming part of the team further down the line?

Ryan: Yeah, well, like me and you, myself and Nick started off exchanging little bits on Twitter. I'd reply to sort of SEO testing threads or just general SEO stuff. And then, I don't do so much anymore, but I used to do a bit of freelance work on the side, and it just so happened that I'm quite content-based: writing content and optimizing it, all that kind of stuff. So Nick sent me a message on Twitter, I think: "Just got some content work coming up, would you like to have a crack at it?" So I did a few pieces, either new content or just editing and optimizing content, across maybe a period of six months, something like that. And then, it kind of happened that a message from Nick landed in my inbox pretty much just asking for advice on the salary range for a job. So I was like, "Yeah, send the job spec over. I've dealt with enough hiring people into teams and I know enough recruiters and all that, I could probably give a good honest opinion." And so, I had a look at this job spec and sent him a salary range back, and Nick was like, "Cool, do you want it?"

Jack: You should have just gone like, "Yeah, it's 100 grand, mate. Yeah, that's a reasonable salary. Yeah, yeah."

Ryan: Just really there, but-

Jack: "Add an extra zero."

Ryan: Yeah. But, no, and I think at that stage, it was pretty much a no-brainer for me, because I'd got Land of Rugs to a stage where, not that it can run itself, but we had an apprentice who joined the team and he was pretty much at that stage where he was fully trained up. And the best thing about it, because I'm still in touch with the Land of Rugs team, is he genuinely has taken on my role. So that speaks to hopefully how well I've trained him, not to sound too egotistical, but hopefully it means I've done a good job with training him up. So he is just taking it and running with it, and I am still keeping an eye on the rankings and what the traffic looks like. So yeah, they're doing well, and I still speak to the team. They're all still super busy, which is really good.

Jack: Awesome. Awesome. So let's dive into this week's topic. We've touched on the fact that you work for a company called SEOTesting, and that is going to be the topic for this week. We're going to break down various different types of SEO tests, why you should be doing them, and basically how to harness the power of SEO tests for you, your website, your clients, whether you're in-house, whether you're agency side, there's opportunities basically across the board for some SEO testing. So let's kick off with what do we mean by SEO testing? We've said that phrase a few times already. People might not even know the actual concept of it. So let's start with that. Let's define SEO testing.

Ryan: Yeah, cool. Absolutely. So I mean you can kind of define it in two ways really. You've got the maybe classic example, which, if you know a bit about SEO testing, you might already know: you have a hypothesis of, oh, maybe if I change this meta description and include this list of keywords, or, to use a bit of old slang, oh, maybe we could improve the keyword density in this, that kind of thing, or there's definitely ways to get some new keywords into this blog post. Then, we can actually make a change to this page, we can test our hypothesis, and then we can track the results. And then, the other side of it is tracking the results of changes we were going to make to the site anyway. You'll have experience of this at an agency: when you take a new client on, you go through your audit and you have your list of recommended changes. The client signs off. You're like, yeah, cool, sit down, we'll go to the tech team or the content team, whoever needs to do certain aspects on the site, and we'll make these changes. You were going to make those changes anyway because you know from experience that it's probably the best thing to do, but it just means that you have an opportunity to then track the impact of those changes straight off the bat. And then, when we come to think about it, we've always been doing SEO testing in a roundabout way. We've always been optimising sites and tracking the changes. It just so happens that in the past you'd just see the page on a rank tracker or something: you see the rankings go up or the organic traffic goes up to a page. But through all the tools that we have access to, now we've got more data to look at and tools to automate it as well, which is really good.

Jack: I think you're totally right that so many people start off, I think multiple guests on the show have talked about how they started off in SEO, is just experimenting and playing around with stuff. And you are totally right that I think so many of us do SEO testing without even thinking about it. That is automatically part of your flow and your natural processes when you are updating a page title or changing a thing or adding internal links, as you said, changing meta descriptions or whatever it is.

Tiny little changes like that seem like, oh yeah, that's a normal part of the process. But yeah, you are testing that stuff, and you're totally right there: when you first onboard a new client, you go through the tech audit, you go through the content audit, all that kind of stuff, and then you should be able to track those results. That's key to proving the success and proving the work you're doing is going to work. I think that tracking the results is the big key there, because if you're experimenting but not actually checking anything, then you're just throwing it into the wind and seeing what happens.

Ryan: And it helps with client retention as well, especially from an agency side, when you can go into your monthly reporting session with a client and you say, "Right, here's a checklist of things that we've done." And then, you can show them another slide, and here's the impact that it's had on your site. You can see the organic traffic going up, you can see that more people are on your page, and hopefully then that means more people are converting, whatever the conversion is, whether it's buying a product or filling in a form and becoming a new lead. It's definitely going to help you in that sense as well. And even if you're in-house, actually, that's the bulk of my experience, it helps you with those pain points that you've had. There's a bit of a joke running on Twitter that Nick is trying to advocate for: an SEO is for life, not just for Christmas. Because there is an ongoing thing of it being hard for SEO staff to show their value within an organization as an individual. Obviously, marketing teams are useful anyway, but especially if you're in a team of one, it's really hard for you to prove your worth as a salaried employee. So it helps to just link it back to that as well. You can say, "Right, here's what I've done and here's the good results that it has shown."

Jack: Cool. So there are a few different types of SEO tests we can do as well. We've mentioned a couple: tweaking a page title here, tweaking a meta description there. There are more involved and more complicated versions. So from what I understand, there are three main types. And again, I am with you, listeners, by the way. I am very new to SEO testing. I have basically done the, yeah, I've done it without even thinking, kind of thing, but actually moving into this conscious process of thinking about it is fairly new for me as well. So hopefully it'll be a learning experience for me and a learning experience for you as well, listeners. So yeah, those three main types, do you want to break those down for us, Ryan?

Ryan: Yeah. Well, I think probably the easiest one to break down just as a starting point is serial testing, purely because it's probably quite rare that it's done regularly now. So serial testing is essentially changing pretty much the whole site, or something massive on the site as a whole. If you are a news website, then you could be changing the way the pages are laid out or changing the structure of your site, maybe bringing in a new header and getting rid of certain subcategories, or you're changing the theme of your website maybe, or maybe you're doing a migration, maybe you're going from Magento to Shopify or something like that and making a big change on your site. And then, you can test the site as a whole. That's best looked at with a time-based approach: here's the point on the graph where we've made the change, and then hopefully you see your results start to improve from there. But that's only really done when you're making a full site-based change like a migration or something like that, or if you do-

Jack: That was something we were talking about before we started recording: serial testing seems to be the outdated/old-school way of doing it. Obviously, it's relevant for site migrations and things like that. Tracking through migrations is incredibly important, keeping track of all the redirects and the changes in the URLs and anything that can slip through the cracks, all that kind of thing. But from my research looking into this, preparing for the episode and all that, serial testing seems to be less common these days, and the two other types we can talk about are the more common ones. Is that fair to say?

Ryan: Yeah, absolutely fair. And then, thinking back to what I just said of serial testing being like here's some big changes, hopefully you're not really making big changes that regularly, because if you are, then maybe something's going on there that you should probably look into.

Jack: Just change our top nav or our full page layout twice a week, see what happens.

Ryan: Yeah. Well, we tried Shopify for a quarter and it didn't really pan out.

Jack: Yeah, exactly. A new CMS every quarter.

Ryan: Yeah, if you're doing serial testing, you're changing massive things on your site regularly, then maybe you should have a chat with your team. But there are two or three other types of testing that can be used more regularly as well.

Jack: Cool. So let's talk about time-based. You just touched on it there as a little preview of the next type. And from what I understand, that's basically changing a single page or a single element on a site, tracking that and seeing what happens, right?

Ryan: Yeah, pretty much, exactly that. You can group it as well. We have a tool within SEOTesting that allows you to do that... It is called a group test, but essentially that is just a time-based change on a group of pages rather than a single page. But essentially, what a time-based test is, is you make a change to either a single page or a group of pages and then you track the results of that page or group of pages over time. And a good example is a content refresh. So you've had a blog post on site that's been sat there for, let's say, three years, and it's kind of dropped down, maybe page four, page five, but the search traffic's still there. So you're like, right, we can sit down and we can add new sections in. We can better match how the search intent has evolved. So we'll sit down, we'll spend a few hours really looking at this, and we can republish it under a new date. So that would be a good example of a time-based change: you make the change to that piece of content and then you track how that content changes over time in terms of rankings and organic traffic, that kind of thing.
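At its simplest, a time-based test like that content refresh is just a before/after comparison around the change date. Here's a rough sketch in Python (not SEOTesting's actual method; the dates and click numbers are invented stand-ins for a Search Console daily-clicks export):

```python
from datetime import date

def summarize_time_based_test(daily_clicks, change_date):
    """Compare average daily clicks before and after a change date.

    daily_clicks: dict mapping datetime.date -> organic clicks for one page
    change_date: the day the change went live (excluded from both windows)
    Returns (avg_before, avg_after, pct_change).
    """
    before = [c for d, c in daily_clicks.items() if d < change_date]
    after = [c for d, c in daily_clicks.items() if d > change_date]
    avg_before = sum(before) / len(before)
    avg_after = sum(after) / len(after)
    pct_change = (avg_after - avg_before) / avg_before * 100
    return avg_before, avg_after, pct_change

# Hypothetical clicks for a refreshed blog post: four days either side
clicks = {
    date(2023, 6, 1): 10, date(2023, 6, 2): 12,
    date(2023, 6, 3): 11, date(2023, 6, 4): 9,
    date(2023, 6, 5): 0,   # refresh published this day, excluded
    date(2023, 6, 6): 15, date(2023, 6, 7): 18,
    date(2023, 6, 8): 16, date(2023, 6, 9): 19,
}
avg_before, avg_after, pct = summarize_time_based_test(clicks, date(2023, 6, 5))
```

In practice you'd pull real daily data from Search Console and use much longer windows, since a few days either side is easily skewed by weekends, seasonality, or an algorithm update landing mid-test.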

Jack: Awesome. Yeah, I think that's something I do a lot. As we mentioned earlier, during that content audit process when you're going through the refreshers and thinking like, oh yeah, there's some missed opportunities here for these keywords you didn't even think about or these internal links to things we didn't consider and things like that. I find that's the one that comes the most obviously, the most naturally during the process, if that makes sense.

Ryan: Yeah, yeah, absolutely. A good example is I did it a lot at Land of Rugs with content gap analysis. One of the big things when I started was there was a massive blog on the site already, but not many of the posts were really well-matched to the search intent, hence why the blog was actually a big point of optimisation for us. We could really sit down and do a lot of good work with the blog. So that was almost a weekly process for me: taking a blog post, optimizing it based on what we had missed in the search intent, or different keywords that could have been added, or different sections that we could talk about, and then tracking to see how those pages improved over time.

Jack: Awesome. And last of all, A/B testing, or split testing as some people call it. I think this is the one I have seen talked about the most recently. This seems to be the hottest topic in SEO testing at the moment. So from what I understand, this is making a change while keeping a control: you have one thing or a group of things you don't change and a group of things you do change, and you're tracking essentially A versus B and seeing whether the effect is positive or negative. Am I right in that assumption?

Ryan: Yeah, you got it right. And I think a lot of the popularity has come from people like Will Critchlow, who I really admire on stuff like that. He's really, really big on split testing because he has a tool that essentially does split testing for you. And then, Kevin Indig has started to drill down more into SEO testing, and Logan Bryant as well is another one. He's a big advocate for testing as a whole, but he's mentioned split testing as well, and then obviously us over at SEOTesting too. We like to make sure we're making content on that kind of thing as well.

But essentially A/B testing is testing a control group and a test group. And I think the thing to mention is it's really only suitable for certain kinds of sites, e-commerce perhaps being the biggest example when you think of the amount of category pages that they might have. That's probably the biggest example of where you'd do an A/B test. One, you need to make sure you have enough pages that are essentially the same to form your control group and your test group. And you need to make sure that both your control and test group pages have enough clicks going to them as well. Hence why split testing might not be the best for small websites: if there's a whole load of pages that aren't getting a lot of clicks, you're not going to be able to track as much data. But if you've got a big site, like a big e-commerce site for example, you can say, we can really drill down and make a change to these category pages to, I don't know, improve rankings, or you can look at it from other angles as well. But you'd take a group of control pages which aren't going to be changed, and then you'll take a group of test pages, and you'll make the changes just to that test group and then track how the test group evolves compared to your control group.

So if you were to look at it on a graph, hopefully you'd see your control group line stay pretty stagnant, and your test group line should hopefully shoot up and show the improvement, and then you can roll it out to your full site. So the best use case that I can think of for A/B testing is to sense check bigger changes that you want to do. So hopefully you make the change to a smaller group, hopefully see that the test is successful and you start to rise in organic traffic and rankings or whatever, and then say, "Right, that's what we can do. We can make the change across the whole site now."

Jack: It's weird, I was talking to my wife yesterday about talking with you on the podcast today and the topic we're going to be talking about. My wife doesn't know about SEO apart from what I unfortunately forced upon her because she can't help but absorb SEO living with me, unfortunately for her. And I was talking about that kind of process of like you said, doing that smaller test on a group of pages before. If it succeeds and is a positive effect, then rolling it out into the rest of the site. And funnily enough, we've just bought a house and I've been cleaning carpets a lot recently, and I said, "So you know it says do the little spot check in a corner and test if it bleaches the carpet or ruins anything? It's like that, but for websites basically."

Ryan: Yeah, it pretty much is exactly that. You're just making sure. Because obviously if the test goes wrong, it's really handy that the negative results only impact a subsection of your pages.

Jack: Yeah, I think that's such a key part of it. If you go straight into the, let's just test this thing across the board, and it has a big negative impact, you are shooting yourself in the foot. And, as is always the case, as we all know here, there's no guarantee you're going to reclaim those clicks or those rankings that you lose if you leave it running for a period of time. And there's no guarantee Google will come back round, crawl and reindex stuff and just be like, "Oh yeah, you're back to how you were before. Here you go, have all your rankings back."

Ryan: Yeah, because we know Google is very forgiving, obviously.

Jack: Yeah, totally. Yeah, of course.

Ryan: And it also links back to what we're doing if we're just testing hypotheses essentially. We can have all the data and the anecdotal evidence that we want, but at the end of the day, we are still guessing as to what's going to have an impact. So there is every chance that your hypothesis could be wrong.

Jack: And I think collecting data is the key here. This then allows you to make those data driven decisions, and especially if you're working in-house or even if you're working in an agency and you have to pitch a wider idea to a client of, I think if we make this change, it will have a positive impact. Let's do a little test so you then have a proof of concept. You have that data to be like, "See, we are now making a data driven, we're not just going on, Jack thinks this might work, fingers crossed, let's throw potentially hundreds of thousands of dollars at this thing."

Ryan: Yeah, that would always make for a very interesting client reporting session, I imagine if you've spent $100,000 or whatever and then the massive site-wide test has failed.

Jack: We made a guess and got it wrong, sorry. See you next month.

Ryan: Hopefully, you're as forgiving as Google...

Jack: So thinking about how we can implement it on our sites, across our client sites and things like that, should we be testing basically everything we do, or should testing be reserved for those bigger plans and bigger changes, or does that depend, is it case by case in that way as well?

Ryan: Yeah, I mean, ideally, yes, you want to be tracking the changes that you make on your site as a whole, and I think that's easier with tools like SEOTesting, SplitSignal or whatever you're using. The goal of these tools is essentially to make it easier to make changes and then track the results as well. And then obviously, it builds up a bit of a log of what you've been working on on your site, which will help with client reporting. And it also helps with future tests that you might want to do, because you can then filter back and say, "Well, I did this content optimisation, or I changed a meta description in this way, and it led to these results." So linking back to what you said about being data driven, it just means you've got more and more data to fall back on.

Jack: You can never have too much data in SEO.

Ryan: Exactly, exactly that. I mean, there is a case for saying that maybe you don't need to do a test again if you've done it in almost the exact same way in the past and it's led to positive results, but just because it worked in the past doesn't mean it's going to work now, because, as we know, the algorithm's constantly changing and what worked six months ago might not have any impact whatsoever now.

Jack: Yeah. That's totally a thing. Most of my career has been agency side, and going from one client to another you're like, well, that worked on the last Magento site, so it should work on this one, right? And nope, it didn't. It's a totally different niche, a totally different situation. You mentioned search intent earlier and how important that is. A seemingly fundamentally, technically similar website can have completely different search intent and a completely different user experience. And one test does not necessarily carry over. It's not, I've done everything for WordPress sites, so here's all the WordPress tests; I've done all the Shopify tests, here's the Shopify sites. That's not how that's going to work. It's not as simple as that, unfortunately.

Ryan: Exactly. And then, in the same vein, the algorithm weights different things depending on your niche. If you're a finance blog, it's going to really, really give proper weight to Your Money or Your Life, whereas if you're selling, I don't know, a 20-pound pair of socks, yes, it's going to have some weight because you're still spending money, but it's not the same as advising you on the S&E 100 or whatever it is, like all these index funds. Maybe it's bad that I don't know the proper term.

Jack: Should we change it to your socks or your life now? That's the new term.

Ryan: Yeah, there we go.

Jack: YSYL.

Ryan: We've started. Yeah, that's going to be the next big hit on SEO Twitter.

Jack: Definitely. Definitely. We'll make that hashtag going forward after this episode goes up.

Ryan: Yeah.

Jack: So I talked about having different clients, different sites, things like that. When's the right time to think about testing? How do you know if testing is worth it? You mentioned a big e-commerce site being the perfect example for A/B stuff. Should you always be testing at every given opportunity, essentially, like what we were saying before, ideally yes? But when you are pitching that to a client, it's going to take extra time, extra money. Are there certain situations where maybe you should be thinking about it further down the line, or maybe thinking about it for a different client?

Ryan: It kind of links back to what was said a bit earlier, in the sense that we do it all already: at the end of the day, we still make changes to a site and then track results. Whether you're doing proper "SEO testing" or not just maybe depends on how you're tracking it or what changes you're really making. But it's good for all kinds of sites to have SEO testing strategies in place, whether you're a small site that only gets a few hundred organic clicks per month or one of these massive multinational e-commerce sites. Everyone can benefit, because you can track results in different ways as well. You don't just have to look at your tool and see that organic clicks are going up to say that it's a successful test. If you look at GA4 and you can see that people are spending more time on your site or something like that, or they're clicking through to more pages, then you can say that's a successful test as well. If you've changed the search intent of a blog post, you can say that it's been a successful test if people no longer bounce back to Google, if they then click through onto a related blog post or an offer or something like that. So even if you don't have the tools to do it, or you don't want the tools to do it, you can still make use of an SEO testing strategy; it maybe just takes a little bit more time. But linking back to what you said about it taking more time to get the client to come around on it, so to speak, like you said, we're being data-driven, and it does help that you can link back to past examples. Obviously, you don't have to say the client's name or whatever, but you can say that here is a test that we did on an e-commerce site that sells something similar to what you do.

Jack: Yeah, I have those kinds of conversations with clients all the time, where it's like, "Oh, we're thinking about doing this thing." Literally, yesterday I had a call with a client talking about moving over and basically updating their WooCommerce side of things. They're not e-commerce, but they're adding WooCommerce into their existing WordPress site. And I was like, "Oh, I have two other clients that also use WooCommerce. We have done this, this, and this. This worked, and maybe we could try this, but don't do that," kind of thing. Like you said, you naturally have that conversation if you've already got the experience with one side of it. You can then, again, not necessarily do the same thing on every single site. That is not how that works, and I'm not endorsing that, but you have that baseline guide of, oh, these are the kinds of things we can change, these are the kinds of things we want to test. And coming on to my next question, you mentioned a couple of things there already: organic clicks, click-through rates, stuff like that. What are the kinds of metrics we want to track when we are making these changes? And I guess, does that change case by case as well, depending on the client, depending on their intent?

Ryan: Yeah, it is. I'm quite pleased that we've gotten basically half an hour into the podcast and I haven't really used "it depends" yet, so yeah, that's... But yeah, in this case, it really does depend. It depends on the goal of the page, for one. Is the goal of the page to get a user to buy something, or is the goal of the page just to build brand awareness and help move a customer, or a potential customer, from the very top of the funnel just a little bit further down? So if that's your goal and you just want to move someone further down the funnel, then maybe you're concentrating less on your organic traffic or your organic rankings and focusing more on page metrics like time on page and things like that. And that's where tracking those kinds of metrics can be helpful.

Or if you've put a lot of money into a big, really top-of-the-funnel brand awareness piece and you're like, "Right, we just want to get as many eyes on this as possible," that's then the use case for tracking your organic traffic and your rankings and all that kind of thing, because you want as many people seeing that piece of content as possible. And then for e-commerce sites, if you're testing your product page or your category page, the end goal is always to add revenue. So the end goal is always to get users to buy a product. You might focus less on organic... Obviously, organic traffic helps, because if more people are seeing the page, then it might mean more people buy, but the end goal is to get whatever number of users are on the site to buy. So you might focus on conversion rate or something like that. It totally depends on the business, it totally depends on the kind of page that you're testing, and it totally depends on your business goals as well.

Jack: And how should you go about tracking this? I'm guessing, because you work for a company called SEOTesting, you guys do that; that's built into the tool. You are able to track clicks and track things from there and connect it all together. Obviously, we all have access to Google Search Console and plenty of different rank trackers, from SISTRIX to Semrush to Ahrefs and all that kind of stuff, but that's kind of what you guys do at SEOTesting. That is the power of having a tool that brings it all together.

Ryan: Yeah, yeah, absolutely. I mean, in SEOTesting's case, we take our data directly from Search Console's API. So whenever you run a test, you can track how that impacts organic traffic, average rankings, and the number of impressions as well. And you can do that on a query level too, so you can see if impressions and clicks have risen for certain queries, which is really useful. But then you've got tools like SplitSignal and SearchPilot as well, which do their own separate things, and there are different use cases for those tools too. And then, linking back to the simple, maybe old-school way, what we used to do 10 years ago: we'd make changes to our website and just track the rankings. So there are different ways it can be done. And obviously, maybe you want to steer clear of just changing a page and then seeing that two keywords have gone up 10 positions. That's all well and good, but it doesn't give you the granularity that you need. Or you can maybe move on to manually looking at your data in Search Console or analytics or something like that, but that's more of a hassle, for one, because you have to export the data from Search Console and look at it in Google Sheets or Excel or whatever you have to do. And we all know the hassle that everyone's been having at the minute with GA4.
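As an aside for readers: the before-and-after comparison Ryan describes for the old-school approach can be sketched in a few lines of Python. The click counts below are hypothetical, not from the episode, and a real analysis would also control for seasonality and query mix:

```python
from statistics import mean

def compare_windows(daily_clicks_before, daily_clicks_after):
    """Compare average daily clicks before and after a site change.

    A rough sketch of time-based SEO testing: average each window
    and report the percentage change between them.
    """
    before_avg = mean(daily_clicks_before)
    after_avg = mean(daily_clicks_after)
    pct_change = (after_avg - before_avg) / before_avg * 100
    return before_avg, after_avg, round(pct_change, 1)

# Hypothetical daily click counts for two 7-day windows
before = [120, 110, 130, 125, 115, 105, 135]
after = [150, 140, 160, 155, 145, 135, 165]

print(compare_windows(before, after))  # prints (120, 150, 25.0)
```

In practice you would pull these daily figures from the Search Console API (or an export) rather than hard-coding them, and, as Ryan notes later, measure over weeks rather than days.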

Jack: Oh, God. Yeah.

Ryan: The less I can look at GA4 the better, in my opinion. I know Ryan Levander will probably have some sort of aneurysm if he hears me say that. So maybe I'll send it to him just so he can really critique me on my lack of GA4 knowledge, because I just simply refuse to look at it. And then, obviously from our point of view, and I'm sure Will Critchlow has the same point of view as well, we think that the best solution is to look at a tool like SEOTesting or SearchPilot or SplitSignal or whatever.

Jack: I appreciate you doing the BBC style, “We are not officially endorsed by SEOTesting.” You are allowed to promote your own stuff on here, mate. That is allowed. Don't worry, we are not-

Ryan: Yeah, of course.

Jack: I appreciate you bringing the neutrality.

Ryan: Yeah, obviously we want as many people as possible trialling SEOTesting and hopefully becoming customers, but at the end of the day, our tool might not be the best tool for a certain business to use, and maybe it helps if they know that SearchPilot and SplitSignal exist and all that kind of thing. But if you're interested in trialling SEOTesting, then hit me up on Twitter.

Jack: There we go. That's the little call-to-action we needed right there.

Ryan: That's my piece of advertising.

Jack: You got the approval from Nick there.

Ryan: Yeah, yeah, absolutely.

Jack: Yeah, it was interesting talking about the metrics side of things. Something Mark and I talked about on the show a few months ago, I think it was at the end of last year, Chris Green did an amazing thing called the SEO Metric Chain. I don't know if you saw this, Ryan, where he laid out-

Ryan: No, I didn't.

Jack: I'll send you a link. I'll put a link in the show notes as well, listeners. Go to search.withcandour.co.uk. It's basically a fantastic little diagram of how you describe this to clients, how to understand the journey from growing your visibility and getting rankings, that then leads to impressions, that then leads to clicks, that then leads to sessions, that then leads to transactions, that then leads to revenue. That kind of full user journey, for want of a better phrase, of how you grow a site. Like you said there, oh cool, you're up 10 positions for two keywords, but does that mean you've got clicks? Does that mean you've actually got conversions? They're not the same thing.

They don't necessarily correlate. Just because you're on page one for a keyword doesn't necessarily mean you're going to get those clicks. And even once they click, it doesn't necessarily mean they're going to purchase something, if that is the goal. And I thought Chris's way of laying that out and showing it in a diagram, of that being the natural flow of things if you're doing everything right, gives you a perfect way to test different parts of it. You can see the rankings are going up, but maybe the clicks aren't going up, or the clicks are going up, but the purchases aren't going up. Perfect opportunity there to get a little test in and see if maybe we can change the conversion rate, maybe we can look at click-through rate. There are so many opportunities throughout that journey to do experiments and test different things. If we've got loads of rankings but no clicks, or loads of clicks but no purchases, that's the perfect opportunity to dig around and do some experiments and some testing.
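For readers, the weak-link check Jack describes on Chris Green's metric chain can be sketched as a simple funnel calculation. All figures here are hypothetical, purely for illustration:

```python
def funnel_rates(impressions, clicks, sessions, transactions):
    """Compute the rate between each step of the chain
    impressions -> clicks -> sessions -> transactions,
    to spot where the drop-off happens."""
    return {
        "ctr": clicks / impressions,
        "click_to_session": sessions / clicks,
        "session_to_transaction": transactions / sessions,
    }

# Hypothetical month: plenty of impressions, but a weak CTR,
# which points at titles and meta descriptions as the thing to test.
rates = funnel_rates(impressions=100_000, clicks=1_000,
                     sessions=900, transactions=45)
for step, rate in rates.items():
    print(f"{step}: {rate:.1%}")
```

Whichever step shows the weakest rate is the natural candidate for the next test: a low CTR suggests SERP-snippet experiments, while a low session-to-transaction rate suggests on-page conversion tests.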

Ryan: Yeah, absolutely. I love that way of looking at it as well. So yeah, send me that; I'll be interested in having a look at it, because I think it links back to, well, you've probably seen the same kind of blog posts that were published in the last few years saying your organic ranking doesn't matter anymore.

Jack: Yeah.

Ryan: And that's all well and good saying that, but at the end of the day, it does still matter to a certain extent.

Jack: That's such a simplified way of looking at it, right?

Ryan: Yeah, exactly. I haven't read one of those blog posts in a while, but I think Matt Barby had a post like that on the HubSpot blog, which gained a bit of traction. But it does still matter to a certain extent because, at the end of the day, if you're not ranking at the top of the search for the keywords that you want to be ranking for, then like you said, it's not going to lead to impressions, it's not going to lead to clicks, and it's not going to lead to revenue, which is the end goal. I mean, we could probably spend a whole new podcast talking about SGE and Google Perspectives-

Jack: Oh, God. Yeah.

Ryan: AI. Let's just not go down that rabbit hole.

Jack: On last week's episode with Garrett Sussman, we talked about a lot of the SGE stuff, Perspectives, how that is changing the SERPs, how much Google is changing the SERPs, and you're totally right. I have been advocating for this for a long time, and it's something I touched on with Garrett a lot when I was on Rankable and when he was on Search With Candour: diversifying your content strategy. And I think there are opportunities for testing there as well, like turning your blog post into video content, or, because I'm a podcaster, doing an audio version of it and having the little audio description thing at the top there, with either you yourself as the writer or somebody in your team reading it out, and testing that to see if it affects things, adding things like accessibility and so on. There are so many opportunities. You can go in so many different... We could be talking for hours here, it feels like, to test every little different option and think about all the different ways to do it. But as the SERPs are changing so much, as Google is bringing in more generative experiences and Perspectives and all that kind of stuff, it feels like being able to do experiments and test stuff matters. I feel like people were doing this a couple of years ago with Google Discover, when Discover was the hot thing. People like Lily Ray I've seen get literally millions of clicks through Discover, and I'm just like, that is so impressive. But I find Discover to be such a fickle, constantly shifting thing: oh, we got a few thousand clicks this month, and then nothing for a couple of months after that. It's a constant battle when it comes to Discover. But there's every reason to be doing loads of testing as all these new features arrive and the SERPs change. All the more reason for doing more testing, right?

Ryan: Yeah, yeah, definitely. I do like what you just said, touching back on maybe adding different accessibility points or linking to an audio file and stuff like that. Last point on SGE, I promise, but I do think that it is going to change the way we report on different results and things like that. And I do think SEOs are going to have to be less egotistical in the future, because at the end of the day, it is going to impact how our organic rankings appear in search results and all that kind of thing. So we need to really redefine our role and how we measure success. And at the end of the day, success is revenue for the business. It's not organic ranking, it's not clicks. If SEOTesting got 20 million clicks to the website over the next two or so months, obviously that's amazing, but if it's not then leading to people signing up for trials and then becoming paid customers, then it's pointless. So if we as SEOs do a test and we say, "Oh, maybe there's a use case for video here or a podcast here," and it then in a roundabout way leads to more revenue, then that's success. It doesn't have to just be clicks and rankings on a chart.

Jack: Yeah, absolutely. I think that's where you get so many of those conveniently missing an axis kind of graphs on social media. It's like, "Hey, look, this has gone up by 100X." It's like, is that from 1 to 100? Because that's not impressive. If you're going from 100,000 to 10 million, okay, now I'm impressed. Now let's talk, let's have a conversation. But oh, interesting, the Y axis is missing on that graph, how convenient. We can't actually check the real data.

Ryan: The amount of mental restraint it takes for me to not ask about revenue in those kinds of posts. Even if they do put the Y axis on and they say, "Look, I took this website from 1,000 clicks to 100,000 clicks," I'm like, "Amazing. But what about the revenue for the business?"

Jack: Yeah, yeah, yeah. What's your conversion rate like? Has it basically stayed the same and you've just got a tiny, tiny conversion rate now?

Ryan: Yeah, you're running at nought point nought one percent.

Jack: Yeah, exactly, exactly. So, to pretty much wrap us up, let's talk lastly about how long we should be running these tests for. We've talked about how to do them and what we should be tracking, but we also talked about algorithm changes and the shifting of the SERPs and all that kind of stuff. I would hope basically everybody listening knows not to do stuff overnight and roll it back straight away, have a momentary panic or a momentary celebration, and jump the gun, essentially. Is there a ballpark range of how long we should be running these things for to actually get reasonable data and results from them?

Ryan: Yeah, the short answer is a minimum of two weeks, but ideally aim for six to eight weeks. The six-to-eight-week figure is going to give you enough data to smooth over any seasonal impact that you might be hitting at a certain stage of making a change. But you want to aim for a minimum of measuring for two weeks, though it does depend on what your goals are.

Jack: That's another "it depends". That's two, mate, that's two.

Ryan: Yeah, there you go. Because at the end of the day, if you do roll something out and then immediately start to see a massive decline, do you really want to sit there and watch that happen? You shouldn't be scared of little changes; if something goes down slightly for a short amount of time, don't be afraid of that. But if something massively drops after making a change, then you probably want to revert back. As a general rule, we like to run our tests for anywhere from six to eight weeks, just to make sure we gather enough data and can be really confident that it is our change that's had the positive or negative impact, and we know what we can do then from a next-test point of view. But I suppose one caveat to make is: during a Google update, don't change your site. Everybody should know that.

Jack: Keep an eye on volatility of the SERPs.

Ryan: Yeah. And that's just rule one, basically: if you're in the middle of a Google core update, for instance, and they haven't yet told you that it's complete, then don't change your site, because you could then see the results go up, but you can't say for sure that it's the meta description change that you made. That could just be Google's algorithm changing. So yeah, always leave your site alone during a core update.

Jack: Yeah, there's no control when there's a core update happening.

Ryan: Yeah. Obviously, once the core update's rolled out, then you can... Like you mentioned Lily Ray earlier, she always does amazing analysis on the impact of core updates, what kinds of sites have been hit, and what's seen success as well. So once the core update's rolled out, that is then your time to analyse what's happened and the changes that you can make and test to get yourself back up there if you've been hit negatively.

Jack: Awesome. Well, I feel like we've covered a lot, but we're also just scratching the surface on a lot of SEO testing stuff. So listeners, if you do want to go and check it out, go to SEOTesting.com. Go and check out what Nick, Ryan, and the team are doing over there. It's a really cool thing to try out; it's a really cool thing to experiment with. Like I said, we are not sponsored by them. I just wanted to have a genuine conversation, and I appreciate your candour. Get it? There we go. There we go. There's the corporate synergy right there. But I really appreciate you coming on, man. I appreciate you being open and honest and discussing it with me, and it's really nice to finally sit down and have a conversation with you as well. It's been a long time coming.

Ryan: Yeah, absolutely. Like you said on Twitter earlier, it's a really good way to end the week. I know I've still got a full afternoon of work too, but it is a good way to sit down on a Friday and just have a good chat about SEO testing and how it can help businesses of all sizes.

Jack: Absolutely. So if people do want to get in touch with you, like you said, if they want to try out SEOTesting and stuff like that, where can they find you across social media?

Ryan: Yep. You can find me mainly on Twitter, I'll be honest. It's RyanJonesSEO on Twitter. You can also find me on LinkedIn as well; I've got this horrible AI-generated profile picture at the minute, so you can probably identify me from that. Yeah, I'm 100% more active on Twitter. And then obviously, if any listeners do want to sign up to SEOTesting, there's a 14-day free trial, no credit card required. So just head to SEOTesting.com and sign up for a couple of weeks of testing and trying it all out.

Jack: Awesome. Awesome. Well links for everything, listeners, will be in the show notes at search.withcandour.co.uk. Like I said, go and check out SEOTesting.com. Go and follow Ryan, because Ryan is going places. I described him as one of the fastest rising stars in SEO for a reason at the start of the show. So please do go follow Ryan and keep track of all the amazing work he's doing over there.

Ryan: I appreciate that.

Jack: Thank you, Ryan, for joining me. It's been an absolute pleasure, mate. Thank you so much.

Ryan: Yeah, thank you for having me.

Jack: And that wraps us up for this week's episode of Search With Candour. Thank you, Ryan Jones, for joining me. It was an absolute pleasure. If you haven't already, please do go and say happy birthday to Ryan. His birthday was on Friday the 7th of July, and this comes out on Monday, so go and say happy birthday to Ryan. Ryan is awesome. Thank you for joining me, like I said, Ryan. I hope you enjoyed and learned a lot about SEO testing, listeners, and I'll be back next week with the host of Democratizing SEO's Talk with SEOs, Austine Esezobor. We'll be talking all about the Search Generative Experience and the future of SERPs. It's a big conversation, it's a wide topic, and Austine and I get into some really, really interesting discussions, and I think you'll really enjoy it as well. So please do stay tuned for that. And until then, have a lovely week.