Candour

Episode 102: Learning SEO via social media with Daniel Foley Carter

What's in this episode?

In this episode, you will hear Mark Williams-Cook joined by Daniel Foley Carter from Assertive to discuss learning SEO via social. Together they'll cover:

  • The dangers of learning SEO via social media posts

  • Common SEO myths and misconceptions that are repeated online

  • How business owners can educate themselves about SEO to make better hiring decisions

  • Can 'SEO tests' realistically be done by a single person?

Show notes

Episode 52 - SEO myths with Natalie Mott

https://withcandour.co.uk/blog/episode-52-seo-myths-and-misconceptions-with-natalie-mott

Ben Fisher tweet

https://twitter.com/TheSocialDude/status/1370152564633694214?utm_source=tldrmarketing.com&utm_medium=referral

SE Roundtable article

https://www.seroundtable.com/google-site-score-ranking-individual-pages-30829.html

Transcription MC: Welcome to episode 102 of the Search with Candour podcast, recorded on Friday, the 12th of March 2021. My name is Mark Williams-Cook, and today I'm going to be joined by Daniel Foley Carter from Assertive. Daniel is a really experienced SEO who's quite popular on LinkedIn and the perfect person to talk to us about the dangers of learning SEO from social media. So when you see those big mic drop posts on places like LinkedIn saying, "This technique works, this doesn't," what do you need to be careful about and where can you go as a business to get good SEO advice online?

Before we kick off, I want to let you know this podcast is very kindly sponsored by Sitebulb. I'm going to talk about Sitebulb as if you haven't heard of it, because I really hope you have. Sitebulb is an SEO auditing tool that runs on desktop on Mac or Windows. I've been using it for a few years now, and we use it in our agency at Candour for all kinds of technical SEO projects and it's absolutely brilliant.

I'm always surprised when people haven't heard of it because it's my first port of call, genuinely even before I speak to a client, to run them through Sitebulb. You've probably all used various SEO tools and crawlers before; in my opinion, what sets Sitebulb apart, apart from a lot of the pre-checks it does when you start up, is the depth this tool goes into in giving context to issues and explaining them. That means rather than just having a crawl with a load of data, Sitebulb really goes the extra mile in identifying the issues, giving you clear explanations of what they are and how you would generally approach fixing them, and even prioritising them, or at least giving you a first step on how, in SEO terms, you'd prioritise them. So obviously your job as an SEO is to tie in the rest of the business there and work out where those priorities might fit in the bigger picture. But Sitebulb, in my opinion, is one of the best places to start with this.

They've got a special deal for Search with Candour listeners. If you go to sitebulb.com/swc, that's sitebulb.com/swc, you can get a 60-day extended trial with them for free. No credit card or anything required. So do give it a go.

As I said, today we are joined by Daniel Foley Carter, who is the director at Assertive. He has 21-plus years of SEO experience. He's worked for some of the largest U.K. agencies delivering SEO for the likes of Sky, Virgin Holidays and Quantasis. He's run an agency himself for 13 years and for the last two years has also been an independent SEO consultant. I know him mainly through LinkedIn. I've seen some excellent SEO posts by him, and some equally entertaining SEO rants. It's clear to me he knows what he's doing. I've seen him challenge bad advice, and I'm really pleased and excited to have him on this episode with us. So Daniel, welcome.

DC: Hi, thank you for inviting me onto the podcast. Really appreciate it.

MC: Yeah, no problem. Again, this is one of those episodes, as I did with Andrew Cox Stanley a few months ago, which is very much off the bat. One of my marketing Slack groups has got Daniel in it, and I was looking through the SEO news for this week and some stuff has happened, but I'd be waffling if I tried to make a whole episode out of it, and I'm always very against that. There was a suggestion that maybe I should talk to Daniel on the podcast because, as many of you who've seen how I use LinkedIn from an SEO social point of view will know, I've always been an advocate of challenging what I perceive to be potentially damaging advice, and I've seen Daniel certainly do the same as well. I think we agree on most things, right?

DC: Yeah. Yeah. [crosstalk]

MC: It's fair to say we certainly had an interesting conversation before this episode. We were meant to have a couple of minutes just to chat about what topics we might like to talk about, and it instantly spiralled into a 15-minute debate about bounce rates. So I'm going to try and shield you, maybe, from that during this episode, and I've got a few topics I'd like to talk to Daniel about. Before I kick off, I think just to frame all of this through my personal experience with the SEO community, with people that are interested in SEO: personally, I use Twitter a lot, as many of you know, and LinkedIn. Dan, do you use Twitter? I don't think I've really seen you on there.

DC: So I used to use it a couple of years ago, but I think I just tried to focus on owning one platform. I felt with Twitter that the 150-character limit at the time just meant that I couldn't really push out the creative thoughts that I had. Even though I know that Twitter has now obviously extended that, I never really got back into it. I always maintain that I think LinkedIn was a good place to be, primarily because a lot of the people that own businesses or are in charge of marketing or are decision-makers are generally on LinkedIn and more likely to catch a more detailed view of the opinions that I have to share. So I just primarily try to own LinkedIn as a channel. I do have Facebook as well, though I'm not so hot on that. But with Twitter, I have actually recently set up a profile and I was considering, again, just stringing out useful tips and tricks whenever I could, just so I have better market coverage.

MC: I think that's really interesting. It echoes the experience I've had, which for me is that I obviously started doing the whole unsolicited SEO tips thing on LinkedIn, but I don't do them on Twitter. My experience has been that a lot of the professional community, for whatever reason, has chosen Twitter as its home, and apart from the fact it can obviously get a little bit, I'm just going to say, toxic every now and again, it seems fairly good at self-policing, in that people will challenge each other and talk about ideas and generally you get a fairly sensible outcome. When I was a little bit more active on LinkedIn, I was sometimes a little bit aghast at the information I saw posted there about SEO that was, rightly or wrongly, spreading because it was getting engagement.

The engagement doesn't necessarily mean something is correct, which we've all learned about with this whole conversation around fake news. So I think that's why I started these tips. I originally started trying to challenge these people, and then I found that, essentially, as soon as you cut one head off, there were two more to take its place. I was like, "This isn't a good use of time." So my approach was, "Okay, well, I'm just going to start putting out what I think is good advice every day and see where that goes." So what do you think are the challenges that companies face when they're trying to get SEO knowledge from a platform like LinkedIn?

DC: So I think that this actually applies further than LinkedIn. If we look at LinkedIn as just one social platform where people interact and can share information, I think the problem for the industry as a whole is that the lack of an educational standard means everyone learning SEO can have their own view or opinion on something. So someone might decide that in their SEO career they're going to take a certain path in how they learn; maybe they'll read blogs, they'll watch videos. Because there isn't an industry standard or regulation on any of the information, people can absorb what they've read and then regurgitate it, even if they don't have anything to back up what they're saying. In a lot of cases, people just want to be helpful, so they will share tips that they think are valid. You get some people that obviously want attention, so they will try and share something that might be a little bit more, should I say, controversial.

But the biggest problem is that if someone says something and they're arbitrary in how they say it, and if they've got a following and people willing to learn from them, that's when it becomes a problem, because people absorb bad information, or information that isn't justified, and that gets carried and spread. It just makes it more difficult for people genuinely looking for good information to actually get it.

I think this problem has been compounding, especially since COVID, as it has fuelled such a demand for SEO. Now more than ever, with people retraining and wanting to evolve their careers, SEO is one of those places people go because it's a growth industry. So now people are really keen to learn, really keen to take up and absorb as much information as they can. LinkedIn is just one of those places where I've seen it, and I think you've seen it on many occasions, where we see people making posts that aren't justified in any way, but the way that they frame them can actually be damaging to people.

MC: Yeah, I'd agree with that. One thing I like about SEO is that a lot of the concepts we talk about are difficult to prove as facts, and I think that leads into a very long discussion, which is probably best saved for another show, around regulation and industry standards and who would decide what is correct there.

I think you'll agree that conversation and challenging each other is really good and healthy within the SEO community. There should exist that debate of, "I've observed this, and I think this, and I think it might work like this." I think the danger comes when, rather than approaching it that way, like you say, people package it up, especially in cases where they want to make it more controversial; they will just make a statement about something and present it as fact. Rather than saying, "Well, actually, I think it works like this because of this," and having that discussion, they just say, "You can do this and you will rank."

DC: Yeah, yeah. A hundred percent.

MC: On industry standards and regulation, I've dipped my toe into that. While we're talking about LinkedIn, it's tried to get me to do an SEO exam before. What are your thoughts on those attempts?

DC: Well, I think it's a very grey area because, in terms of having an industry standard, it would be very hard to create the standard and then for it to be policed, because, as we know, it's very easy for anyone to label themselves an SEO. You can go on the internet and get a certificate to say that you've gone through a Google Ads certification. So there is the question of actually having something that separates people. Anyone training or studying law in the U.K. will have to study to the standards of the Law Society or the Bar. Because there isn't really any of that in the SEO industry, I think people have to make a real decision about whether or not an SEO, an agency or a consultant is good based upon them being able to prove that concept.

In terms of exams and things like that, I think it would help if there was a central point, especially from Google's side, even, because Google advocates SEO, as we know from some of the posts that they've put out. For people to actually be able to prove that they've gone through a certification, one thing that could be done is for Google or another body to have a directory of people that are known and recorded to have gone through and done all the practising, and even to come back at intervals for a refresher, because, as you and I know, over time certain things might change. The SEO industry is not just about training to a standard; it's also about being able to keep up to speed. So do enough SEOs maintain their own knowledge, or do they get so caught up in just looking after projects that they don't keep an eye on what's changing?

MC: Yeah, absolutely. So with this situation as it is, without that regulation or certification that's universally agreed on, what advice would you give business owners who need SEO about how they should go about educating themselves and picking a partner? Because, like you, I know you get leads as I do through LinkedIn, right? And that's just based on people seeing I've written stuff about SEO and coming to the conclusion, "Hey, well, Mark sounds like he knows what he's talking about, or Dan sounds like he knows what he's talking about. So I'll message him and ask about SEO."

I imagine the people who, like we say, in our opinion maybe aren't giving what we think is necessarily good SEO advice, I'm sure they're getting those leads as well. These business owners won't have time to run their own tests and they're probably not that interested in SEO. So what advice would you give them on how they can dodge these bullets and maybe find people that are going to be helpful for them?

DC: So I think, given that most businesses need SEO, it's not even isolated; it's now spread across pretty much every industry. The fact is that because SEO is such a fundamental part of any business's growth, generation of leads, generation of product sales, because it's so intrinsic to a business, there has to be internal protocol. Even if the people making the decisions are very busy, there has to be a lot more due diligence in picking someone. Now, if someone doesn't understand SEO, it's a dangerous place to let them make the decision about who should execute it, because, as we all know, it's very easy for an agency or a consultant or someone practising SEO to create a proposal, put in complex words that the client doesn't understand, put all these snapshots in, and people make their decision because they think, "Well, this looks really complex. These people must know what they're doing," not realising that actually they might very well not be qualified to do it and the information might not be correct.

So the decision-making process for any business in selecting an SEO is extremely difficult because it relies on that decision-maker having some fundamental understanding. Some of the things that I would always ask, if I was in a position where I needed a business to come in and deliver SEO, would be to ask whoever was going to do the SEO for some case studies. I would always go and speak to some of the clients that the consultant or the agency was doing work for, I would want that to be verifiable, and I would probably not settle for anything less than three to five clients. They need to understand what work was delivered and what the budget scope was, to make sure that there's not an unrealistic gap.

Then, just look at other reputational factors. Generally, if someone is really passionate about SEO, they'll have some form of standing. I'm not saying that that's mandatory, but generally, if people really love SEO, they'll be posting on social media, they'll be making their own concerted effort on their site. If you find an SEO agency that makes no real effort on its own SEO and doesn't have good reviews or good standing, those can be warning factors. I mean, it's not definitive, but it's a general consensus. So the first port of call would always be speaking to people and getting references, like you would with employment: checking and vetting these people.

MC: I think that's a really interesting point because, in terms of clients I know that have come to us, I can see from our analytics that lots of people will look at the case studies we provide. I've always been a little bit on the fence about case studies because everyone obviously puts their best results forward and it gives the agency an opportunity to cherry-pick what they're looking at, which happens to various extents and everyone, I think, is aware of that. It's a bit of a tangled web. But probably only one in 20 times, I would say, do I get a client actively asking to speak to our existing clients. Now, I'm sure some maybe try and sniff them out and approach them themselves, but it's actually quite rare that I've had clients ask, and it has actually happened.

I think I spoke about it a couple of weeks ago. We had a client say, "Well, we'd like a conversation with three of your current clients," and that happened. The clients phoned me and said, "They contacted me," which was great. I do think that's excellent advice. So if you are looking for an SEO agency or freelancer, I think definitely speaking to a range of their clients is helpful. As Daniel said there as well, more than one or two, because everyone's got one or two friends that they've helped that will always say nice things. So if you can pick or suggest who you speak to, that can sometimes be a good approach.

DC: Yeah. It's also important to remember as well that if you're going through the due diligence phase, you need to actually look at the scope of the business that you're given as a reference. If you're being given a mobile number and a Gmail address, that's not necessarily something that I would be overly comfortable with. If I was given a business, I'd want to see the domain and an email on the domain, just the little things. It's very, very important because, again, the SEO landscape now is so horrifically competitive. I mean, the agency landscape was booming in the late 2000s and, again, since COVID it's absolutely exploded. There are agencies popping up every five minutes now, and everyone does the usual showcasing.

So it's very, very difficult for business owners to differentiate. If an agency looks like it's got a nice website, flashy things on the site, that can actually distract from the question of whether the agency genuinely has that true internal expertise. Are there people in that business that are seasoned, professional, know what they're doing, know how to deal with an account? Unfortunately, I find that in this industry good SEOs are very hard to find. So if that's the model and there are 10,000 agencies out there, then you can bet that a good portion of them are probably not going to have the level of expertise that a lot of businesses genuinely do need.

MC: The other piece of advice before we move on, I think, for business owners is that normally, when we're outside of COVID, there will be a local SEO meetup you can attend. We set one up here in Norwich and the goal of that was essentially to allow businesses to educate themselves about SEO in a non-sales environment. So we had speakers come from all over the country, some really great SEOs, and they'd do a short talk explaining either case studies or certain tactics they were using, or strategic-level talks, and it at least arms the businesses, as you say, with some base level of knowledge so that they know what questions to ask and what answers they should be expecting.

Previously on this podcast, we had Natalie Mott on, almost 50 episodes ago; it was episode 52. We covered popular, or common I should say, SEO myths and misconceptions, and we talked about the things that keep cropping up, like meta keywords, or "will I rank higher in Google if I pay for ads?", that kind of thing. Are there any particular damaging myths that you see on LinkedIn a lot, that you see repeatedly shared and repeatedly applauded, that make you think, "Oh, no. This isn't the right way to be going"?

DC: Well, I think my answers to this could go so far that the podcast would go far beyond any recording scope. In terms of SEO myths out there, I wouldn't even know where to begin, given the number of myths that I see. When I talk about myths, I'm not just talking about things that are factually incorrect; I think the problem as well is perception. So what I'll try and do is break it down really simply, right? I could conduct a study on a keyword for a website and I could test factors. Okay? Someone repeating that exact same test on a different keyword could get a completely different outcome, like completely. So, talking about that, I can provide examples of sites that I've ranked with no link building, zero link building. On that alone, I could turn around with a very narrow view and say, "Do you know what? Links don't matter. It's all a myth that link building is king."

But then I could do it the other way round in another industry that is far more reliant on link building: do all of the on-page SEO, the Core Web Vitals, schema, all of the other things, and get nothing without links. So if there was a myth in its totality, it would be that there is an algorithm per se. When I say that, what I mean is a single algorithm. When people talk about it, they talk about the Google algorithm like it's this one set of rules that is applied to every site. Now I would say that that is a myth, because we know there are hundreds of interconnecting ranking factors applied in all different proportions, in a way that we could never intrinsically piece together. All we can ever hope to do is test and find what works for our objectives. Because I always see, when people talk about myths, they'll make statements like, "Keywords don't matter," or, "You should put your keywords in your alt text."

When we talk about a myth, it goes across the usual things: you don't need to have your keyword in your title tag anymore because Google pattern matches, or links don't matter anymore, or if you've got a slow site, you're immediately going to lose ground with the Core Web Vitals update. There are so many different things that could constitute a myth. I mean, one that was thrown around for many, many years was that where you hosted your website had an impact on your rankings, and logistically that makes no sense. If you're in the U.K. and you have a .co.uk domain and you had American hosting, why on earth are you going to be penalised for hosting outside the U.K.? If your hosting is fast, that's a lot more important than where it's hosted. But for a long time I'd see it, when clients came to me with an SEO report from another agency or consultant, and on there it would say, "Your IP comes back as an American host. You need to change it. You need an IP geographically relevant to your target location." That to me is a myth.

MC: I think it's really interesting what you've mentioned there, and just to expand on the point to make it clear, from the way I explain it as well: I agree with you about these ranking factors. If we think of them, say, as levers, then if I pull one certain lever, what's actually happening is it's moving three or four other levers as well. So it's not just, like you say, "Oh, we need to focus on this one factor." The starting positions, if you like, of those levers are configured differently depending on the type of site you're working on. The most obvious example to me is, for instance, if we look at news websites. It's incredibly obvious as a user that news should be new, and you will see when you work on news websites that Google ranks stuff almost chronologically, because it knows that's important. Whereas if you have a website, for instance, that was archiving scientific papers, it does not make sense that how new or how fresh the content is plays a big part in how well it ranks, because that's not relevant to the user.

That's just one obvious example, because that's something I used to see very regularly: "the content needs to be fresh." Obviously, if you take that out of context, that's when you get people just pumping out blog posts with no real purpose, just because they think they need new stuff on their site. That's different from a strategy that maybe has value, where you say, "Okay, well, if we go back and update something we've done so it is better, that's going to help." That makes a lot of sense to me. So it was really interesting that you said that, and you've touched on some specific things there as well. The "no links" thing I think is always really interesting, because that's a common one I see on LinkedIn, which is like, "I just did this with content and you don't need to worry about links."

But then if you actually go and do an analysis on those sites, normally you'll find they've got 10, 20, 50, 100,000 links already. So again, it's that part of the puzzle. If you have no links and, as you said, you repeated exactly the same thing, you're not going to get the same result. So yeah, those conclusions, I think, are really interesting, and I think this is just a "to a hammer, everything looks like a nail" kind of thing. If you're selling links, then links are the most important thing. If you're selling content, you're going to tell people that's the most important thing.

DC: That's it. That's exactly it. I think the thing that really changed the way that I perceived SEO was the fact that I would never make an assumption that one factor was more important than another, because honestly, the rabbit hole gets so, so deep when you then start talking about some of the experiments that I've witnessed, where you'll get someone to deploy two sites, exactly the same thing, very similar content, and one site will rank well, one site just won't. So when you factor in things like domain-level factors: you could take a site, right, that's on a new domain that doesn't have any history, and you could push that with just content, not worry about links, do really good things with really well-written content, apply all the common things to content like passage indexing, proper internal linking, using structured data. If you've got another site that replicated that, but that site has got history and perhaps 10 years ago it was a repurposed domain, or maybe it spent 10 years accruing links that are fundamentally different to the theme of what the site now is, all of these things interconnect. What people tend to do is carry forward an opinion of whether something is important or not without taking into consideration so many other factors, and that's where a lot of SEO myths come from; you can break them down and shoot them down. Someone says to me, and you've seen it on LinkedIn, obviously I'm not going to mention any names, but in the last week I have seen a couple of posts saying clearly, "You do not need links. Here are examples of sites we've ranked with no links," right?

But that's dangerous, because they haven't qualified the niche, or they haven't said, "Well, you know what? This niche actually is research-driven. Therefore, the need for links is going to be less than perhaps in an e-commerce niche." So these things are not just myths in some instances, but just factually incorrect. This is why, if people are scrolling through LinkedIn and they see a post, and maybe this is an SEO that's in an agency or even an in-house SEO in a business, right? They're going through LinkedIn, they see this post. Okay? James Bond has said, "Oh, links aren't important. Here's how we ranked with no links." Okay? That person might think, "Well, for the last six months we've been building links and we haven't got anywhere. Maybe this guy's right." But they ignore all the other factors. From a myth perspective, that's the other reason why so many myths are allowed to brew and propagate in the SEO industry: people can make decisions without having a full dataset.

MC: So I think this perfectly segues into my question now around testing. Okay? So Jono Alderson from Yoast has been quite vocal in his view that you can't do SEO tests. As far as I understand it, and Jono, if you're listening and I've misphrased you or this isn't quite what you believe, feel free to correct me, what Jono is saying is that because, as SEOs, we are seeing such a tiny, insignificant sliver of the web, essentially whatever test we do is going to be without any way to draw a conclusion from it, because it's just such a speck of sand on the entire beach.

However, I know as SEOs we love testing things and we love poking Google and seeing where we can find holes. I'm a believer in, I think, both of those points of view at once, which is that maybe it's important to test and you can learn things from that, but at the same time, you maybe can't draw conclusions from those tests. What's your thought? How do you balance that act of, "Okay, well, I've got a hypothesis, so I'm going to run a test to see if I can support it or not"? Can individuals do that, or is that now the domain of people, maybe like SISTRIX, who have got the data to do that analysis?

DC: So this is a very contentious topic and it's also a very interesting one. Let me start with this: Google's own engineers are broken up. Google's engineering department is fragmented, and each specific area of the algorithm is dealt with by a different team. So Google themselves, internally, between teams, do not know how factors interact. Obviously, we've got machine learning, we've got AI and we've got all these other models that just complicate things even more. So if Google has all these interconnecting factors applied differently to different sites and different keywords, it makes sense that trying to run a controlled test, theoretically, should be nearly impossible. Right?

Then factor in that you don't have enough time to collect enough data about each micro factor that could potentially be weighted, because by the time you start collecting the data, Google has either done a minor index refresh or it's done a core or broad core update. At that point your data gets completely muddied, because Google will change something and all of a sudden the data you've collected is probably not good anymore.

However, if we were to go to the other end of the spectrum and say, "Well, you know what? There are too many variables, we can't test," I think that would be downright irresponsible. I say that because you can run tests, and if you do them correctly and you have a pragmatic approach, you can get data that can influence decisions. So you could, for example, test between one core update and another core update.

You could identify the cause of, say, cannibalisation, test a change and monitor whether that resolves the situation. Now, a controlled test generally means this: if I've got a website with a hundred pages on it, and I've got a page that I want to rank for a specific query, rather than doing what conventional SEOs would do and saying, "Well, you know what? Let's change the title tag. Let's change the meta tag. Let's change the header tag. Let's jam in 5,000 words of new content. Let's build links," and having all these things going on so it's impossible to know what is and isn't working, you take your hundred pages, pick out 10 or 20 pages, apply one thing to any one page, and use a benchmark tool like SEOTesting. Okay? You would try to collect evidence on something that might work.

Fair enough, you're not always going to get it right, and you'll probably start a couple of experiments and the next week Google will do a core update and you'll either win or lose. But applying a testing model is something that we should do anyway. As good SEOs, we need to test. Otherwise, what else have we got? Do we read the guide that offers us vague information about the value of a header tag or the value of a link? We can't go by that. SEOs have to test. If they don't test, then they're just going to apply their own model to something, and fair enough, some might get it right, some might get lucky and some might get it wrong. But personally, some of the best results I've seen, from all of the successful campaigns that I've worked on, have always come with good testing regimes.
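
To make that single-variable idea a bit more concrete, here is a minimal sketch in Python of how a test plan like the one Daniel describes might be set out: a control group that gets no change, and several small groups of pages that each get exactly one change. The page URLs, change names and group sizes here are purely illustrative assumptions, not anything from the episode, and the actual measurement would still come from Search Console data or a benchmarking tool like the one Daniel mentions.

```python
import random
from dataclasses import dataclass, field

# Hypothetical single-variable changes to test; these names are made up for illustration.
CHANGES = ["rewrite-title-tag", "add-structured-data", "expand-body-copy"]


@dataclass
class TestGroup:
    change: str                      # the single variable applied to this group ("control" = no change)
    pages: list = field(default_factory=list)


def build_test_plan(pages, changes, pages_per_group=5, seed=42):
    """Randomly assign a handful of pages to a control group and to one group per change."""
    rng = random.Random(seed)        # fixed seed so the assignment is reproducible
    pool = list(pages)
    rng.shuffle(pool)
    groups = []
    for change in ["control"] + list(changes):
        groups.append(TestGroup(change=change, pages=pool[:pages_per_group]))
        pool = pool[pages_per_group:]  # remaining pages are left untouched
    return groups


if __name__ == "__main__":
    # A made-up hundred-page site, as in the example above.
    site_pages = [f"https://example.com/page-{i}" for i in range(1, 101)]
    for group in build_test_plan(site_pages, CHANGES):
        print(f"{group.change}: {len(group.pages)} pages, e.g. {group.pages[0]}")
```

The point of the sketch is simply that each group differs from the control by one change only, so any movement you then observe in clicks or impressions for that group is at least attributable to a single variable rather than a bundle of simultaneous edits.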

MC: I think that's a really nice note to end on. I almost wish you had ended on "some get it right, some get lucky, some get it wrong."

Daniel, we've already exceeded half an hour and it feels like we've barely scratched the surface. I really thank you for your thoughts and opinions on this. You can find Daniel if you're on LinkedIn. Just search for Daniel Foley Carter.

I will put a link to Daniel and to Assertive in show notes, which you can find at search.withcandour.co.uk. That's all for this episode, I will be back in one week's time, which will be Monday the 22nd of March. As usual, if you are enjoying the podcast, please share it with a friend, subscribe, do all those lovely things and I hope you have a brilliant week.
