
Episode 107: Live LinkedIn SEO Q&A with Alex Holliman

What's in this episode?

In this episode, you will hear Mark Williams-Cook joined by Alex Holliman from Climbing Trees for a live LinkedIn SEO Q&A session, taking questions on:

  • Getting a career in SEO

  • Duplicate content issues

  • Internationalisation

  • Buying links

  • No-click SERPs

  • Faceted navigation

  • Page speed on popular platforms such as Squarespace and Shopify

Show notes

Alex's Twitter

https://twitter.com/alexholliman10

Transcription

MC: Welcome to Episode 107 of the Search with Candour podcast, recorded on Tuesday, the 6th of April 2021. My name is Mark Williams-Cook, and I'm on holiday, which is why this was recorded on the 6th of April. What I will be bringing you today is a LinkedIn Live session I did the week before last with Alex Holliman from Climbing Trees, a very accomplished agency leader and SEO. We took to LinkedIn for just over half an hour, I think it's about 40 minutes, and answered your questions on SEO live. We got a really great spectrum of SEO questions. We had a couple asked ahead of time, but mainly we're answering these off the bat. I was happy about how it went, so I've taken that video, got the audio for you here, and wanted to share it because I think there's a lot of value in there, especially from Alex's answers as well, to give you a different viewpoint that you might not have heard before.

It's likely going to be something that we do on a regular basis, so if you want to join in on a live Q&A SEO session, find me on LinkedIn. Just search for Mark Williams-Cook, and you will find me. You will get alerts, then, when we are going to be live. I imagine we're going to start doing it on the last Friday of every month if you want to join in.

Before we kick off, I want to tell you this episode is kindly sponsored by Sitebulb. Sitebulb, if you don't know, is a desktop-based all-in-one SEO tool for Windows and Mac. I've used it for years, and it's an absolutely brilliant bit of software.

We were talking last week about QA and SEO and checking things regularly to make sure they're still working, rather than waiting for a big problem to occur and then trying to get to the bottom of what's caused it. This is actually possible with Sitebulb: unlike a lot of desktop-based tools, it lets you set up scheduled crawls.

So, this means if you've got a little dedicated machine somewhere, for instance, or if your machine's on at the end of the day, you can schedule regular crawls, whether daily or weekly. And in the project view in Sitebulb, where a project is normally associated with a single website, you can then see the subsequent crawls. For each audit that completes, it will show you the difference in issues, whether they've gone up or down. So, it's a really nice way to keep track of issues as they appear rather than waiting for them to crop up and cause you problems.

Because it's desktop-based software, you're not limited by a number of URLs or anything like that, whereas cloud systems can sometimes get quite expensive. So, that's one ability of many you've got with Sitebulb.

If you haven't tried it yet, if you go to Sitebulb.com, you can get a free trial. But if you go to Sitebulb.com/swc, for Search with Candour, SWC, you'll get an extended 60-day trial, no credit card or anything required. So, give it a go. Apart from that, I hope you enjoy this live Q&A.

We'll start off with some of the questions that were submitted before. So, I did say if you submitted questions before the live stream, we're more likely to answer them, basically because if we don't know the answer, we can just Google it, look it up, and then tell you and pretend we knew all along. But we'll start with some easy ones. So, one of the first questions we actually had was, "What are the first steps going into SEO, any recommendations?" And a similar question, which I've grouped with it, "Any recommendations on starting a successful career in SEO?" So, yeah, Alex, what are your thoughts on this?

AH: Well, I think it's important to try and stand out when you're starting a career. And so there's a number of Skillshop exams that you can look at for Google Analytics and Google Ads. And foundationally, that will give you some sort of introduction to the world. And I think as a candidate, that will allow you to sort of stand out.

And I think that, beyond that, it doesn't take that much to register a URL, set up a WordPress website, and then start demonstrating or learning the kind of stuff that you want on your own site, so that could be on-site optimisation. It could be making sure the site's technically okay, optimising for speed, and then obviously getting it ranking for a few terms and driving in some links. As a sort of case study, as a start, to go to an employer and say, "I'm looking to get into this industry," I think that's foundationally quite a good place to begin.

MC: So, yeah, the other advice I would give, if you are looking to start a career in SEO: I'll cheat and give you a resource done by a very nice chap called Carl Hendy. He's been in the SEO sphere for a very long time now.

He's actually written an article which is entitled something like "How to Get a Job at an SEO Agency With No Experience in SEO." And he gives you a really good breakdown on the things you can do from home with your own site, you can learn for free, and the kind of things you are likely to be asked in an interview, and what some acceptable and actually non-acceptable answers are. So, it gives you some quite funny examples of the things he's heard in interviews.

So, let's jump on to the next question. I'll just check the comments. So, brilliant. Thank you for dropping the questions in the comments. We'll come back to them. And the next question we've got is, "On an e-commerce site, a few products are in multiple categories with different URLs. Should I keep them in all?"

So, I think I see this happen a lot with Shopify sites especially. Shopify organises products into what are called collections, which other platforms just call categories. And what happens in Shopify, then, is, normally, if you have, say, let's take an example, say you're selling a wooden chair, and you had kitchen furniture as a category and, I don't know, outdoor furniture as a category, and you put this chair in both, what will happen is you'll get two URLs. One will be, like, /kitchen-furniture/wooden-chair, and the other will be /outdoor-furniture/wooden-chair.

So, you've got the same product on two, or three, or more different URLs. By default, what Shopify will do is use a canonical tag to try and fix this, which kind of helps. So, with a canonical tag, you pick what the primary category is, and it will try and say to search engines, "Just rank this URL." It's not ideal because canonical tags are hints, not directives, which means search engines can ignore them.
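To make that concrete, a canonical tag is just a line in the page's <head> pointing at whichever URL you want treated as the primary one. A minimal sketch, with a made-up domain and product:

  <link rel="canonical" href="https://example.com/products/wooden-chair" />

On a default Shopify setup, the collection-scoped URLs (something like /collections/kitchen-furniture/products/wooden-chair) typically carry a canonical pointing back at the standalone /products/wooden-chair URL.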

So, if you get lots of links to the wrong category, that one will actually start ranking. The best thing you can do with a platform like Shopify, and what things like Magento will actually do off the bat when you first install them, is have product URLs that are agnostic of the category they're in. That means rather than something like yoursite.com/livingroomfurniture/chair, it will just have / and then the product name.

So, I think that's quite a good approach to take because then you only have one URL for each product. You don't have to worry about duplicate content, or canonicalisation, or anything like that. I think, personally, that outweighs whatever microscopic benefit you would get from having keywords in the URL. For me, that's not really a thing. I don't know what your thoughts are on that, Alex. So, is there anything you want to add?

AH: I think, yeah, Shopify takes care of that out of the box. And sometimes there's a valid reason for the same product being in more than one category. And so, you can usually sort of work it all out from there.

MC: Perfect. So, here's the hot topic at the moment, which is, what's the next step from the no-click punch that Google has landed? And then I put these two questions together, both from Harry: "Should rich snippets and such formats be leaned into or avoided to keep unpaid traffic up?" I'll let you dive into that one first.

AH: A hospital pass. So, I almost think it's fair enough if Google wants to disintermediate website owners. I think they can pretty much do what they want, and they do. So, I think you can question the motives of why Google would choose to do that. And I think if you were getting traffic prior to there being a no-click result and then suddenly a no-click result gets shown, you can appreciate why you'd be gutted with that.

I think if you look at Google's motives and want to question why they do it, in some cases, they probably feel that they can do a better job for the user in terms of some of the no-click results they'll show. In other cases, it's quite evident they will just get up to some... I don't know how to say it. But you could say duplicitous, like, some quite dodgy sort of things, where they're basically just taking other people's content and then serving that in the box.

And so, as a site, I mean, there's not much you can do about that. I think that, as was shown with the CSS thing with Google Ads, there can be action taken against Google to sort of limit their power, but that's usually quite slow before anything ends up particularly changing for you as a website owner. So, the whole thing with rich snippets, I think that is absolutely a potential solution. I think then you can look at potentially longer-tail searches, things like the People Also Ask questions, or focus on the snippets. And the whole argument then moves on to you're not going to be getting as many clicks, but you can then start looking at the volume of impressions your content is being served on within the search.

MC: Yeah. I mean, I've been following this, and it has just broken down into a full argument in the SEO community around the SimilarWeb data that was published by SparkToro, saying that website owners are losing clicks. And Google obviously replied with their blog post, saying, "Well, actually, we're sending more clicks than ever," as a raw number, of course; they don't give us the percentages.

I mean, my view on this is that this change is user-led, in that users are finding it more useful to have no-click results. So, I don't think it's going away. And I think if you fight it, you're trying to swim upstream. And, actually, from an SEO point of view, I think we shouldn't get painted into this corner of necessarily being, like, just a performance marketing channel.

So, there are bags of research showing that there are great brand benefits to being visible at the top of Google when people are searching for stuff. That's all going into your brand equity pot, if you like. And this is going to become more apparent when typing stuff into Google and looking at a set of results seems archaic. And it will seem archaic at some point. It will seem like how you used to go to the library, look up a book number, and crawl around on your hands and knees trying to find it, when people in a few years are just kind of shouting at their TV for an answer and getting the answer straight back.

I think there's a definite benefit to being the brand that's providing those answers. So, yeah, I mean, my direct answer is I think you should definitely lean into rich snippets and formats because that's what's going to happen. So, your choices are to do it and have a slice of the benefits that come from that. And you might want to look at how you're measuring the impact of your SEO as well, because if you're only measuring the performance side, you're not necessarily showing the value from these types of results.
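For anyone wanting a concrete starting point on the rich result side, most of these formats are driven by structured data markup in the page. A minimal sketch of FAQ markup using schema.org vocabulary, with placeholder question and answer text, might look like this:

  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [{
      "@type": "Question",
      "name": "Do no-click results still have value for brands?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "Being the source of the answer keeps your brand visible even when the click does not happen."
      }
    }]
  }
  </script>

Which rich result types Google actually shows, and for which sites, changes over time, so treat this as an illustration of the markup pattern rather than a guarantee of a snippet.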

Cool. Should we jump in and see if we can take a live question as well? Dale Davies: "I used to own a Gibson, it's now an Epiphone because I only bought it to play with Rocksmith. I don't play as much as I used to. So, I couldn't... didn't want to spend 800 quid on something to plug into a computer again." "How to go about international..." Just having a look through these questions here. So, here it goes. Tom is asking, "What's the best SEO tool right now? Is SEMrush outdated?" SEMrush IPO'd, was it yesterday?

AH: Yesterday, yeah.

MC: Yeah. So, big news for them. I don't think there's one answer to this, Tom. Don't know if Alex will agree with me. So, we use SEMrush, but I think you are setting yourself up for a fall if you rely on any one platform. So, I spoke to someone this week, where I saw on one of these platforms, I won't say which one, that apparently their organic web traffic had gone from five million visitors a month down to, like, three, three and a half million.

So, I just dropped them a message, being like, "Oh, hey. What's going on here? Is this right?" And their response was basically, "No, that's completely wrong. Our traffic's absolutely fine." And this just comes down to how this particular platform was estimating traffic. And I've seen great examples where Ahrefs and SEMrush will give the same keyword a very high difficulty score and others will give it a very low one. So, I think the value that a lot of these tools add is around forecasting and estimation, viewing estimates of data we don't have hard numbers for, and there is a lot of room to go off-piste there.

So, even when we do backlink audits, I love Majestic, I've used it for years, but I do use Ahrefs and other link data, like Moz, now, and combine them all because no one tool is perfect. If I had to choose one SEO tool, I mean, the one I use the most in terms of regularity, nowadays it's probably something like Sitebulb, which is actually a desktop-based auditing tool. It's just fantastic in the insights it gives, especially for people learning SEO, because it gives a lot of the recommendations with explanations as well. It's an incredibly good, useful tool.

In terms of the big platforms, I would say you've got to really use more than one if you're doing it seriously, so make sure you're not... to hedge your bets.

AH: Absolutely. We use SEMrush as an agency across all our work, and that is quite foundational. Absolutely love Sitebulb. And I think it's sort of taken over the place in our affections that Screaming Frog once had. And I feel really guilty about it because I love Screaming Frog. The other thing that we're using a lot of is LinkResearchTools. So, for some of our larger link-building clients, we have a sort of enterprise agreement with them. And that probably brings in data that is similar to Ahrefs and Majestic but is maybe a little bit more forensic in terms of its analysis. That's been absolutely transformative on that side of the business over the last couple of years.

MC: Good stuff.

AH: And he wears an orange suit.

MC: He does, doesn't he? He's always easy to spot. So, from a LinkedIn user, I don't know why it comes up as a LinkedIn user, but it does. So, to our anonymous LinkedIn user, "How to go about international SEO apart from language optimisation? How is it different from national SEO?" Do you want me to kick off on that? Or do you want to have first stab, Alex?

AH: Well, I think, so, on the international projects that we've worked on, we would do the piece within Google Search Console, where we have a separate sitemap per territory, and then set each one up as a separate property within Search Console. That, then, allows you to start seeing the relative performance by territory. And you can give Google some quite useful information in terms of what you're targeting.

I think, beyond the translation piece, we've had some e-commerce clients that have gone through the internationalisation process. And when going from, say, the UK to the German market or out to the Middle East, they will then have a local customer service representative as well, so that they can fulfil live chat and that kind of thing, because I think the customer service element internationally is critical to gaining some sort of traction and momentum. There was something else I was going to say, but it's left me.

MC: So, in terms of international SEO, there's some technical stuff around hreflang tags. So, you can specify to Google and other search engines which pages are targeted at which regions and which languages. Now, this is particularly helpful when you're covering very different regions with the same language.

So, the example I'd give here is the UK versus US versus Australia: all speak English, vastly different cultures. And, obviously, you need to show one price in dollars, one price in dollarydoos, and another price in pound sterling, and you need Google to understand the difference between those pages. Without hreflang tags, Google quite commonly gets mixed up, showing maybe a US page instead of a UK one, or vice versa. So, there's that technical hreflang stuff that happens in the background that users don't see.
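As a rough illustration, hreflang is usually implemented as a set of alternate link tags in the <head> (or equivalently in the XML sitemap), one per language-region combination; the URLs here are made up:

  <link rel="alternate" hreflang="en-gb" href="https://example.com/uk/" />
  <link rel="alternate" hreflang="en-us" href="https://example.com/us/" />
  <link rel="alternate" hreflang="en-au" href="https://example.com/au/" />
  <link rel="alternate" hreflang="x-default" href="https://example.com/" />

Each regional page should carry the full set, including a reference to itself, so the annotations point at each other reciprocally.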

And in terms of what you've grouped as language optimisation, I'd say there are two elements to that: There are language translations, which obviously you want to do, and there's localisation, where, again, US/UK is a good example. The way that people in the US buy and the language they're used to being sold to is very different to the UK. You only need to, if you ever go to the US, watch TV there to realise how different it is. So, there's a lot of value in actually localising.

And the last point I'd add is, again, a technical one, which is to do with Core Web Vitals and site performance. So, in Google Search Console, you get feedback on how your site performs in terms of Core Web Vitals, which is based on the Chrome User Experience Report, the aggregated data from real users. And as Alex said, if you're splitting out your Search Console between the locations, that's important, because if you are serving a site to a specific country or region that has slow internet, the same site may score red on Core Web Vitals there, whereas in another country it's scoring green. That means there's actually international technical optimisation you can do. You can say, "Well, in this region, the average internet connection is, like, 3G, so we need a lighter version of our site to deliver in that area to maintain the experience." So, those are three other things I'd look at for international. Was there another thing you wanted to add there, Alex? I saw you, like-

AH: No, no. I was totally agreeing. I think it's good.

MC: Okay. Cool. Let's jump to one of our questions we got before we kicked off. So, here we go. Page speed for closed platforms like Squarespace and Shopify, is there a way to make them faster? You can tell us that one, Alex. Is there a way to make them faster?

AH: Yeah. So, before selecting a theme or a template, you can obviously do some work in terms of analysing which themes or templates you're choosing between. So, make sure you haven't got one with loads of bloat that takes ages to load and is really, really slow. You can select things that are going to give you a head start out of the box.

But I think if you're working on a site that's already there, you can do a lot of work in terms of image optimisation, so, like, compression, so make sure you do that, and you can try to reduce how many third-party scripts and fonts you're using. And then I think one of the platforms does a lot of stuff with Cloudflare and plays quite nicely with it, but the other one doesn't. I don't remember which way around it is; it's sort of jumbled up in my head. But I think with Shopify you can, but with Squarespace maybe you can't, in terms of using something like a CDN that will deliver the site faster.

MC: Yeah, absolutely. And a lot of this will depend on the theme, because it's generally the front-end stuff that affects page speed for end users. And there are a couple of resources online now where they rank Shopify themes by how they score on Core Web Vitals. So, that can be a really good place to start, because the platform itself can be fine, but if you use a bad theme on it, then that can ruin things for you.

So, just be careful in terms of what you choose for the theme, and look for the resources where people are actually recommending them, because there's only so much you can do. Obviously, you can't change the back end of these platforms; the front end's the only bit that's under your control.
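If you do dig into the front end of a theme yourself, two small, low-risk additions that usually help are lazy-loading images that sit below the fold and preconnecting to third-party hosts you can't remove; a sketch, with placeholder paths (the main hero image should normally not be lazy-loaded):

  <img src="/images/gallery-photo-2.jpg" loading="lazy" width="800" height="600" alt="Product gallery photo" />
  <link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>

The explicit width and height attributes also help with layout shift, which feeds into the Core Web Vitals scores mentioned above.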

I appreciate, by the way, that for some of you who've submitted these questions, English isn't your first language. So, that's fine, go ahead and ask a question even if you're worried about your English. I promise you your English is better than any foreign language I could attempt to speak. So, keep them coming.

So, we're essentially asking here, basically, is duplicate content still a thing we need to worry... Well, was it a thing? Is it a thing? Do we need to worry about it?

AH: I think I've seen a lot of clients who have been, not manipulated exactly, but sort of scared with, "Oh, you've got duplicate content on the site. You're going to get a penalty." But I just think those days are well past us; at the sort of scale of a lot of the clients I speak to, they're not going to have an issue.

I think that the duplicate content thing was Panda, wasn't it? That was the update that was set up to try and... You had these massive content farms with really low-quality content going out, like thousands, and thousands, and thousands of pages. And so, there was an algorithm update to take care of that sort of low-quality content. I think duplicate content is a sort of subset of that. But I think that you can just resolve a lot of this stuff with using rel=canonical because sometimes there is a good reason for having more than one version of a page, as you said.

So, it's something that we would focus on and look at. If we had five URLs that are all the same, we would sit down with a client and say, "Well, actually, which of those are needed? Is there a reason for five being needed?" And if there isn't, see what we could do to remove some of them, maybe redirect them or just put a rel=canonical tag in.

MC: Yeah. I think the important thing to notice is that only in extreme cases is there a duplicate content penalty per se. So, if you're just mass scraping and republishing sites, then, sure, Google's probably going to take action against you.

In terms of, oh, some of our pages are identical or even, we've got copied pages from other sites for whatever reason, you're not going to get a penalty for that. All that happens is Google will filter out and choose one result. And I think I spoke about this the other day specifically with images because the same applies for images, right? If you do a search in Google Images, of course, Google doesn't show you the same image over and over again, right? It's just going to show you unique images.

So, the point I made was if you're using stock imagery that hundreds of other websites are using, it's unlikely you're going to win the image lottery and be seen in Google Images. And the same is true, in terms of duplicate content, it's not a huge deal. I just wouldn't expect it to rank well if you've got duplicate content unless you're, obviously, the original author and someone else has copied you. But that's a different kettle of fish.

I think related to this is faceted navigation. What in the heck? How best to deal with complete chaos? What a brilliant question: How to deal with chaos? So, faceted navigation, for those that don't know, is when, on an e-commerce site, for instance, you're trying to buy shoes and you can filter and change what you're seeing by maybe size or price, and you can change the order of products. So, there are two bits of guidance I'd give here. One is that these filters and facets exist because they're helpful to users. They can share what they're seeing on their screen with someone else; they can say, "Look at these trainers," and you can see what they're seeing on the screen. So, those URLs need to exist.

It doesn't necessarily mean that a web crawler like Googlebot needs to be able to access those URLs as well. So, in this case, I would recommend two things. If there are really specific filters and facets that you just don't think there's any search volume for, or that are ridiculous, let's say red men's trainers under 39 pounds, you might want to think, "Well, people search for red trainers, maybe, so I'll have that as a crawlable category. People search for men's trainers, so I'll have that as a crawlable category."

And you might decide that "under 39 pounds" isn't one. So, the way you can build the site is to not have those links actually discoverable by search engines. You still use canonical tags on them in case someone decides to link to them, so the equity goes to the right place. If you can't change the actual site, then that's when you start implementing robots.txt and noindex directives as well to try and carve things off, because what you don't want to happen, in the worst-case scenario, is robots crawling tens of millions of variations of a page, so that rather than seeing all your core products, they just end up down this big rabbit hole.
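As a sketch of that fallback, robots.txt rules can stop crawlers from wasting time on low-value filter combinations; the parameter names here are invented, and yours will depend on how your platform builds its facet URLs:

  User-agent: *
  Disallow: /*?*price=
  Disallow: /*?*sort=

One caveat worth knowing: a URL blocked in robots.txt can't be crawled at all, so a noindex tag on that page won't be seen, which is why you'd normally pick one mechanism or the other for any given URL pattern.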

So, that's a complex question. There's some best practice you can go with if you're lucky enough to be building the site, which is why we try and say get SEO people involved from the beginning; otherwise, it's a robots.txt and noindex answer. Don't know if you want to add anything to it, Alex. I think I hammered that one. Sorry.

AH: No. I think it's great. What in the heck? I think you un-hecked it.

MC: Have we got any questions you've seen in comments we want to tackle?

AH: There was, "Kindly explain BERT." And I thought that you-

MC: "Kindly explain BERT."

AH: I thought that was a really good one for you, Mark. But, no, BERT is the way Google tries to understand a search query based on its context, so it can go forward and then backward. So, if you've got a search term, let's stay heavy on the men's shoes, let's go for, like, good value men's shoes in London. The core term could be men's shoes, but then the context is what comes before and after it. And BERT is the way that it sort of tries to understand that query, isn't it?

MC: Yeah. Pretty much. I mean, yeah, so BERT stands for Bidirectional Encoder Representations from Transformers. I mean, what more do you need to know from that, really?

AH: Okay.

MC: Yeah, no, you're absolutely right. So, it's more to do with Google actually understanding the intent of the search query. They gave some examples about complex searches, like someone from Brazil searching for a US visa. And that's a very specific, nuanced search where Google needs to know that, although they're searching for a US visa, which is the main entity here, those very important linking words, like "from", matter. It needs to work out, okay, they're from Brazil, so I need to make sure the results are about someone foreign coming into the country. And they gave some really nice before and after results, because that understanding of intent allows Google to show a more relevant set of search results. So, I think they'll become less reliant again on links and main keywords.

And I would add this: There's no real way for you to optimise for BERT really in my opinion. If you're writing for humans, you're writing specifically. You're doing your keyword research anyway. The idea is that this is the technology at Google. And actually, Bing has been using it for even longer than Google, but it should be a way that they can just understand what you're writing about.

Here we go: "How do we test a PWA, whether it's renderable by Googlebot or not? Any specific tool to do so?" I mean, my shout on that would be just always to put it through the web rendering in Google Search Console. As far as Google is concerned, they've said that rendering engines, that's using their kind of Chrome instances, is as close as you're going to get.

I would point out, and it's a really technical detail, but Googlebot itself is the crawler that's essentially going around the web looking at raw HTML and gathering links. Googlebot's not rendering JavaScript or CSS. Googlebot sends stuff back to Caffeine, which is Google's rendering service, which then actually executes the JavaScript and CSS and tries to understand what's on the page. But I think that's the best answer I can give for that one.

Let's have a look at our pre-questions as well. I'm aware we're about half an hour now. Are you okay for time, Alex? Do you want to take a few more?

AH: Yeah, for sure. Sure. Someone just dropped something off, I heard.

MC: Oh, brilliant. Cool. So, "I rarely, or never, see a target word used in H5 and H6. Are there any proven tests with results?"

AH: Oh.

MC: I'll take this one. So, H5 and H6, header tags: generally, when we make a webpage, we have this bigger text to say what the page is about, and then different subsections.

I think it was the... I can't remember if it was the... I think it was a Searchmetrics study that actually showed there was a negative correlation between ranking and using header tags. Now, obviously the correlation doesn't imply causation. We know that. But in my book, header tags, anyway, aren't, like, a big ranking factor. It's important to note that Google, as I said a minute ago, is rendering the CSS. So, even if it's just normal text, if it's bigger on the page, Google will know that, and it will class that text as more important. So, I'd say the nitpicking of is it an H1, an H2, an H3, or is it just text is less important than it's ever been.

I think the impact of H1s, H2s, and H3s is fairly limited anyway. It definitely helps, and optimisation is the sum of small things. But I think by the time you're down to H5 and H6, I've kind of lost interest. But I'd like to hear what you think.

AH: Absolutely. We sort of tend to do H1, H2. But very often, for the sort of clients that we are working with, it's more important to have a nicely branded message on the page as an H1 or an H2 than having something that's just blatantly keyword-stuffed and gives a really negative user experience. So, as always, it's sort of a balance in terms of how far you go with your optimisation.

MC: So, I'm just looking through. Here we go: "Are paid guest posts good for SEO?"

What about you, Alex?

AH: Well, if you own... No, I won't say that. Well, it depends. I think that, historically, there's been an ever-creeping sort of evolution to SEO. So, 15, 20 years ago, there was stuff that was great, really, really worked, and was easy; then, as an industry, everyone moves towards it and does a whole load of it. And then slowly, through time, the quality sort of comes down. And then we all have to raise our standards, either because of something that, say, Google is doing and saying, or because there are people in the industry doing really tremendous work that we all then aspire to.

So, I think that in terms of guest posting and that kind of stuff, it's not really a sort of go-to thing that we do for a lot of our clients. And the value of it is more for the audience base that you'll get. But it's not a massive thing, I don't think. But undoubtedly, I think it would work.

MC: Yeah. I mean, so, we don't do any paid linking now, but I bought loads of links throughout my SEO career. And, again, as Alex alluded to, a long time ago it was pretty much the only way you were going to rank for competitive stuff. Now, I think if I was doing an affiliate site, for instance, and maybe not building a brand per se, I would be much more inclined to look at buying links and stuff because, at the end of the day, they're Google's rules; they're not law, and you can choose to do whatever you like.

For building a brand, I feel that Google's done a fairly good job of working out which links are working and which links are not. And I just think that even if you don't get a penalty, those paid links lose value and you won't know when; it's just lost equity that you could've put into something else. So, the money you're spending on buying links has a shelf life before those links get found, because they will eventually get found and devalued, and you could've put it into something else that's going to have value forever, like content.

The only exceptions are possibly if you're in very specific niches and you're very limited on money to start up. You could maybe use it as a kick-off strategy. But, again, it's not something I'd recommend because there are risks to it. And I'm always straight up with people: as long as you understand the risks of what you're doing, it's your decision at the end of the day. And I'm not saying paid links don't work, because very well operated PBNs, I've seen, personally, work very well. But then the money you're talking about for those is, like, "Talk to us if you've got eight, ten grand a month to spend." And for that, you can be doing decent stuff for your users.

AH: Yeah, and you don't want to be the agency that gets a client banned from the index…

MC: Yeah, for sure.

AH: …or that kind of thing. It's just sort of like reputationally you'll be challenged.

MC: "How do you make blog schema markup dynamic across an entire website, i.e. the title and meta descriptions change dynamically based on the blog post's URL." So, I've seen a couple of similar questions in here. So, I just wanted to highlight for those that haven't discovered it, Google Tag Manager. So, Google Tag Manager is a system that will allow you to dynamically insert JavaScript on your page and modify elements to it.

So, if you don't have access to the back end, because the best answer is you make the back end system do these things you want to do dynamically, if you're asking this question, I'm assuming for whatever reason, you can't do that, so my advice would be to use Google Tag Manager because you could set up rules based on the URL you're on. And you can use JavaScript to manipulate the page title, the meta description. You can pretty much manipulate anything in the DOM with that, and the same for Vincent's question, here, which is "The best way to add a noindex to a list of URLs without adding it manually to every site?"

Again, the answer would be in the back end. But if you can't do that, you can use Google Tag Manager to do this. And this will give you insight into the crawling and rendering process: you'll see the page get crawled by Google, and it won't see the noindex tag because it's added by JavaScript.

When that page gets rendered by Caffeine, Google will be like, "Oh, it's got a noindex tag on it," so they'll actually then drop that page from the index. So, you can, again, dynamically just put a list of URLs into Google Tag Manager. There are tons of tutorials online on how to use Google Tag Manager, so I won't really go into that. Should we do one more before we finish?
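For anyone who wants to try that approach, here is a rough sketch of the kind of Custom HTML tag you might add in Google Tag Manager, fired on a page view trigger; the list of paths is purely hypothetical:

  <script>
    // Hypothetical list of paths that should be dropped from the index
    var noindexPaths = ['/old-offer/', '/duplicate-landing-page/'];
    if (noindexPaths.indexOf(window.location.pathname) !== -1) {
      // Inject a robots noindex meta tag so it is present when the page is rendered
      var meta = document.createElement('meta');
      meta.name = 'robots';
      meta.content = 'noindex';
      document.head.appendChild(meta);
    }
  </script>

As noted above, this only takes effect once Google renders the page, so a server-side or CMS-level noindex is still the more reliable option where you have that access.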

AH: Yes. And, "Is SEO dead, Mark?"

MC: I wasn't sure if this was a troll question. I think this is a troll question. But go for it.

AH: I think, like all things, there's a cycle, and there's a constant process of dying and being reborn. And so it's just a transition. So, in some ways it is, but in some ways it isn't.

MC: No is my answer, or everyone wouldn't be here. So, I really enjoyed that. That was fun. I think we'll do another one of these if you're up for it.

AH: For sure. I think it's fun.

MC: For sure. So, and I promise I will get everyone off to a smoother start. And we'll start on time next time because, as I said at the beginning, apologies for starting late. Apologies for the 301 redirecting you over to a new event because this was my first rodeo on LinkedIn Live. And, obviously, being live, you can't really test it unless you're live. So, I just had to go for it. I know what I'm doing now. So, maybe we might do this as a regular thing. We've still got a whole bunch of questions that came in here…

AH: And that's what I was going to say. We could try and keep hold of these questions and then pick up on some next time.

MC: Yeah, absolutely. Thanks, everyone, for joining in. Really great to hear your questions. I hope you've found something useful from this. And we'll leave the recording of this live session up as well. You can check out our websites: we're at withcandour.com, and Alex is at climbingtrees.com. Brilliant. See you, Alex.

AH: Thanks, everyone. Cheers, Mark. Cheers, guys.

MC: I hope you enjoyed that LinkedIn Live session. We are going to be back on the podcast on Monday, the 26th of April for our final episode of the month. And as I said, if you connect with me on LinkedIn, Mark Williams-Cook, you'll get updates on when we do our next live Q&A, if you'd like to join in, submit a question, or maybe even come and answer some questions with us.

Until then, if you enjoy the podcast, please tell a friend and subscribe. I look forward to you hopefully listening again. Have a great week.
