Episode 101 - SEMrush IPO, DevTools Local Overrides, Shopify SEO and rich snippets


What's in this episode?

In this episode, you will hear Mark Williams-Cook talking about:

The SEMrush IPO: Will they be changing their subscription model?

DevTools Local Overrides: Improve the performance optimisation feedback loop

Rich snippets: Sitewide quality factors impacting rich snippets

PageSpeed Insights: Updated to HTTP/2

Shopify hreflang: A new solution for Shopify's hreflang issues

Show notes

Dan Barker tweet about SEMrush filing to go public

https://twitter.com/danbarker/status/1366538243567681541

Release notes for PageSpeed Insights API and PageSpeed Insights UI

https://developers.google.com/speed/docs/insights/release_notes#release-notes-for-pagespeed-insights-api-and-pagespeed-insights-ui

Search Engine Roundtable article

https://www.seroundtable.com/rich-results-site-wide-trust-issue-31024.html?utm_source=tldrmarketing.com&utm_medium=referral

Dave Peiris tweet

https://twitter.com/davepeiris/status/1367117676603187205

Local Overrides in DevTools

https://tryblackbird.com/blog/local-overrides

Shopify hreflang

https://apps.shopify.com/multi-store-hreflang-tags

Transcription

MC: Welcome to episode 101 of the Search with Candour Podcast, recorded on Friday, the 5th of March 2021. My name is Mark Williams-Cook, and in this episode we're going to be talking about a whole bunch of stuff, from SEMrush's IPO to PageSpeed Insights using HTTP/2. We'll be talking about Rich Snippets and quality signals, and Local Overrides in DevTools for trying to improve your site speed. Before we kick off, I want to let you know this podcast is very kindly sponsored by the lovely people at Sitebulb. If you haven't heard of Sitebulb, it's a desktop-based SEO auditing tool that you can use on Windows and Mac. I've been really surprised by the number of, sometimes quite experienced, SEOs that haven't tried it out yet. It's absolutely fantastic.

They've got a deal with Search with Candour listeners, which means if you go to sitebulb.com/SWC, you can actually get an extended 60-day trial of their software. There's no credit card or bank details or anything required for that, so you can try it with no strings attached. If you don't like it, you don't have to continue using it. But it's absolutely fantastic. I've used it for many years, so I was delighted when they wanted to sponsor this podcast because it's something I absolutely don't mind talking about. I normally cover one thing Sitebulb does that's helped me, and I haven't got stuck yet, because every week it's either helped me out or I've thought of something else it's helped me do that I can talk about. And this week is no different.

One thing Sitebulb helped me with this week was this: someone who wasn't a client asked me about getting errors in Search Console for their site not being mobile-friendly, yet when they actually loaded the site on their phone, everything looked fine. One of the things that Sitebulb really helpfully does is give you feedback if things like CSS and JavaScript aren't accessible to crawlers. And this was actually the reason this person was having the issue: Google couldn't access some of their CSS and JavaScript. So, when they went into Google Search Console and did a fetch and render of the page, it looked unstyled, which meant a lot of the links were crammed together, which is why Google thought the page wasn't mobile-friendly, or at least it certainly wasn't as far as Google was concerned.

I didn't have access to their Search Console, and I didn't even need it; I didn't have to do any of that. They said they had an issue, I kept on going with my day, just stuck the site in Sitebulb, loaded up the hints, and it was pretty obvious to me what was going on there. So, an absolutely fantastic bit of software; it does bucketloads of stuff that will help you with your SEO. Give it a go at sitebulb.com/SWC.

SEMrush, the SEO tool that probably most of you working in SEO will know, use, or at least have heard of, has filed to go public. And while there probably isn't anything particularly actionable for us to do as SEOs about this news, I think it is really interesting, because SEO tool companies don't usually make it to this stage. I actually found out about this through a series of tweets from Dan Barker, and I've linked to his tweet in the show notes, which you can get at search.withcandour.co.uk. He's posted some screenshots and highlighted some numbers which I thought were really interesting, and which I'll just go through very quickly now out of interest.

SEMrush spent $54 million on marketing last year for revenue of $125 million. From that, they made a gross profit of $95 million and an actual net loss of $7 million, which isn't too surprising given the growth they're pushing through.

They state they have 67,000 customers, and if you divide the $125 million revenue by 67,000, it works out at an average spend of around $155 per customer per month. Of course, Dan gives the caveat that this ignores growth over the year; he just divided total revenue by the number of customers, so it's probably not quite right, but it gives you a nice estimate.
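Just as a quick back-of-the-envelope check on that maths, and using only the figures quoted above, here it is as a tiny TypeScript calculation:

    // Rough sanity check of the per-customer spend estimate quoted above.
    const annualRevenueUsd = 125_000_000; // reported revenue
    const customers = 67_000;             // reported customer count
    const perCustomerPerMonth = annualRevenueUsd / customers / 12;
    console.log(perCustomerPerMonth.toFixed(0)); // prints "155"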

The timeline states they passed 50,000 customers in 2019, meaning roughly 10 to 15,000 customers were added across 2020. Sales and marketing costs increased 31% last year, with an extra $5 million in staff costs and $7.5 million in additional online advertising costs. The SEMrush online learning program had 300,000 sign-ups and 130,000 completions. And the marketing and sales team alone is 308 people out of 980 staff in total, meaning almost a third are in sales and marketing.

I thought those were some really interesting numbers to discuss. I was listening to Judith Lewis yesterday on Clubhouse; she was talking about some SEO stuff, of course, and she raised one really interesting point, which is that at the moment SEMrush is a pay-monthly tool, so you can subscribe to SEMrush and pretty much cancel your monthly service with them at any time. Her insightful point was that, if the company goes public and needs to start providing shareholders with more information and forecasts, it's likely they might move to a model similar to other tools, where you have to buy a year of service and then have exit points at certain times during the year.

I know this because I used to be a Searchmetrics customer, and this is exactly what they did. You'd sign up for a year and you'd have a window to cancel near the end of that year; if you missed that, tough luck, you were in for another year. But that gives them the solidity they need to make the kinds of forecasts they have to make. I did ask SEMrush directly whether they have any plans to change their subscription model from monthly to maybe a longer-term or yearly model, but they haven't replied to that. So, it's just something to bear in mind, and I thought it was pretty interesting news to discuss in the SEO world.

I've got good news for you, which is that your page speed score is probably going to go up around now. That's due to some news published on the Google Developers site for PageSpeed Insights; again, I've linked to it in the show notes at search.withcandour.co.uk. Essentially, the PageSpeed Insights tool is going to start using HTTP/2 to make requests, so if your server supports it, it's likely you're going to see an improvement.

This is the announcement from Google: from March the 3rd 2021, PageSpeed Insights uses HTTP/2 to make network requests if the server supports it. Previously, all requests were made with HTTP/1.1, due to constraints in connectivity infrastructure. With this improvement, you can expect more similarity between Lighthouse results from PageSpeed Insights and from Lighthouse CLI (that's Command Line Interface) and DevTools, which have always made requests with HTTP/2.

However, it's important to keep in mind that different environments, hardware, and connectivity will influence measurements, so cross-environment consistency is near impossible. With this change, network connections are often established quicker, and given your requests are served over HTTP/2, you can likely expect metrics and the performance score to improve. In general, performance scores across all PageSpeed Insights runs went up by a few points.

If your page does not support HTTP/2, the report will now show an audit that estimates the performance improvement if the page were to support HTTP/2. So, a little bit of interesting news. It's easy to miss these small announcements, which can leave you scratching your head if you see a change and you're not sure where it's come from. As I said, if you've been running the tools via the CLI or via your browser, it doesn't make much of a difference anyway, but if you are using the PageSpeed Insights tool, you will likely see an increase of a few points.
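If you're not sure whether your own server supports HTTP/2 at all, one quick way to check is to see which application protocol it negotiates over TLS (ALPN). Here's a minimal sketch using Node's built-in tls module; the hostname is just a placeholder, so swap in your own:

    import * as tls from "node:tls";

    // Ask the server which application protocol it will speak over TLS (ALPN).
    // If it negotiates "h2", PageSpeed Insights can now fetch it over HTTP/2.
    const host = "www.example.com"; // placeholder hostname

    const socket = tls.connect(
      { host, port: 443, servername: host, ALPNProtocols: ["h2", "http/1.1"] },
      () => {
        console.log(`${host} negotiated: ${socket.alpnProtocol || "no ALPN (HTTP/1.1 only)"}`);
        socket.end();
      }
    );

    socket.on("error", (err) => console.error(`Connection failed: ${err.message}`));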

I have a very short piece that I think is very interesting, and am intrigued to share with you, around Rich Snippets. I picked this up from Search Engine Roundtable. It actually came from a hangout with John Mueller, where someone was asking questions around Rich Snippets. Specifically, they were asking why they weren't getting Rich Snippet results, even though they were sure the technical implementation was correct.

I'm going to quote here a part of John's reply to this and then talk about why I think that's important. So John said, "The last one is more of a general usually a site-wide signal that is about the quality of the site overall. Like, 'Can we trust this website to provide something reasonable with structured data that we can show in the Rich Results?' And usually what happens when everything from a technical point of view is set up correctly and we've had enough time to process it for indexing and it's still not shown, then that's usually a sign that our quality algorithms around the Rich Results, in general, are not 100% happy with your website."

So there's a couple of interesting things here. First, just the usage of the term site-wide signal. Gary Illyes in particular has said Google doesn't really have anything that correlates to a domain-level trust score, if you like. Obviously, domain-level and site-wide are two different things, because you can have multiple sites on the same domain and across sub-domains, et cetera. But it's interesting because we usually talk about signals, and ranking, on a page level. I don't think this is John being sloppy with the language he uses. So, it's interesting that he's saying there are some site-wide signals in play, and that these can affect how many of Google's features you're going to be eligible for.

I don't think anyone can give you any specifics there, apart from doing the things that we know are best practice around content and so on. But from a technical SEO point of view, it's worth, I think, filing away in your mind that we can now fairly safely say there is some kind of scoring going on at least site-wide, if not domain-wide, with Google. I'll leave it at that; it's just something for you to think about.

This is how we can build our knowledge from these little bits of information: not run away with them and jump to lots of conclusions, but keep them in the back of our mind when we're trying to diagnose issues.

We've already spoken about site speed in regard to PageSpeed Insights earlier in this podcast. What I want to cover now is something that I think is pretty exciting and that I hadn't encountered before; maybe if you're more of a recent developer, you already knew about this. It's about testing site speed changes with Local Overrides. I hadn't come across Local Overrides before, and I spend a lot of time doing SEO, so if you haven't heard of it, you are most definitely not alone. I'm quite confident quite a few SEOs won't have heard of this. I found it from a tweet from Dave Peiris. Sorry, Dave, I think that's how your surname is pronounced, but do let me know if I've got it wrong. He's written an article about how to use Local Overrides in Chrome DevTools.

Again, as you can probably guess, it'll be in the show notes at search.withcandour.co.uk. I'm going to read a couple of snippets from the post to give you a taste of what it's about. So, this is from Dave's post: "Improving site speed can be complicated. It's surprisingly common to roll out a change that should lead to a performance improvement, only to find that frustratingly things have gotten slower." And actually, I had this exact experience yesterday, the other way round: we removed some page speed modules from a site and it actually started scoring better.

So, I absolutely agree with Dave there that improving performance, improving site speed, can be really complicated, detail-orientated work. He goes on: "A lot of the time, this is how site speed works. You have to approach it like you're running an experiment, where you benchmark your baseline, make a change, and then test to see if you've made an impact. But this approach can be slow, especially if you're having to ask developers to make these changes, and then they have to find time in their schedule to implement them, and then potentially roll them back. In situations like that, you have a slow feedback loop. It takes a long time between, 'Let's see if this works,' and finding out the results."

"A fast feedback loop, Chrome DevTools has a feature called local overrides that can drastically improve that feedback loop. Instead of having to wait for a developer to add your change, you can first try it out on your local machine to measure the potential impact. It works by saving a copy of the page you're working on or any other resource like JavaScript or CSS file, letting you edit that. And then, serving that file instead of the live version. Here's how I test whether site speed changes might work without having to push those changes live first."

So then Dave goes through a really nice tutorial with screenshots on how to set up Local Overrides and find your baseline. He gives a little script to test largest contentful paint (LCP) speeds, so you're getting that directly in your Chrome console. You have your site, you run this test locally three times to get an average LCP, you can then make your changes locally, and then rerun the tests to see if you're getting any improvement. Obviously, it still may not be exactly the same when you deploy it live, but it does give you a much better chance, and as he says, you can iterate through this feedback loop a lot quicker. He's even given example sites in his tutorial where he's done this and you can see the improvement. So again, I've linked to this tutorial in the show notes; it's really worth checking out, and have your dev team look at it as well. It might be something you can do to, as Dave says, speed up this feedback loop and help you get these site improvements live a lot faster.
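To be clear, the snippet below isn't Dave's script from the tutorial, just a minimal sketch of the same idea using the browser's standard PerformanceObserver API. You could paste something like this into the Chrome console before and after applying a Local Override and compare the numbers:

    // Minimal sketch: log largest contentful paint (LCP) candidates as they occur.
    // Not the script from Dave's tutorial, just the standard PerformanceObserver pattern.
    const lcpObserver = new PerformanceObserver((entryList) => {
      for (const entry of entryList.getEntries()) {
        // startTime is when this LCP candidate rendered, in ms since navigation start.
        console.log(`LCP candidate: ${entry.startTime.toFixed(0)} ms`, entry);
      }
    });

    // buffered: true replays LCP entries recorded before the observer was created.
    lcpObserver.observe({ type: "largest-contentful-paint", buffered: true });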

It's not the first time that Shopify has come up on the podcast; we've discussed it before, and it's really common. I get loads of questions about Shopify every month from clients, and from random people as well, asking about Shopify and SEO. Like many of the major e-commerce platforms and major content management systems, it's definitely got better over the years for SEO, but Shopify definitely still has its foibles, its weak spots for an SEO. As far as I know, you still don't have the ability to natively edit the robots.txt. A lot of the templates that you'll get with Shopify will also build your URL structure so that the individual product URL includes the category, or as Shopify calls them, the collection, in the URL, which is fine until you start getting a single product in multiple collections. So for instance, say you were selling furniture and you were trying to sell a specific wooden chair, and it was in two collections.

So, you might have a collection called chairs and you might have a collection called kitchen, for instance. And if you listed that product in both collections, you would get a /kitchen/ version of the chair URL and a /chairs/ version, whatever the chair URL was. Shopify's default behaviour is normally to apply a canonical tag. Obviously, this works, but it's not optimal, because then you've got all these extra URLs that exist and need to be crawled, and you've got the issue that, if people link to non-canonical URLs, the canonical tag you're providing is only a hint. So it can get a little bit messy. One thing that I saw come out this week that will potentially help with one of these foibles, so if you have a Shopify store your ears should be perking up, is Multi-Store Hreflang Tags.
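To make that concrete with made-up URLs: on a typical Shopify store, the collection-scoped URL looks something like /collections/kitchen/products/wooden-chair, while the canonical version is /products/wooden-chair. Here's a tiny sketch of the relationship the canonical tag is expressing:

    // Hypothetical URLs: the same product exposed under two collections, both
    // pointing their canonical at the plain /products/ URL (collection prefix stripped).
    function canonicalProductUrl(url: string): string {
      const u = new URL(url);
      u.pathname = u.pathname.replace(/^\/collections\/[^/]+(?=\/products\/)/, "");
      return u.toString();
    }

    console.log(canonicalProductUrl("https://example-store.com/collections/kitchen/products/wooden-chair"));
    console.log(canonicalProductUrl("https://example-store.com/collections/chairs/products/wooden-chair"));
    // Both print https://example-store.com/products/wooden-chair, which is the URL
    // the canonical tag on each collection-scoped page should point at.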

So hreflang tags, as we hopefully all know, are the tags that we can use to tell search engines like Google which version of a page is for which region and which language. They're particularly helpful if you are serving multiple regions in the same language. The most common example is if you are dealing, for instance, with pages in English, but you have different stores for the UK, the U.S. and Australia, because all of those, ideally, will be showing products in different currencies. And actually, if you really are going for an international approach, doing it properly and localising content, you should be using, for instance, the different spellings between British English and American English, and sometimes actually rewriting the sales copy, because while we share a language with the U.S., culturally we're quite different, and sometimes different copy will have a completely different impact between the UK and the U.S.

Now, all of that's fine. However, again, most Shopify stores will handle hreflang tags using the canonical URL value, and this creates a problem, because it means you have to have the exact same collections, products, pages, blogs, blog tags and articles in each store, all with the same URL handle. Otherwise, the hreflang tags generated by Shopify will be created pointing to non-existent pages.

This also means that if one store doesn't contain all the same products, which is quite possible for shipping reasons or licensing issues, for instance, you're again going to be creating tags pointing to non-existent pages that Google is going to try and crawl. And when these hreflang tags start to break down, it's normally a bit of a domino situation, because to be valid they need to be bi-directional as well.

Even if a returning tag is correct, if the one going out is pointing somewhere else, then the whole thing becomes invalid. So there is an app, I should say, on the Shopify App Store now called Multi-Store Hreflang Tags, and it's by Digital Darts. It's $27 a month, so it's not super cheap, but if you are running multi-lingual stores across different countries, I would definitely look at it, because this app fixes a whole load of these problems. It works for all content types: collections, products, pages, blogs, blog tags and articles, and they're all matched.

So, these stores will have perfect hreflang tags. You've got a 14-day free trial on that app as well, so you can install it on your site and see if it works. I haven't got any connection to this app, even though I'm obviously talking about it quite a lot; I just came across it while looking around the web. I haven't personally used it yet, but it does look like a very good solution for an issue that I know has affected many people running Shopify stores. So, check it out. I'll put a link, again, in the show notes at search.withcandour.co.uk.
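Just to make that bi-directional requirement concrete, here's a minimal sketch with made-up store URLs (nothing to do with the app itself): every regional version of a page needs to carry the full set of alternates, including itself, and all the stores' sets have to agree.

    // Hypothetical alternates for one product across three English-language stores.
    const alternates: Record<string, string> = {
      "en-gb": "https://example.co.uk/products/wooden-chair",
      "en-us": "https://example.com/products/wooden-chair",
      "en-au": "https://example.com.au/products/wooden-chair",
    };

    // The same block of link tags goes into the head of all three pages. If one
    // store is missing the product (or uses a different handle), its entry has to
    // be dropped from every store's block, not just one, or the cluster breaks.
    const hreflangTags = Object.entries(alternates).map(
      ([lang, href]) => `<link rel="alternate" hreflang="${lang}" href="${href}" />`
    );

    console.log(hreflangTags.join("\n"));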

And that's everything for this episode; we've actually covered quite a few bits of small news. Normally we'll tackle two or three longer pieces, but I just had a lot of itty-bitty bits that I thought were interesting and important for this episode. We're going to be back on Monday the 15th of March, so listen again, tune in again, and share the podcast if you enjoy it.

Thanks for all the great feedback. I've had some really nice feedback on the podcast recently. I do really appreciate that. Especially during lockdown, it does sometimes feel a little bit lonely doing this when I'm not interviewing people. So it's nice to know that people are enjoying it, finding it interesting and finding it useful, which is what I set out to do. So apart from that, I hope you all have.
