The Unsolicited #SEO tips series started on LinkedIn but unfortunately got too big for the LinkedIn article system, which is why we're now here!
If you haven't seen the unsolicited #SEO tips series before, I basically post a single SEO tip every day on my LinkedIn and then curate them into these posts for every new 100 tips!
If you haven't met me before, then "hi!". My name's Mark Williams-Cook and I'm a director here at Candour. I run the free SearchNorwich meetup and the Search with Candour podcast.
I'm really proud to say we're up to more than 500 tips now: that's more advice than you would normally find in a whole book about SEO, right here, for free!
Enjoy!
"We'll produce 2 blog posts a week of 500 words". If your SEO strategy sounds similar to that, I can pretty much guarantee you are wasting your money.
If you modify the contents of a page with Javascript, you may find vanilla (non-modified) HTML is what appears in the search results for some time. While Google can process Javascript (they are doing so in a median time of 5s), it can sometimes take a while for what is showing in the SERP to catch up! You can find out more about getting Google to work with Javascript here.
Google's "mobile-first" index means they are looking at your site as if they are on a smartphone. This means if you have a "mobile version" of your site with less content than your desktop version, that missing content is unlikely to be found. You can find out more about mobile-first indexing here.
Never use automatic Geo-IP redirection to push users to different location versions of your website. It will confuse search engines, it's a bad user experience and it will actually be illegal under EU law from 2nd December! Here's some more info on internationalisation.
Nobody can tell you the keyword a particular user searched for on Google and ended up on your site from organic search, despite what some tools claim to be able to do.
The idea that adding new content keeps your site "fresh" and boosts rankings is a myth: freshness does not apply universally. Some queries deserve freshness, others do not. Don't add new content just for the sake of it being new.
If you want to outrank everyone for seasonal terms, something like "best christmas laptop deals 2018", then keep it on the same URL every year (e.g. best-christmas-laptop-deals) and just change the year in the content. If you want to keep the old content, move that to a new URL (e.g. best-christmas-laptop-deals-2017).
The words you use on internal links (anchor text) are massively important. Link to internal pages with the terms you want them to rank for - it can be more important than the content on that page! The conclusion from a recent experiment: even a website where a keyword is neither in the content nor in the meta title, but which is linked to with a researched anchor, can easily rank higher in the search results than a website which contains the word but is not linked to with that keyword. More info on that study can be found here.
Any different URL counts as a different page to a search engine. For example, if both www and non-www versions are accessible, Google counts these as different pages. In cases where identical content is accessible through different URLs, you should be using permanent 301 redirects.
Even if you're not an e-commerce site or collecting information, all of your site should be https not http. It's good for protecting your users' privacy and as a thank you, Google counts it as a positive ranking signal :)
Contrary to popular belief, adding pages to robots.txt does not stop them from being indexed. To stop a page being indexed, you need to use the noindex tag. You can find more info from Google on noindex here.
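To double-check what a page is actually signalling, here's a minimal Python sketch (assuming the requests and beautifulsoup4 packages are installed; the URL is a placeholder) that looks for noindex in both the meta robots tag and the X-Robots-Tag header:

```python
# Minimal sketch: check whether a URL signals noindex via the meta robots tag
# or the X-Robots-Tag HTTP header. Assumes the tag is present in the raw HTML.
import requests
from bs4 import BeautifulSoup

def noindex_signals(url):
    resp = requests.get(url, timeout=10)
    header = resp.headers.get("X-Robots-Tag", "")
    soup = BeautifulSoup(resp.text, "html.parser")
    meta = soup.find("meta", attrs={"name": "robots"})
    meta_content = meta["content"] if meta and meta.has_attr("content") else ""
    return {
        "x_robots_tag_noindex": "noindex" in header.lower(),
        "meta_robots_noindex": "noindex" in meta_content.lower(),
    }

print(noindex_signals("https://example.com/some-page"))
```

Remember: if the page is also blocked in robots.txt, Google may never see either signal (see the later tip on this).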
Google completely ignores the meta keyword tag and has done for years. Don't waste your time writing lists of keywords in your CMS!
Your meta description does not directly improve how well your page ranks in Google. It does, of course, influence how many people are attracted to click on your result, so focus on that.
Google claim to treat subdomains and sub-folders the same in terms of ranking. There are some interesting challenges with subdomains, such as sites like wordpress.com where a subdomain can host content that the domain is "not responsible for" (in this case, random peoples' blogs). Some SEOs have cited examples where they believe subfolders have out-performed subdomains in terms of ranking with other things being apparently equal. It's worth questioning whether your site really does need a subdomain and what the benefits are.
Longer content does not necessarily rank better. Some studies may indicate this, but when you look at the source data, it's just because that content is so much better (and there is a higher probability longer content has had more effort invested). The web is not short of quantity of content - it's short of quality. Answer questions and intent as quickly as possible, then get into the detail if needed. More tips on what quality means this week :)
Content does not just mean text! Sometimes a picture says 1,000 words and a video says even more. Google 'learns' what type of content best answers queries, and you can get great clues as to what type of content to create by seeing what is already ranking. E.g. for "How to change a car battery", the top results are all videos - right after the short text content! (See yesterday's tip on content length).
Back to basics. If you don't have one, a free Google Search Console (formerly Webmaster Tools) account will give you a wealth of diagnostic information directly from Google about your site, alert you to problems, penalties and hacks, and give you average rankings and the keywords your site is showing for. Here's some more info from an experiment following Googlebot for 3 months.
You can download free browser extensions, such as User-agent Switcher for Chrome, that will let you identify yourself as Googlebot to the websites you visit. It's really interesting seeing how some websites will deliver you a different experience when they think you're Google. It's also really useful for uncovering what is going on if you get a "This site may be hacked" warning. It's very common for hacked websites to appear "normal" to a regular user, but when they detect Google visiting the site, they'll show lots of hacked content and links to benefit the hacker's sites!
You can use search operators to get additional information about your site from Google. For instance, try a search for site:yourdomain.com and Google will show you how many results it has indexed* for your domain and will list them roughly in order of importance**. You can also use this to see which specific page Google likes on your site for a particular keyword or keyphrase by searching: site:yourdomain.com keyword
*It's not 100% accurate - I've actually seen some wild variances. The only way to be sure is to check in Google Search Console, but this method works for competitors or sites whose Search Console you don't have access to.
**This obviously has no keyword context, but may have some other context from your personalisation, search history, device etc. It's more a rough guide for interest and usually reflects your internal linking.
There is almost no case in which you or your agency should be using the Google Disavow Tool. You'll probably do more harm than good. This tool is only for disavowing links when you've had a manual penalty, or specifically when you know blackhat/paid links have happened and you want to proactively remove them. In 99% of cases, merely 'spammy' links should be left alone - if Google thinks they're spammy, they'll just ignore them. Focus your efforts on creating positive things instead.
If you are moving your site (full or partial), DO NOT use the Remove URLs tool in GSC on the old site. It won't make the site move go any faster. It only impacts what's visible in Search so it could end up hurting you in the short-term.
You can use a service such as Visual Ping to automatically monitor pages such as Google's Webmaster Guidelines and get an email when they have been updated. For instance, Google recently expanded their definition of 'link schemes' to include terms of service and contractual arrangements.
First words are important words. If you're going to put your brand name in your page title, it usually should come after the description of the page content.
If you're having to do redirects, don't do them at "database level" - e.g. in the backend of your CMS. Your site will be faster if you do the redirects "higher up", such as in the htaccess file. Faster sites are good for users and rank better.
It's a common mistake to use robots.txt to block access to the CSS / theme files of your site. You should let Google access these, so it can accurately render your site and have a better understanding of the content.
"Keywords Everywhere" Chrome plugin is a nice, free way to get search volumes, search value and suggestions overlaid with every search you do. I have it on all of the time and over the months and years, you build a good 'feel' for search competitiveness and how other people search.
Pop-ups and interstitials generally annoy users - you too, right? Since Jan 2017, Google has specifically stated that websites that obscure content with them and similar will likely not rank as well. Here are some examples from Google of things to avoid.
Competitors copying your content? One of the many things you can do is file a DMCA notice directly with Google. This can remove your competitor's content from search results. Here is the link to file a DMCA notice. N.B. There are consequences for fake DMCA notices.
If you're AB testing different page designs on live URLs, make sure you use the canonical tag so you don't confuse search engines with duplicate content while the test is live.
The Ayima Redirect Path Chrome plugin lets you see live which redirects (JS/301/302 etc.) are happening and whether you're getting chains of them.
If your web page URLs work both with and without a trailing slash (/), search engines will think you have two identical copies of your website online. You should pick whether you want a trailing slash or not and set up permanent 301 redirects from one to the other. Failing to do so can result in the 'two' pages competing against each other in the SERPs and ranking worse than a single one.
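A quick way to verify this is to request both variants and confirm one 301-redirects to the other in a single hop. A minimal Python sketch, assuming the requests package and placeholder URLs:

```python
# Minimal sketch: check the first hop for each URL variant.
import requests

def first_hop(url):
    resp = requests.get(url, allow_redirects=False, timeout=10)
    return resp.status_code, resp.headers.get("Location")

for variant in ("https://example.com/page", "https://example.com/page/"):
    status, location = first_hop(variant)
    print(variant, "->", status, location)
# Expected: one variant returns 200, the other returns 301 pointing at it.
```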
[Deprecated!] This is no longer an indexing signal for Google, but it still is for other search engines: if you have paginated content (page 1, 2, 3, 4 etc.), there is special "Prev" and "Next" markup you can use to help search engines better understand what is going on. More info on prev and next pagination markup here.
Google has always flatly denied that social media posts on platforms like Facebook, and their associated 'likes' and/or engagements, directly impact your rankings in any way - and there is no good evidence whatsoever that they do. If someone is insistent about this, look closely - you may be dealing with a clown! 🤡
How 'old' a site is plays a part in how well the pages on it can rank - this means all on-site and off-site factors, such as how long links have been present. Older is better.
Want to know where you rank? Googling it will just frustrate you. Even with incognito mode, you're not going to get a fair representation of the rankings. Google Search Console will give you some average rankings - but only for terms they choose. I'd recommend SEMRush. It's super cheap and will give you loads of keyword ranking data for your site, your competition and specific terms you want to track. You can get a trial here.
Add a self-referential canonical tag to all canonical pages. This means if someone scrapes your content or links to it with query strings, it's still clear to Google which version to give credit to. More info about self-referential canonicals here.
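If you want to spot-check this, here's a minimal Python sketch (requests and beautifulsoup4 assumed, placeholder URL) that compares the URL you requested against the canonical the page declares:

```python
# Minimal sketch: pull the canonical URL a page declares and compare it with
# the URL you requested. Assumes the canonical tag is in the raw HTML.
import requests
from bs4 import BeautifulSoup

url = "https://example.com/blog/post?utm_source=newsletter"
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
canonical = soup.find("link", rel="canonical")
print("Requested:", url)
print("Canonical:", canonical.get("href") if canonical else "none declared")
```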
You can find easy link opportunities by using tools such as Majestic's Clique Hunter. Specify a few competitor sites and you'll get a list of links that all or most of your competitors have that you do not. This helps close the gap on where all your competitors are being talked about and you're not. More info from Majestic directly here.
Watch this brilliant video from Lukasz Zelezny on SEO tips you can implement tonight! https://www.youtube.com/watch?v=dXdqmVnP5pg
Using a VPN is a good idea in general, but it's really helpful for SEO. With a service like ProtonVPN you can click to change cities or countries and see what different search results look like.
Google can index PDF documents just fine and it actually renders them as HTML. This means links in PDF documents count as normal weblinks - PDFs are pretty easy to share, too....
This week, I saw a company that had been told by an agency their site was slow. It really wasn't (consistent ~3.0s TTI). You can test site speed (and more) yourself using Google's PageSpeed / Lighthouse audit tools. I'll do some tips the next few days about these tools, as they are commonly misused and misinterpreted. Here is Google's PageSpeed Insights. Here is Lighthouse.
If you use the page speed tools from yesterday, keep in mind that results can vary every test you run, depending on all kinds of factors. If you are going to use these tools, run multiple tests on multiple pages and get some averages.
If you have enough traffic, Google's Pagespeed Insights tool will give you "Field data" - this is an amazingly useful average speed, taken directly from your users' browsers. It will give you a much better idea of how your site is performing outside of the 'spot checks' we spoke about in previous tips. Google Pagespeed Insights tool is here.
Have you heard that 50% of searches by 2020 will be voice searches? They won't, it's complete rubbish.
1 in 5 searches that happen in Google are unique and have never happened before. The vast majority of searches that are conducted are terms that have fewer than 10 searches per month. If you're just picking key phrases based on volume from "keyword research", you're missing the lion's share of traffic and making life hard for yourself, as lots of other people are doing the same.
Check the last 12 months in Google Analytics: if you've got content pages with no traffic, it's maybe time to consider consolidating, redoing or removing those pages.
Key phrases mentioned in the reviews written about you on Google My Business help the visibility of your company for those terms.
Google do not use UX engagement metrics directly as part of their core algorithm (CTR, dwell time etc). They have said this consistently for years and last week, Gary Illyes from Google referred to such theories as "made up crap" in a Reddit SEO AMA. However, there is the "Twiddler framework" that sits on top of Google's core algorithm, which is lots of smaller algorithms that do impact the end SERP. We have definitely seen SERPs change temporarily when CTR jumps, which is no doubt Google's way of trying to match intent from news stories etc.
As a last resort, when your dev queue is stalled and you're drowning in technical debt, it is possible to modify things such as page titles or canonical tags via javascript with Google Tag Manager. It can take weeks for these changes to be indexed, but it does work.
If you're serving multiple countries on one website, it is almost always better to do this with sub-folders rather than sub-domains or separate TLDs. This means mywebsite.com/en-gb/ and mywebsite.com/fr-fr/ are almost always preferable to en-gb.mywebsite.com and fr-fr.mywebsite.com.
You need other websites to link to your website pages if you want to rank well in Google. This means if you consider SEO to be a one-off, checkbox task of completing items on an audit, you are unlikely to see success. Technical SEO gives you the foundation to build on, not the finished article. #backtobasics
Have a play with Google Trends! It is useful to see trends in searches, when they happen every week, month or year. How much do they vary or are they trending up or down? Here's a funny trend for two searches (different Y axis) for searches around 'solar eclipse' and 'my eyes hurt' :)
You can do some basic brand monitoring for free with Google Alerts. This gives you the opportunity to do 'link reclamation' - when websites are mentioning your brand or website and not giving you that link. Strike up a friendly conversation, offer them some more value, detail, insight and get that request in to get the link :-)
Use schema. It's important. Here's a short video about it. https://www.youtube.com/watch?v=xYHK-laEhk4
Registering for Google My Business for free is how you can start ranking in the local map box results.
While you can modify page content with javascript, such as with Google Tag Manager, this should be an absolute last resort. In this experiment I did, it took Google 24 days to render the javascript version of a page!
Stuck for good content ideas? Put a broad subject (like 'digital marketing') into AnswerThePublic and you'll get a list of the types of questions people are asking in Google!
Screaming Frog is a tool, with a free, limited version, that allows you to quickly 'crawl' your pages like a search engine would to see issues such as 404s or duplicate page titles.
Video is often overlooked: YouTube is the second largest search engine in the UK - there is more to SEO than just Google search!
Want a better chance that your videos will appear in search results? Then create video sitemaps! Video sitemaps give additional information to search engines about videos hosted on your pages and help them rank.
Don't stress about linking to other websites where it's relevant and useful to the user. That's how the web works and is absolutely fine!
Did you know that sending someone a free product to review and get a link is against Google's guidelines and comes under 'link schemes' that could land you with a penalty?
The factors to rank in the local map pack results are different to 'normal' rankings (but there is overlap).
How much is your organic traffic worth? One way to get a good estimation is to find out how much it would cost to buy that search traffic through paid search. The cost per click (CPC) of a keyword is set by market demand and can be used as a barometer for the value of your rankings. Tools like SEMrush can do this for you automatically. In this example, the estimated monthly value of the organic traffic is £5,700.
If you want your images indexed, you need an img tag within the HTML.
Google Trends has a commonly-overlooked ability to trend YouTube searches.
Domain age, or at least the component parts of it such as how long links have existed to it, play a part in ranking. It is almost impossible to rank a brand new domain for any competitive term.
Examining your raw server log files can be a worthwhile exercise. You can see directly how Googlebot is interacting with your site and if it's getting stuck somewhere or getting error responses.
When doing a site migration, don't forget to migrate URLs that are not within your site's internal link structure. This could include links to pages with marketing parameters, for instance. These 'hidden' URLs contribute to your ranking, commonly get overlooked and can result in permanent ranking drops after migration.
If you're trying to do a crawl of your site with a tool like Screaming Frog and getting 403 errors, this can be because many Web Application Firewalls or services like Cloudflare will default to blocking crawlers imitating Googlebot. Get around this by setting your crawl user-agent to Screaming Frog SEO Spider or another non-search engine one (or get yourself whitelisted by IT!)
Don't focus on the specifics of algorithm updates; if you're having to do that, your underlying SEO strategy probably isn't right. Algorithm updates primarily represent Google overcoming technical hurdles while still driving toward the same end goal.
Golden rule of SEO - there is absolutely no 'SEO change' you should do on your site that will make the user experience worse. None. No exceptions.
Ideally, you just want just one h1 on the page and it should be descriptive of the page content for the user. Naturally, your page title and h1 will normally be similar.
The main functionality of your site, such as all the important pages, should be accessible without Javascript. Disable Javascript and have a click around your site. If things are broken or parts are missing, this can cause big problems for Googlebot!
Canonical tags are not a directive. Do not try and use them on pages that are not similar - Google will just ignore them. I recently confirmed this with an SEO test too.
Struggling to get interesting data to make a narrative to get links? Did you know Google has a Dataset Search? You can search for publicly available datasets to get inspiration and save huge amounts of time.
With a reasonable number of results, a 'view all page' is optimal over paginated content. Research shows 'view all' pages are also preferred by users. Google says: "To improve the user experience, when we detect that a content series (e.g. page-1.html, page-2.html, etc.) also contains a single-page version (e.g. page-all.html), we’re now making a larger effort to return the single-page version in search results."
If you're trying to stop content getting indexed, remember not to block it in robots.txt - otherwise the crawler will never reach the page to see your noindex tag!
Cannibalisation is when you have more than one URL targeting the same intent / key phrase. It is one of the main problems that causes otherwise technically optimised sites with decent content to rank very poorly.
Making a visual crawl map is a fast way to get a bird's eye view of your content structure and see if you have any problems. One of my favourite tools to do this is Sitebulb. Neither Sitebulb nor Patrick Hathaway compensated me for this post. They just made a really good tool. There's a trial version to check out.
If you want content to rank well over months/years, you need to design your site to link to it from 'high up' in your site hierarchy. It's generally a mistake to post evergreen content in a chronological blog, as it will slowly disappear deeper into your site, more clicks away. If it's evergreen and always relevant, it should always be prominent.
Google had a bug last week that caused millions of pages to become de-indexed seemingly at random - sometimes even big companies' home pages. This bug is now fixed, so if it affected you, there is no need to panic: these URLs should resolve automatically. If you're in a rush (who isn't?), you can speed up re-indexing by submitting the de-indexed URL via your Google Search Console account.
If you discontinue a popular model/product on your e-commerce site, rather than delete the page, update it to explain the product is discontinued and link to the nearest alternative products. This is more helpful to the user and prevents the loss of organic traffic.
A specific 'keyword density' is not a thing, so don't waste your time on it. Apart from the fact text analysis goes far beyond this and tf-idf, it means you're writing for robots and not for humans - and therefore missing the point. The algorithm is only ever trying to describe what is best for humans, so start from there.
www or non-www, pick one! Then redirect (301) one to the other. Did you know that Google and other search engines count URLs with and without www as different (and therefore duplicate) pages?
Despite what they profess, the 'build your own site' platforms like SquareSpace and Wix are not optimal for SEO. While they can be great start points, it's unlikely you'll get great rankings with those sites. Even bigger platforms such as Shopify don't allow you to edit your robots.txt file! Edit: June 2021 - They do now!
Do broken link reclamation. Check server logs or use a tool like Majestic to identify sites that are linking to malformed URLs. Set up 301 redirects for these to reclaim the links and get the extra traffic.
Do not underestimate the power of ranking in Google Images. A huge amount of searches are visual, so it is worth making sure your image assets are properly marked up and optimised.
The site: operator in Google is useful to see if you have major indexing problems, for instance if you have a 20 page site but find 5,000 indexed pages - or vice-versa - you have a 5,000 page site but only 20 are in the index. However! It will not give you an accurate count of the number of pages included in the index, so don't use it to try and measure index coverage!
If you're using schema, don't use fragmented snippets; tie them together with @id - e.g. this Article belongs to this WebPage, written by this Author that belongs to this Organisation, which owns this Website - build the graph!
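As a rough illustration of the idea (all names and URLs below are made up, and the exact properties you need depend on your content), a connected JSON-LD graph using @id references might look like this, built in Python so it's easy to generate from templates:

```python
# Minimal sketch of a connected JSON-LD graph using @id references,
# rather than four disconnected snippets. Illustrative values only.
import json

graph = {
    "@context": "https://schema.org",
    "@graph": [
        {"@type": "Organization", "@id": "https://example.com/#org",
         "name": "Example Ltd"},
        {"@type": "WebSite", "@id": "https://example.com/#website",
         "publisher": {"@id": "https://example.com/#org"}},
        {"@type": "WebPage", "@id": "https://example.com/post/#webpage",
         "isPartOf": {"@id": "https://example.com/#website"}},
        {"@type": "Article", "@id": "https://example.com/post/#article",
         "mainEntityOfPage": {"@id": "https://example.com/post/#webpage"},
         "author": {"@type": "Person", "name": "Jane Doe"}},
    ],
}
print(json.dumps(graph, indent=2))
```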
When doing a site migration, try and change as few things as possible. E.g. if you can do the http to https move first, do that. It will make it easier to diagnose and fix the root cause of any issues.
If you don't have a strategy to get people to link to you, it's going to be almost impossible to obtain competitive rankings. Links are still the life blood of rankings. Here is a recent test example. The site does not rank for years. It gets an influx of links (top graph) and the search visibility shoots up (bottom graph). The site loses links (orange, top graph) and search visibility falls (bottom graph).
Google has just announced both Search and Assistant support for FAQ and How-to structured data. (h/t Andrew Martin). Find out more about Google's announcement here.
"The content comes before the format, you don't 'need an infographic', you don't 'need a video'. Come up with the content idea, then decide how to frame it" - Brilliant advice (think I got the quote right) from Stacey MacNaught last night at #SearchNorwich.
Dominating Google is about getting your information in multiple places not just your own sites. Or just making Google think you have 512 arms :-)
Part of being 'the best' result comes with format. Google is bringing AR directly to search results. Your product, in the consumer's home. Doesn't get much more powerful than that! Find out more in our podcast.
While it was never officially supported, Google has stopped obeying noindex commands within robots.txt. If you're using it to noindex pages, those pages will now become indexed! You'll need to declare noindex either on-page or via X-Robots-Tag.
Got a showroom? It's not expensive to get a 360 photo done for your Google My Business and it will help you attract more in-store visitors.
Avoid a common mistake if you're targeting multiple countries/languages by making sure you use the 'x-default hreflang' on your region/language selector page.
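To see which hreflang annotations (including x-default) a page declares on-page, here's a minimal sketch, assuming requests and beautifulsoup4 and a placeholder URL; it only covers the on-page method, not sitemaps or HTTP headers:

```python
# Minimal sketch: list the hreflang annotations a page declares in its <head>.
import requests
from bs4 import BeautifulSoup

url = "https://example.com/"
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
for link in soup.find_all("link"):
    if "alternate" in (link.get("rel") or []) and link.get("hreflang"):
        print(link.get("hreflang"), "->", link.get("href"))
# Check that x-default appears and points at your region/language selector page.
```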
If you're using Google Search Console and it looks like data is missing or you are getting "not part of property" errors, be aware - Google classes http, https, www and non-www versions of your site as different properties! Therefore, you need them all added to your Google Search Console and make sure you have the right one (the one your site uses / redirects to) when making changes!
Bounce rate is not a ranking factor. A high bounce rate can be good in some cases; it needs to be taken in context with searcher intent.
You cannot "optimise for Rankbrain" - 'Rankbrain' is the name of one component of Google search that specifically deals with queries Google has not seen before using AI to try and understand intent. Rankbrain deals with approximately 15% of queries (around 3,000 a second).
"Google has 200 different ranking factors, each with 50 different variables". Have you heard this? That's what we were told almost 10 years ago by Matt Cutts from Google. This is not reflective of how Google works in 2019 and someone saying this to you should raise a red flag - it's super out of date information!
Make sure you're only specifying hreflang with one method (on-page, sitemap, headers). I've seen numerous problems caused with conflicting tags - so check you're only using the one method!
Having an empty 'voucher code' box as the last step of your checkout can kill your conversion rate as you send people off on a wild goose chase to find one! It's always worth having a "[brand name] vouchers, offers and coupons" page - it will always rank first and if you have no offers on, you can let people know so they don't feel they are missing out!
Correctly categorising your business with "Google My Business" is vital to appear for generic map-based searches.
It is worth looking at the last 12 months' Analytics data, seeing which pages you have that get no traffic and asking why. It's a great way to see what your content weak spots are - what needs improving, rewriting or sometimes just deleting.
Don't add keywords in your Google My Business name, it can get you penalised.
The cache of your page (before JS is rendered) is based on the First Meaningful Paint. This means pages with loading screens/elements that last too long may be cached in that state, and Googlebot won't understand what is on your page.
I've mentioned cannibalisation before (many pages trying to rank for the same keywords) and how this can have a drastic impact on a site's ranking. Well, thanks to Hannah Rampton, you now have a free tool that you can use to check your site for cannibalisation.
The recent Google 'diversity' update that limited how many organic results one site can have, usually to 2, does not include 'special' results such as rich snippets or Google news etc. This means it's worth considering what other angles you can use to dominate SERP real estate!
Not sure where to focus first? There are rarely 'quick wins' within SEO, but focussing on your content that ranks in positions 3-10 can be the fastest way to get traffic, as most of it is locked up in those top 3 positions on a regular SERP. You can pull a report like this quickly with a tool like SEMrush (aff).
If you're really thinking about your audience, their intent and getting people that know the subject to write your content - you don't really need to worry about what TF-IDF is, or how it works.
Having an all secure site (https not http) using SSL/TLS is a great idea for many reasons - it is also a ranking factor in Google! Secure sites rank better and Google recently said it can be used to settle 'on the fence' rankings where most other things are equal.
Sometimes blindly following Google's advice is not in your best interest (in the short term, at least). Here is Lily Ray demonstrating traffic loss after implementing FAQ schema markup.
When calculating organic traffic at risk when completing a website migration, remember to only calculate from unbranded traffic - it is highly unlikely you'll lose traffic on brand terms during a migration. In some cases, this can be a significant amount of traffic and spoil your forecasts.
If you don't have GA access or you inherit a new GA account with limited historical data, you can find historical URLs of a site by replacing 'example.co.uk' with the domain you want from the link: https://web.archive.org/cdx/search/cdx?url=example.co.uk&matchType=domain&fl=original&collapse=urlkey&limit=500000
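If you'd rather pull that list programmatically, here's a minimal Python sketch using the same CDX endpoint (the requests package is assumed; swap example.co.uk for your domain):

```python
# Minimal sketch: pull historical URLs for a domain from the Wayback Machine
# CDX endpoint mentioned above.
import requests

params = {
    "url": "example.co.uk",
    "matchType": "domain",
    "fl": "original",
    "collapse": "urlkey",
    "limit": "500000",
}
resp = requests.get("https://web.archive.org/cdx/search/cdx",
                    params=params, timeout=60)
urls = resp.text.splitlines()
print(f"{len(urls)} historical URLs found")
print("\n".join(urls[:20]))  # preview the first 20
```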
"Those aren't my competitors!" - You have both business competitors, who you are likely aware of - and you have search competitors - the ones that rank above you for the keywords you want. These are the people that you'll be competing with in SEO and you can use a tool like SEMrush to quickly identify which websites overlap with you on how many keywords and which ones. (aff)
Name, Address, Phone (NAP) citations are important for local SEO and ranking in the map box. This means having your main business address listed as your accountant's (common practice in the UK) can be very detrimental to your SEO!
There is no such thing as a 'duplicate content penalty'. Unless your site is pure spam, you're not going to be harmed if someone copies a page of yours or if you have some copied content. It may get filtered out of a search result, but you're not going to get your site penalised.
URLs are case sensitive. This means search engines will consider: mysite.com/pageone and mysite.com/PageOne as different pages! Stick to lowercase where possible for your main, navigable and indexable URLs to make sharing and ranking easier!
You should not be hiring generalist copywriters to write your content. Competition is fierce and your users (and Google) are looking for genuine expertise and insight - not a rehashed article made from reading 10 others that already exist. Not convinced? It's spelled out for you in Google's webmaster advice.
There is a difference between an algorithm update and a penalty. If you lose a lot of traffic or rankings because of an algorithm update, this is not a penalty and there may not be anything you can "fix". Google is simply evaluating in a slightly new way, to closer match their goals. We've done a deep dive into Google penalties on this week's Search with Candour podcast that you can listen to here.
Links to your site from posts on platforms like Facebook and LinkedIn do not help your ranking in Google.
Patents are a good way to get an idea of what might be happening behind the scenes when you are observing results. It is worth keeping in mind that just because Google has a patent for something, it's not necessarily used in ranking - but it gives you a good idea of what is coming. For instance, here is a patent that builds on Google's Knowledge Graph / entity model, which tries to attach internet sentiment about an entity as a ranking factor - in essence, to start promoting web results for businesses that people have had good experiences with. Full patent for the geeky.
Paying for Google Ads does not improve your organic Google ranking. I had someone tell me it does yesterday. It doesn't. It really, really doesn't.
If you're using the Lighthouse Chrome extension to audit a site, make sure you launch it within an incognito tab as other extensions can impact results. To do this, there is a switch to 'allow in incognito' within Chrome.
Do not claim your Google My Business short URL for the moment! There is a current issue Google is experiencing that is causing Google My Business pages to vanish, as if suspended in many cases when these URLs are claimed. (I think you were affected also Taylor Gathercole?) Technically, it's not suspended - it is due to CID syncing, but same end effect for the user!
If you're building a new site, SEO considerations need to happen right at the start. How will you handle the migration? What schema are you using? Which content is evergreen and which is chronological? How are you going to avoid cannibalisation? It's not a plan to think you can "do the SEO" after the site is built.
Quick and dirty keyword cannibalisation check - use this search in Google: site:yoursite.com intitle:"key phrase to rank for". This will only return results for your website where the key phrase you want to rank for is in the title. If this returns multiple pages, you may be confusing search engines as to which page you want to rank for this term. Consider consolidating, redirecting or canonicalising as appropriate.
Technical one, so hold your breath! If you see a Google SERP experiment (new type of result), it is possible to share these with other people by finding and manually setting the NID cookie. Here is a tutorial on how to do this.
Straight from Google, "Pages blocked by robots.txt cannot be crawled by Googlebot. However, if a disallowed page has links pointing to it, Google can determine that it is worth being indexed despite not being able to crawl the page."
If you're running A/B tests, do not use the noindex tag on your variant pages. As I put in unsolicited #SEO tip number 29, use a rel=canonical tag. This also means you'll pick up the benefit of any links your variant pages get. This is 100% what Google advises too.
Almost 25% of all SERPs have a featured snippet - if you're not tracking them, what are you doing? You can use tools such as SEMrush to keep tabs on the types of SERP features that are appearing in your niche.
Moving pages or migrating domain? The best way to handle redirects in 2019 is on the 'Edge'. This means setting the redirect through a service like Cloudflare.
It is possible to update page contents, such as the page title, with Javascript. You will have to accept that initially Google will index the non-Javascript page title, and it may take a few weeks before they process and index the Javascript version. This is worth keeping in mind if you have any intention of getting a site indexed and ranking quickly.
Less is more when it comes to local SEO and Google My Business categories. Fewer, more specific business categorisations will get you better results than trying to cover everything.
30% increase in organic traffic after 1 new piece of 'pillar' content attracts links. The internet is not short of quantity of content, it's short on quality. My first ever unsolicited #SEO tip was about how the churn of '500 word blog posts' is normally a waste of time. A great first win for a new SEO client: we got them to invest far more in one piece of content, rather than spreading it out. The result - links, coverage and, after a year of churning out content and seeing no real results, a 30% uplift in organic traffic. Baby steps that will continue into sustained growth now we have buy-in! If you're on a tight content budget - do less, but do it better!
Remember to NoIndex your dev and staging environments so they aren't exposed like these sites and potentially mess up your live site rankings. (Bonus tip: Remember to make it indexable when it goes live, just embarrassing otherwise!)
The hreflang tag can be used cross-domain! So if you've got your .co.uk your .com.au or even another language site, you can help Google understand the relationship and have them rank better! 🌍
Google use contained programs in a framework called 'Twiddlers' which re-rank results. Twiddlers focus on an individual metric, work in isolation from each other and try to improve SERP quality. This means a factor might not be in the 'core algorithm' but it might be affected by a conditionally run Twiddler that alters the result. For example, there is a Twiddler that runs on YouTube queries that will improve the result of a matching video if that channel has many videos that are also a close match to the search (implying the channel is specialist).
If you're planning on doing a site migration, don't fall for setting an arbitrary deadline or timeframe. Take a look at your Analytics data and plan it so you can launch in your quietest period. This will help minimise any traffic losses you are likely to take.
Getting a domain with a backlink history can really help give you a kick start. I hear Rob Kerry is launching a tool soon that will make this a lot easier.
Server-side rendering (SSR) is important, you should not be leaving it to Google to try and process critical Javascript. This is a great example from Pedro Dias - "This is not a penalty. This is a website that migrated to a client-side rendered JS framework. Even with JS rendering on, most tools fail to crawl past 100 URLs."
It's really great to learn about information retrieval and how search works, but TF-IDF and 'LSI keywords' are not strategies. You can't "use TF-IDF to rank better". If you're focusing on these things, you're missing the bigger picture and there will be better areas to put your time into. That, and people from Google might poke fun at your expense :-)
If you know you have backlinks that break Google's Webmaster Guidelines (or you have received a manual penalty) you can submit a disavow file, listing the links and domains you would like to tell Google to ignore. We covered this previously in a podcast episode.
Links submitted to a disavow file are, according to Google, only disavowed while listed. If you make a mistake, you can remove them from the disavow file and resubmit it.
It's common knowledge Google now looks at UX components (site speed, mobile friendliness, no popups etc.) for ranking. My tip here is: do not approach UX by starting with these sporadic, individual metrics, as you'll miss not only the bigger picture but the bigger benefit. I got the chance to sit down last week with UX expert Tom Haczewski from The User Story to have a discussion about how to approach user experience in an SEO context for episode 26 of the Search with Candour podcast.
Google has today announced support for new link attributes to identify the types of link in their link graph: "sponsored" for paid links or adverts, and "ugc" for user-generated content links.
I've noticed for a couple of years that some nofollow links seem to provide a consistent ranking improvement (especially in local packs). Google's most recent update, saying they may use nofollow as a ranking hint, is a much surer sign this is true. In short: don't avoid nofollow links.
You can provide Google with 'directives' and 'hints'. A directive is something that Google will always obey, such as a 'noindex' tag. A hint is something Google will take under consideration, but may choose to ignore, such as a 'canonical' tag. Google's documentation will specify which tags it takes as hints and which as directives. Here is an experiment that shows how Google can ignore a 'hint' if it believes you are being deceptive.
The Google Quality Rater Guidelines is a 167 page document that specifies how the 10,000 Google manual reviewers rate websites. It is worth noting that the actions of manual reviewers do not directly impact rankings - they are used to test how Google's algorithm is performing. It's a really worthwhile read if you work in SEO, as it lets you know what Google is aiming to achieve.
A homepage is not 'special' in that it has more power to rank well than any other page; in isolation it has the same ability to rank as any other page on your site. Homepages tend to just pick up the majority of links, so they can rank easily - that's all!
If you're a small business, it is likely that an SEO audit will have almost no measurable value for you unless: a) you have the resources to deploy the changes recommended, and b) you are going to invest in a sustained SEO effort. Generally, a technical audit will only have immediate impact if the site is deploying its ranking potential inefficiently - i.e. the site already has a decent backlink profile. For many small businesses with almost no links, making technical changes will have limited impact in isolation. Think of it as tuning an engine that has no fuel.
Intent trumps length with content. Content length is not a ranking factor. Yes, there are some correlations, largely because longer content normally has a lot more effort put into it, earns more links and gives you a bigger net to catch longtail searches with - but please, don't make it longer for the sake of it!
This may not be a popular one and it's one just from experience (and a few dozen answers from SEO experts on my Twitter). If a company is offering a Gold, Silver, Bronze package type approach to SEO, it's likely going to be hot trash. There are some great comments in the Twitter thread on others' thoughts on this - a few detractors, but generally strong agreement. It's definitely become a red flag to me over the years.
Because so many people have not heard of it, I want to give a shout out to the AnswerThePublic.com research tool. Whack in a subject and it will show you the common questions that are being searched for in Google about that topic. It's a brilliant way to start building topic lists for your content plan.
If you haven't already, I do believe it is time to double-down on making the best of your Google My Business Reviews. Google has killed the ability for you to show in-search stars for your own company reviews that you control, now. The only options left are Google My Business and third-party websites. If you want to learn more, I spoke about this briefly in episode 28 of the Search with Candour podcast.
Google can sometimes ignore your meta description and use any on-page content it finds that it believes is more relevant for the user. This is usually a good thing - dynamic meta descriptions can in some cases give better CTR.
Google has launched some new options that allow you to control how your website snippets are displayed in search. The changes go live in October and full details are here.
You can see how much search traffic you get from Google Images by going into Google Search Console, selecting Performance -> Search Results and changing the 'Search Type' filter to 'Images'. There's a huge amount of traffic potential locked up in Google images!
The 'user end' of indexing, caching, ranking, site:, cache: and Google Search Console are all separate, independent systems. For instance:
- It is possible to have a URL rank for a keyword that Google says does not exist with a site: search
- It is possible for a URL to be present in cache: and for Google Search Console to say it is not indexed
- It is possible for a URL to rank for a keyword but not for a search for its own URL
This means: don't jump to worrying if you see something that looks off. Google's infrastructure is huge and different systems cannot always be in sync.
'Content pruning' seems to be a new fad in SEO - the logic of improving 'overall site quality' by deleting pages that aren't being visited. This is a gross over-simplification and usually isn't the right approach. Look at your content in context: competitors, intent, searches, time of year etc. There are lots of alternative and sometimes more productive routes, such as:
- Merging a page with another similar page where the topics are so closely related it makes sense
- Rewriting/improving the page
- Checking you are targeting the right words matching user intent for that content
- Showing the answer in another way, e.g. image, video - whatever is best for the user
All things I've covered in previous tips!
Google has confirmed internal rel="nofollow" links will always be treated as nofollow, in reference to their recent update where they said rel="nofollow" may be treated as a hint. This means they can still be useful in some situations for handling things like internal faceted navigation.
This tip is directly from Google for helping you choose an SEO or an SEO agency: "If they guarantee you that their changes will give you first place in search results, find someone else." Monitoring individual keywords isn't really the best measure of success, nobody can account for future algorithm changes or what your competitors will do if you start to climb. Like with many things in business and life, if it sounds too good to be true, it probably is.
Your participation (or not) in providing Google with 'rich snippet' (position 0) results, does not impact your 'normal' 1-10 rankings.
Google is not king everywhere! If you're targeting Russian speaking countries, you will need to rank in Yandex, which can be quite different from ranking in Google!
If you have a site that is easily accessible to search engines, well thought out in terms of navigation and internal links, there really is no need for an internal HTML sitemap. I discussed this in more detail in episode 30 of the Search with Candour podcast.
When doing backlink analysis, it's good to use a selection of tools to get the best picture of your profile. My favourite is Majestic, but I often combine it with data from Google Search Console, Moz, SEMrush and Ahrefs.
Google ignores anything after a hash (#) in your URLs. This means you should not use # in URLs to load new content (jumping to anchor points is fine).
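You can see this split for yourself with Python's standard library; the fragment never reaches the server, so a crawler only ever sees the part before the #:

```python
# Minimal sketch: the fragment is client-side only, so two URLs that differ
# only after the # resolve to the same resource for a crawler.
from urllib.parse import urldefrag

url = "https://example.com/products#reviews"
base, fragment = urldefrag(url)
print(base)      # https://example.com/products  (what a crawler fetches)
print(fragment)  # reviews  (never sent to the server)
```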
If you're auditing a site, you need to make sure you crawl it both with and without Javascript and with different user-agents and compare differences. I've seen too many site audits done assuming there was no user-agent sniffing or that there was a JS fallback in place!
There is a danger of applying thinking from Google's Quality Rater Guidelines literally to 'how the algorithm works'. Example: the QRG asks the rater to check for Wikipedia articles on an entity, as it is "a good source of information about companies and organisations." This does not mean this is how the algorithm works. It means that for humans, checking Wikipedia is a reasonable way to litmus test for authority. It would not be reasonable to give humans a printout of the finely weighted thousands of variables a machine would look at. The human can do things a machine can't in many ways, so it's a check and balance for the algorithm.
Make sure redirects go to the canonical version of a page. A common mistake is redirects (a directive) going to a page, which then has a canonical tag telling Google (a hint) that another page is the canonical version.
You generally should use the NoIndex tag on your internal search pages. It's a poor UX for a user to go from a search page to a search page.
Simple one, but came in very handy for someone on my SEO course today! Try using the site: operator in Google to see what pages you have indexed. You do this just by doing a search for site:yoursite.com. Sometimes you'll be surprised by what is (or not!) indexed!
A blog is normally a terrible place to host 'evergreen' content such as how to guides. If your blog/news section is chronological, the content will slowly 'sink' down your site's hierarchy. It becomes more clicks away, harder to find for users and will become seen as a less important page for search engines.
If you're doing outreach and trying to get links from newspapers and you don't get a response, you don't have to give up there. Most newspapers will have multiple journalists covering similar topics, so it's always worth trying another contact - it certainly beats endless follow-ups to one person that just annoy them!
If you're starting out in SEO, I would invest more time learning about how search engines work and what they are trying to achieve, rather than specific "SEO tactics". Learning the foundations will give you a solid framework to make much better strategic decisions. This means reading less "10 ways to.." and more on subjects like Information Retrieval (IR).
Did a competitor copy your content and Google screwed up and is ranking it and not yours? There is a simple DMCA form you can complete that notifies Google. I've had very good/fast success - it will remove the offender's result and get you back to where you deserve. We discussed this in episode 30 of the Search With Candour podcast.
It's a long one, so buckle in! With product sites, it is almost always beneficial to users and search engines to have a "view all" page. There is almost no reason why you wouldn't. The most common excuse is "we have too many products". That's not a good reason. Firstly, there are easy tech solutions to this, such as lazy loading. Secondly, you're just side-stepping two other bigger issues: 1) The combined latency of clicking through multiple pages is greater than a single page - you're actually providing a slower overall experience. 2) Do your users actually want/need to look through 7,349 products? Is that useful? Is there not some initial filtering you can do? Do many people really click through to page 56 of your paginated results? There is lots of research to show users prefer "view all" and it's better for our robot friends.
If you're doing a crawl of your site and you're getting things like HTTP 504: Timeout errors, then your site is probably also timing out for Google - that's bad. In 2019 there is no reason why your site should buckle under the (very low) weight of having a crawl run against it. There is no reason most sites should not be run via a service like Cloudflare.
There's a Google algorithm update called 'BERT' you're about to be flooded with 'ultimate guides' to imminently. Let me save you some time: With this kind of update, there is nothing you can do to "optimise for it" the same way you cannot "optimise for RankBrain". Write better content for your users, focus on satisfying intent and maximising user experience. There is no 'SEO copywriting', there is just good content, bad content and the stuff in-between.
Make sure your broken pages (404) actually return a 404: Not Found header. I've seen many sites say "404: Not Found" yet they return a "200 OK" header. This is known as a 'soft 404'. "Returning a success code, rather than 404/410 (not found), is a bad practice. A success code tells search engines that there’s a real page at that URL. As a result, the page may be listed in search results, and search engines will continue trying to crawl that non-existent URL instead of spending time crawling your real pages."
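A quick spot-check for soft 404s: request a URL that definitely shouldn't exist and look at the status code. A minimal Python sketch, assuming the requests package and a placeholder domain:

```python
# Minimal sketch: a missing page should return 404/410, not 200 OK.
import requests

resp = requests.get("https://example.com/this-page-should-not-exist-xyz",
                    timeout=10)
if resp.status_code == 200:
    print("Possible soft 404: missing page returned 200 OK")
else:
    print("Status code for missing page:", resp.status_code)
```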
Redirect chains can cause issues. There is no reason why internal links on your site should be redirecting. If an internal link redirects, simply update the link to point to the end location.
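The requests library records each hop in response.history, which makes chains easy to spot. A minimal sketch with a placeholder URL:

```python
# Minimal sketch: print every redirect hop for a URL and flag chains.
import requests

resp = requests.get("https://example.com/old-page", timeout=10)
for hop in resp.history:
    print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
print("Final:", resp.status_code, resp.url)
if len(resp.history) > 1:
    print("Redirect chain detected - point the internal link at the final URL")
```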
It's generally accepted sub-folders perform better than sub-domains, sharing in the main domain's equity. If you're told "you need" to put part of your site on a subdomain, it is possible to display it to the user as a subfolder using a reverse-proxy. Easy way around a lot of technical issues and clears up SEO ones at the same time!
Google employing BERT into its algorithm is only affecting US results currently, so if you have seen any recent changes, it is likely not that. An interesting side note: Bing has been using BERT (Transformers) for ages now!
All other things being equal, two links from two different domains are worth more than two links from the same domain.
Really want to call yourself an 'SEO copywriter'? Did you know it's possible to query Google's Natural Language API with your (and your competitor's) content to see how Google is understanding entities and topics? It can be a great way to compare and highlight missed opportunities. Here is a great guide by JR Oakes.
Using anchor jump points within your content gives users a great way to quickly find the information they are after - but you can also get additional links within the SERP, which can help clickthrough rate.
SEO is not just about Google! Bing gets some great traffic for B2B searches, especially for organisations where the IT is locked down and they may be forced into using older browsers that default to Bing.
It's here! You can now try our free new keyword research tool AlsoAsked.com. This tool will mine Google in real-time for "people also asked" questions and show you how to group these into topics. We built this tool with Go after seeing the brilliant Python CLI version made by Alessio Nittoli, but realising most people are not comfortable running command line tools and installing packages. Enjoy!
If you are trying to use a desktop tool such as Screaming Frog on large sites, remember it is not always necessary to crawl every single page to find the main technical issues. Problems usually exist over templates, so auditing a sample will give you insight into what needs to be fixed quickly.
Technical SEO audit actions need to be prioritised. One of the factors you should consider is the difficulty and cost of implementing a change. It's a waste of everyone's time to try and push for a change with negligible impact if it means battling significant technical debt to achieve it; there will be other things you can focus on.
Things like robots.txt, noindex, redirects and server response codes can usually be changed by many parties and can quickly change how you're ranking. If you're working in a larger company, with dev partners or content teams, it is always a good idea as an SEO to use a tool to monitor for these types of changes so you can step in and save the day, rather than pick up the pieces!
If you're working with an SEO or an agency, their on-going focus should be on actions that drive results. A 'monthly audit' is not a thing. Audits are a great place to start - they identify issues, gaps, opportunities, help define strategy and roadmaps, but within a reasonable timeframe they are finite - and of course, as the last tip highlighted will be constrained by what is possible. Ongoing monitoring, especially on large sites is important - but it can be automated. Reporting and benchmarking is important, but again, it can mostly be automated. My opinion is, if someone is trying to sell you a 'monthly audit', there is a good chance it will not represent good value.
There are lots of reasons one person may get a different search result to another, but the impact of personalisation is generally over-stated. While things like geography and device may alter a search result, there is actually very little personalisation. Personalisation of search results normally happens in clusters, such as when Google works out you're doing a few related searches in a row. Apart from that, in the organic results there is actually very little that is "personal" to you.
Don't let a 'zero' monthly search volume put you off producing content. Over 90% of the key phrases in Ahrefs' database have <20 searches per month - this is what the longtail is. The important bit is this though: you're only seeing volume for that exact phrase. If you write the content well, there are a few hundred variations on most phrases that suddenly make it a lot more appealing - none of which will initially look appealing through volume data.
Clicking 'view source' won't actually show you what's going on with the page if Javascript is used to render the DOM. There's a great Chrome extension called 'view rendered source' which allows you to compare the rendered and non-rendered source. Really handy for making quick progress on technical SEO audits!
The quality of links is far more important than the volume. This is why setting targets on volumes of links doesn't tend to work so well and can be a very outdated approach to SEO.
"So what?". It's a great test for when you're producing content in an effort to get coverage and links. As ideas are formed and developed, it can be easy to get off-track and sold on your own idea. When you've got your story, data, headline - ask yourself, "So what?". Why would other people care? If you've got an answer to that, can you move onto the next stage.
Google Suggest data, such as that mined by AnswerThePublic, can give you incredible insights into the intent behind a search. For instance, take the screenshot below for a few suggestions on "Shard height" as a search term. Even from this tiny slice of information we can learn:
> People are searching in feet and with the spelling 'meters', which gives a strong hint that many searchers may be American.
> Shard vs Eiffel Tower: perhaps metres/feet don't mean a lot to some people. This tells us a picture comparison may be the perfect way to answer this query, allowing people to compare the height of something they know.
> Putting these two bits of information together, a visual comparison to well-known American and other European buildings would be a good area to explore.
Three really important insights from just two of hundreds of data points!
A little tech tip that not many people seem to know: if you have an ecommerce site and you want rich results but can't get schema on your site, you can actually achieve this by submitting a product data feed in Google Merchant Centre. You don't need to spend any money on ads and Google can use that structured data to enhance your search result.
Google ignores crawl-delay specified in robots.txt, but you can change this within Google Search Console. Keep in mind, if you're having to do this - it means you've probably got bigger infrastructure problems: There shouldn't be a case where Googlebot is causing your site to creak in 2019!
Edit: As of Jan 2021, crawl anomalies have been removed from GSC. Make sure you check the 'crawl anomalies' report in Google Search Console under 'Coverage'. While GSC will quickly report 404 and 5xx errors, the crawl anomalies report often gets overlooked but can expose serious issues such as frequent timeouts that are preventing indexing and ranking.
Canonical tags must be declared in the head, putting them in the body means they will be ignored. This can be very bad.
If you're trying to measure organic performance, especially at this time of year, you need to look at Year on Year (YoY) figures. If you're running an e-commerce site and you've seen an organic uplift in clicks/impressions over the last 30-90 days compared to the previous 30-90 days, this should not come as a surprise - it's coming to Black Friday, Cyber Monday and Christmas shopping season! If you really need to do this type of analysis, use Google Trends data to normalise your traffic to see if there is uplift after adjustment.
It's that time of year when you'll be doing sales, so make sure if you've got products or categories in special /sale/ URLs that you have canonical tags set up to the original pages and you 301 redirect them once the sale is over instead of killing the pages - sales attract links!
It's a good idea to implement http -> https 301 redirects before HSTS to make sure everything is working. You can leave the 301s in place as a fallback; Googlebot will treat them as 301s, but not all search engines behave the same.
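As a quick sanity check, here's a minimal sketch (using the third-party requests library and a placeholder domain) that prints the redirect chain so you can confirm the http version 301s straight to https before you switch HSTS on:

import requests

resp = requests.get("http://example.com/", allow_redirects=True, timeout=10)
for hop in resp.history:
    # each hop should be a 301 pointing at the https version
    print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
print("Final:", resp.status_code, resp.url)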
Sometimes your target key phrases can be close to impenetrable. Rather than waste resources trying to get positions with no return, it can be worth considering alternative (normally lower search volume) phrases for the same intent. A smaller slice is better than no slice at all! You can make this judgement on Google Trends, search volumes and cost per click data.
Search intent shift is when, at certain times of the year, the majority of the intent behind an individual search will change. A good example is 'Halloween', which switches from an informational to a commercial term near Halloween. As this happens, Google can change its rankings drastically to adapt to this intent. If you do see fluctuations in your rankings around specific events or times of year like this, it may well be that there is nothing "wrong" with your site - it's just not the best result to serve that intent, at that time.
If you're dealing with a site that has input from multiple teams, content people, product teams, external devs, I would strongly recommend some kind of ongoing cloud monitoring/crawling of the site - it can really save your skin. Yesterday, we detected a problem that got through a dev agency's QA that made the whole site appear blank to Google. On a site doing over £1M in revenue a month, you can imagine the fallout this would have caused if it wasn't spotted before it could do damage!
When setting up keyword tracking, it is useful to use multiple tags on keywords. This allows you to view how well you rank for a specific topic, product or service. This information can help form your SEO strategy, determining what you need to do around specific topics to get rankings. For instance, if Google 'likes' your site for a specific topic, just building new content means you'll likely rank for it off the bat. If you're struggling on a topic, you'll need to gain more authority, which a lot of the time comes down to getting links.
Consider whether ranking factors are query or non-query dependent when making strategic SEO decisions. For example, PageRank is a query-independent ranking factor: it applies the same to all sites. Content freshness is a query-dependent ranking factor: some queries deserve freshness, for others it does not matter. This means internal linking is important on all sites, while content freshness may not be.
Buyer beware: Despite some spurious claims to the contrary, there is no tool on the market that can tell you specifically what organic key phrase an individual user searched for and then clicked on. Only Google has that data and it doesn't give it out.
As tempting as it may be, I would not alter page titles during a site migration if possible. It's helpful to change as few things as possible to get a clearer view of whether there are issues. It's likely there will be opportunities you can take advantage of, but doing them in a linear fashion will usually provide the clearest route to success.
Don't be afraid to link out to other good sites when it is helpful, but linking out does not directly help boost your ranking.
I find it shocking they still exist - I guess it's because I am in the SEO bubble - but there are still large companies out there offering 'SEO ad platforms', which are basically adverts or advertorials that 'pass SEO benefit'. Avoid, avoid, avoid. They will put your website at a substantial risk of getting a penalty. While it's up to you if you want to follow Google's rules, if you're going to break them - at least do it well!
If you really want to show users something and not search engines, it's worth keeping in mind that Google ignores everything after a '#' in a URL... Has some "interesting" uses :-)
Extracting insights from Google Search Console can be hard. Luckily, Hannah Rampton (Butler) made a free tool to generate insights quickly from Google Search Console.
If you're unsure of where to start with link building, take one of your competitors that ranks #1 and put their website into the Majestic link tool and click 'backlinks'. This will give you a list of all of their best links and you'll be able to see what strategies they have been using to acquire them. Great and vital bit of initial research to carry out.
It's a good move to start using cloud-based tools instead of relying on desktop tools for spot checking. For instance, it's very easy to monitor your site for things like broken links or accidental updates to robots.txt with a cloud-based tool. It's far too common that on a first audit of a client site, I'll find dozens, sometimes hundreds of broken links. Unhappy users, unhappy bots.
You can get more mileage and links from your web content with collation and curation. If you're using a tool like AlsoAsked to answer questions about your products or product categories, you could look at combining all that information into a single guide page that can be used with outreach efforts - it makes it a lot easier to build resources to get links.
The pages that you want to rank well for higher volume terms should be linked "high up" within your site's hierarchy, such as the main menu or homepage. If you have a page that you expect to rank well that's 4-5 clicks away and only linked to internally from a couple of pages, you're likely to be disappointed!
An automated SEO audit report done by an online tool has almost zero value unless it's put into the context of your business by someone who understands both it and SEO. These tools will rarely give you a good action plan and will almost always produce false positives.
The Lighthouse 'SEO audit' score means pretty much nothing. You can score 100 and still have massive technical problems on your site. The Lighthouse audit only looks at that individual page, and it does so with no context about what you want to rank for or about your business.
Google is ending support for data-vocabulary on April 6th, 2020. You have until then to switch to schema.org or you will lose rich snippets. Here is Google's blog post about data-vocabulary and switching to schema.
Massive announcement by Google today. If you have a featured snippet ('position 0' result), that page will no longer also be repeated in the normal listings on page 1 of the search results. So if you see some big shifts/drops, this could be why!
Want to get rid of your featured snippet so you can appear in the 'normal' results, but worried about where you might rank if you remove the snippet? You can find out! If you're ranking for a snippet, you can add the &num=9 parameter to the end of your Google search to remove it and show where you would rank without it. If you're ranking well, you can then use the nosnippet tag on the page to stop Google showing a featured snippet.
Google does not "favour" long content. To be a little bit more specific: Long content has a higher probability of being better content and ranking better, but the actual word count does not matter. This means that targets like "this page must be 500 words" or "this article needs to be 1,000" are utterly meaningless. There is no logical reason why word count would be a ranking factor. Google has told us on multiple occasions, specifically, that word count is not a ranking factor - any decent SEO should tell you the same. If your SEO tells you word count is important, make sure you sack them with at least a 500 word email.
Answering peoples' questions can be key to creating good content. I mentioned in the last tip that content length is not important, but is usually correlated with better content. That means understanding intent and approaching it from a topic, rather than a keyword point of view. To this end, we made AlsoAsked - it allows you to explore the (People Also Asked) questions that Google generates and understand how they are grouped.
As I've mentioned in other tips, automated 'free' SEO audits are free exactly because they offer very, very little value. It's like an architect showing you the plans for a building: all the information is there, but you're not going to try and build your own house with that knowledge, are you? I'm so confident in this that I'm quite happy to run a free (automated) SEO audit for anyone who leaves their web address in the comments. Having an SEO expert look at your site, in combination with tools like this, in the context of your business, with your validation and a plan to implement - that is what gets results!
Google has launched a new removals tool within Google Search Console. This tool does three things (1) lets you temporarily hide URLs from showing in Google search, (2) show you which content is not in Google because it is “outdated content” and (3) shows you which of your URLs were filtered by Google’s SafeSearch adult filter.
Google uses upheld DMCA actions in its ranking algorithm. One more reason to report competitors that are copying/scraping you. If you need to report a competitor, you can do so here.
Google is working hard to understand "what" things are: companies, people, brands and work out how these "entities" are linked in their Knowledge Graph. It's useful to know how you sit in this and Carl Hendy has made a neat tool that requires no programming knowledge to explore the Knowledge Graph.
There is absolutely loads you can be doing with Google Sheets to help your SEO efforts, from pulling insights out of the rather basic Google Search Console to tools that help you detect cannibalisation. I don't know anyone better at Google Sheets than Hannah Rampton (Butler), and she was kind enough to come to #SearchNorwich and talk everyone through the QUERY function, which is the basis of how her tools are built.
The URL Removal Tool within Google Search Console does not do what it says on the tin. It only hides results from Google search results temporarily. This means if you want to permanently remove a result from Google you need to mark that page as "noindex". Remember: robots.txt does not stop pages getting indexed! More info on Google removals in episode 47 of the Search With Candour podcast.
Some sites will serve a different experience or content depending on your user agent. You can get a Chrome extension that allows you to switch to any user agent, including Googlebot. This is an incredibly useful diagnostic tool when you're dealing with sites that need things like server-side rendering, to audit what is being served to Google.
Google will tend to "like" your site for specific topics. For this reason, when you set up keyword tracking, it is a good idea to use the tagging system available on most of these platforms as it will allow you to slice ranking data by said topics and monitor performance for opportunities.
A simple SEO test for you that a lot of businesses can't do: choose a key phrase you wish to rank for and then simply answer: which one page do I want to send search engine users to when they type this, and does the page reflect that query or intent? "Homepage" is normally not the answer, and if it's "it could be this page or this one" then you have some problems to solve!
If you want a rough understanding of how Google currently views other sites' content, you can use the cache: command when searching. Try: cache:domain.com/your-page-url/ and you'll see a snapshot of how Google has processed the page. If you see huge chunks of content or navigation missing, they may well have an issue! Note: Google has specific tools for this within GSC for your own site.
A #Shopify hack - the Shopify platform won't let you edit the robots.txt file (Edit: June 2021 - they do now!). However, you can create a 301 redirect from /robots.txt to a file you control elsewhere. Google will still pick this up and it will function as a robots.txt!
Links are important and you can get easy links by using a tool like Majestic to find broken incoming links - that is, links to your site that currently point at broken pages. You can either get them to update the link (to where it is meant to go) or simply set up a 301 redirect at your end. To do this, log in to Majestic, enter your domain then go to Pages > Crawl Status > 404 and you'll get a full list of broken incoming links! Easy!
The advantage of 302 (temporary) redirects is that they will preserve the redirected URL in Google's index. This can make things a lot easier if you plan to change it again soon and avoid having to set up multiple redirects.
If your e-commerce site has a faceted navigation/filter that gets you to a product sub-category that has search volume, it is good practice to make sure this page is accessible by standard links (i.e. not checkboxes) and has its own URL so it can rank!
If you do use an automated audit, be aware they don't account for the size of your site, which will often dictate the magnitude of the problem. As an analogy, if you have a leaky tap in your bathroom, this isn't a huge problem - but if your house has 500,000 bathrooms and there is a leaky tap in every one of them, it's a big problem!
When you're getting an SEO site audit, generally the larger the site is, the more value technical fixes will hold - and the smaller the site is, generally content suggestions will carry more value. Here is a beautiful graph to demonstrate this.
It is absolutely possible to break Google's webmaster guidelines and trick it into ranking your site better than it deserves. However, all blackhat SEO is temporary. You're playing in the gap between the algorithm's ideal and the technological capability to enforce it. That gap is always shrinking, so you'll eventually get caught - plan for that day!
If your Screaming Frog crawls as Googlebot are being blocked (server doing reverse IP check), you can usually get around this by setting a custom HTTP Header X-Forwarded-For with a known Google IP. Here is a video to demonstrate how to do this.
It appears from the latest messages from Google that their mobile-first indexing rollout is going to be complete by the end of the year. So if you haven't already (where have you been?), it's a good time to check that your content and UX on mobile is matching up to your desktop experience!
Always use multiple tools to confirm data when SEO auditing. In the last week, I've found 3 bugs in some of the most used SEO tools which meant the data they were giving was completely wrong! Verify and don't take things for granted!
As of this week, Google is now treating "rel=nofollow" as a hint, meaning it can choose to ignore the nofollow attribute on external links and treat them as if they were PageRank-passing links! If you missed this update, check out episode 27 of the Search With Candour podcast.
Permanently discontinuing products on an e-commerce site? This can be a good general set of rules to think about.
Links on pages that are marked 'noindex' will eventually be treated as 'nofollow' links.
Don't waste time trying to assign value to individual tactics in isolation. To get results, you will need a combination. For instance, good nutrition and regular exercise can make you better at marathon running and their effects compound each other. It is hard to get results using one in isolation!
Optimising images for e-commerce sites is often overlooked. Lots of people start the purchase process by browsing Google Images, rather than the main search. This is compounded by the fact that product schema is now shown on some Google Images results - a huge opportunity there for most retailers.
If you're redirecting old URLs with links (say from a site migration), unless you can get the source link updated you need to keep those redirects in place, don't delete them!
If you're running Screaming Frog and getting Status Code '0', this means Screaming Frog is timing out before the web page responds (the error comes from the crawler, not the website or server). The default in Screaming Frog is 20 seconds before it times out, which is reasonable - but these errors will not appear in your server log files, so beware!
Internal link optimisation is important, especially on large sites. Sitebulb v3.4 is out today and it includes a new set of features to do some great in-depth analysis on internal links. Normally, this is some pretty laborious work!
This week, Google started showing PDF thumbnails in search. If you have a lot of PDF content, it's more important than ever to start tracking interactions as click-through rates will likely increase. If you're on Wordpress, Carl Hendy has a free plugin that can help you measure PDF usage on your site.
I would always recommend having an external XML sitemap as it provides an easily auditable list of indexable/canonical URLs. This makes it much easier to spot when rogue URLs are either linked to or are present in the index and getting traffic.
I think we're going to see a lot of changing behaviour over the next few months of people moving to e-commerce and online wherever possible. We've had a few clients "double-down" on their SEO and I think it's a good time to consider it, especially if competitors are slowing down. Websites are still going to want content, people will spend more time online, and the opportunity is larger than ever. I'd expect a lot of people to treat that with scepticism as I work in SEO, which is to be expected. I'm not posting that to get new business, and I hope the last 262 SEO tips I've posted put that in context - I genuinely think it's a unique time. Last 5 years of UK "Search Engine Optimization" topic searches as food for thought:
I use a Chrome extension that automatically highlights nofollow links on any webpage I am visiting. It's really useful for seeing how other sites use nofollow and for making sure I don't miss things when auditing sites.
Knowing which pieces of SEO advice are myths and which are misconceptions will save you a huge amount of time in the long run. We interviewed Natalie Mott in episode 57 of the Search With Candour podcast and spoke to her about exactly this subject.
If you have been affected by Coronavirus, you can take a look at the Search Starter pack organised by Dom Hodgson. It provides special free/cheap/extended access to a whole bunch of tools and services.
Search demand and intent are rapidly shifting as people move more of their lives online. There are a lot of things to take into consideration when marketing at the moment. We've made a Search with Candour episode all about helping companies with digital marketing and keeping their business going during #COVID-19 - I spent some time interviewing Kevin Indig, who shared his thoughts on episode 53 of the Search With Candour podcast.
Hugely important one to local restaurant/cafe type businesses, so please share to let them know! Google My Business has temporarily changed its policy on business names. You can add "delivery available" or "takeout available" to the business name. This will help you rank in the local box for these terms.
When crawling a site with and without Javascript, you may see key opportunities and issues with internal linking. For instance, "recommended products" may be powered by Javascript, so these important deep links may not be immediately visible to crawlers - meaning the subsequent pages won't rank as well as they could do.
Getting an overview of how your competitors are ranking and what they are doing can be invaluable. One of SEMrush's sales hooks lets you generate this report for free now.
When looking at site performance, the Chrome User Experience Report (CrUX) is some of the best data you can get. This is real-world data sent by users' browsers on how long your site is taking to load. If you have enough visitors, you can get this report in your Google Search Console.
If you need to pause your online business during Coronavirus - do not just disable or take down your website! Removing pages will mean they will be dropped from Google and you may struggle to get them to rank properly when you re-open. Here's episode 54 of our podcast where we talk about guidelines on how to pause your website.
If you have to put a pop-up or message on your site about Coronavirus, Google has warned that it needs to follow their guidelines about not blocking access to content - or your site's rankings could suffer.
If you disable checkouts or mark items as 'out of stock' while pausing your business, make sure any related schema (not visible to users) is also updated. If your schema does not match the information on-page, en masse, you could trigger a schema penalty in Google, which could lead to your structured data being ignored entirely - the last thing you need right now!
Keyword research should account for the terms that your customers are using, whether or not they are 'technically correct'. A related example I found today: We are hearing the word 'unprecedented' being used a lot in relation to Coronavirus. Google Trends suggests that there has been a significant amount of people that for whatever reason, have had to Google the meaning of the word. Definitely worth thinking about in a wider marketing and comms sense too!
Google has launched a beta COVID-19 announcement tool focussed on government agencies, official health authorities, schools and the like.
With lots of compounding factors, sometimes it is hard to predict the impact of technical SEO changes on large sites. In these instances, it is a good idea to run SEO A/B testing. Don't know where to get started? Try this guide from Portent.
Featured snippets are brilliant but they are not as stable as 'traditional' rankings - they can change and rotate very quickly. Google is also currently testing showing featured snippets outside the 'number 1' position. When I plan SEO strategies, I always class featured snippets as a 'bonus' rather than as a true ranking achieved.
If you're trying to block resources from being indexed when you can't edit the HTML - PDFs, for example - you can do this by delivering the 'noindex' directive via an X-Robots-Tag HTTP header. Don't try and use robots.txt or nofollows to stop things becoming indexed!
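If you want to verify the header is actually being served, here's a minimal sketch using the third-party requests library with a placeholder PDF URL:

import requests

# The PDF should come back with an X-Robots-Tag header containing "noindex"
resp = requests.head("https://example.com/downloads/brochure.pdf", timeout=10)
print(resp.status_code, resp.headers.get("X-Robots-Tag"))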
When trying to root out problems on modern web pages, it can sometimes be hard to get a quick and easy overview of where a specific page links to. You can configure Screaming Frog just to look at the links on one page by going to Configuration -> Spider -> Limits and setting crawl depth to "1". Then, enter the specific URL you want to examine in the top bar and you'll get a nice clear list of the resources and links on that single page.
Using the site: query can return pages that are not in Google's index, so it's better to use Google Search Console for this kind of debugging.
While the cache: operator gives a good hint as to what Googlebot has seen, it isn't reliable and there are newer tools for this job. Google recommends using the Mobile-Friendly Test or the URL inspection tool in Search Console. You can see loaded resources, JavaScript console output and exceptions, rendered DOM, and more information by clicking the more information link on the page verdict card.
How much is SEO worth? There are two basic ways to judge this and see how your business model stacks up at the same time. 🤑
(i) For any specific key phrase, you can get PPC data for what the market rate is for that visitor. This is a great way to answer 'how much would it cost us to buy this quantity and quality of traffic?'
(ii) The second is how valuable that traffic is to you. This means, if you get X quantity of visitors for Y keyphrase, what percentage of them do you convert into your desired commercial action and what is that action worth?
If you find that the number in (ii) is way lower than the number in (i), it means that other types of business - or the same type of business with another revenue model or better revenue efficiency - can derive more value from the traffic. This means they will be able to outbid you on PPC and out-invest you on SEO. It's a great bit of analysis to start with, and there's a rough worked example below.
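To make (i) and (ii) concrete, here is a back-of-the-envelope sketch in Python - every figure in it is made up purely for illustration:

monthly_searches = 5000
expected_ctr = 0.25           # share of those searches you think you could capture
cpc = 1.80                    # (i) market rate per visitor from PPC data, in pounds
conversion_rate = 0.02        # (ii) share of visitors completing the desired action
value_per_conversion = 60.0   # what that action is worth to you, in pounds

visitors = monthly_searches * expected_ctr
cost_to_buy = visitors * cpc                                      # (i) what the traffic would cost via PPC
value_to_you = visitors * conversion_rate * value_per_conversion  # (ii) what the traffic is worth to you
print(f"Cost to buy: £{cost_to_buy:,.2f} / month")
print(f"Value to you: £{value_to_you:,.2f} / month")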
Real-time monitoring of what your competitors are ranking for can be useful so they don't get a lead on you with good ideas. You can use a tool like Ahrefs to get an alert when a competitor site starts ranking for a new keyword. If you use Ahrefs you can find it in Alerts > New keywords > Add alert > Enter competitor’s domain > set report frequency > Add
#Shopify is not SEO-friendly "out of the box". One of its quirks is that it will generate multiple Collection URLs for a single product (and a URL without the collection). It tries to handle this through canonicalisation; however, most of the internal link structure still points to non-canonical links, which is bad. You can fix this by editing collection-template.liquid and removing the collection reference from where the hrefs are being generated. Easy.
If Googlebot encounters noindex, it skips rendering and JavaScript execution. This means any Javascript on a noindex page will never be rendered.
If you are using a sitemap to implement your hreflangs, don't forget that each URL entry needs to list the full set of language/region alternates - including a self-referencing entry for the URL itself - and those relationships must be reciprocal across every version.
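As an illustration only (placeholder URLs and locales, not a recommendation of tooling), here is a small Python sketch that builds that structure with xml.etree.ElementTree, with every url entry repeating the full set of alternates:

import xml.etree.ElementTree as ET

SM_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
XHTML_NS = "http://www.w3.org/1999/xhtml"
ET.register_namespace("", SM_NS)
ET.register_namespace("xhtml", XHTML_NS)

# Hypothetical alternates for one piece of content
alternates = {
    "en-gb": "https://example.com/uk/page/",
    "en-au": "https://example.com/au/page/",
    "de-de": "https://example.com/de/seite/",
}

urlset = ET.Element(f"{{{SM_NS}}}urlset")
for loc in alternates.values():
    url = ET.SubElement(urlset, f"{{{SM_NS}}}url")
    ET.SubElement(url, f"{{{SM_NS}}}loc").text = loc
    # every entry lists all alternates, including itself (self-reference)
    for lang, href in alternates.items():
        ET.SubElement(url, f"{{{XHTML_NS}}}link",
                      rel="alternate", hreflang=lang, href=href)

print(ET.tostring(urlset, encoding="unicode"))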
You'll hear a lot of SEO people talking about E-A-T (Expertise, Authority, Trust) and ranking websites. To be clear, "E-A-T" is not a ranking factor. It is a concept that combines many different metrics and factors to measure these things. For instance, good old-fashioned incoming links to your website can contribute to E-A-T. It's a useful concept but actually nothing particularly new. Keep in mind that how Google decides how to measure E-A-T will evolve as time goes on, too.
You can use Screaming Frog to validate structured data you've added by Javascript by enabling JS rendering (Config > Spider > Rendering) and then checking 'Google Validation' in the Spider Configuration (as in screenshot).
Remember when people talk about "Google's algorithm", it's not just one giant single algorithm. It's made up of lots of different algorithms, some core, some not, some applied back at Google's end, some applied right at the time of search.
How long should you leave redirects in place? Google's official answer is "as long as possible" - at least a year. If possible, forever.
Having an HSTS enabled site means that browsers won't even try and access the non-https version, you'll simply see a "307" redirect (which isn't a true server redirect). This can be tricky if you're trying to diagnose issues. You can actually use Screaming Frog to test what is happening behind the 307 redirects by disabling "Respect HSTS" policy in Configuration > Spider > Advanced
Google does not use data from Google Analytics to index or rank your website. No, they don't use "bounce rate" as a ranking factor, either. If people say they do, that's just their own wild theory.
Great discussion yesterday with Sam Pennington and Dan Wheeler on Google using engagement data. Google absolutely do use click data to judge results of experiments (no specifics on how it impacts individual sites) but below is an amazing talk by senior Google engineer Paul Haahr about the challenges in using click data for even basic decisions. Paul Haahr's talk: https://www.youtube.com/watch?v=iJPu4vHETXw&t=619s
The impact of a "ranking factor" can change massively by industry or niche, even with time of year. Don't take it for granted that things working well for others will necessarily translate into success for your website.
Googlebot's default timeout = 180 seconds.
Descriptive anchor text used internally within your site is not only good for users, it really helps search engines figure out how to rank pages - the impact on big sites is especially noticeable. Sitebulb has a new "Link Explorer" tool which allows you to easily view and manage internal anchor text. Make sure you review key pages on your site, check that internal anchor text roughly equates to what you want those pages to rank for, and kill off those "click here" and "learn more" anchors!
The most reliable way to get an overview of your site's index coverage is to use Google Search Console and look at Coverage -> Valid. The site: command can return wildly varying results and will actually return pages not in Google's index. I discussed this on episode 58 of the Search With Candour podcast.
Google scores pages on both 'query independent' and 'query dependent' signals. Examples of query independent signals are: PageRank, language, mobile-friendliness. Examples of query dependent signals are: keyword hits, synonyms, proximity.
While not as powerful as they used to be, an Exact Match Domain (EMD) can trigger Google into thinking a search term is a "navigational" query (i.e. the searcher is specifically looking for that site). For this reason, exact match domains still tend to punch above their weight in terms of ranking.
🚨 Blindly following the below flowchart is not a good idea 🚨
If you have content that isn't driving traffic or getting links, don't just delete it! There is a whole process you can go through first - reviewing, updating, consolidating and redirecting - before removal should even be on the table.
In general, I noindex "tag" pages. Why? Because nobody wants to land on a tag page from search making them conduct another search/filter action to find what they want. They are generally "low quality / thin" pages that don't offer a great user experience. If at all possible, I avoid having them as part of the build.
While it's nice to have one H1 on a page to be clear about the subject, having multiple H1s in a template isn't going to cause you any SEO issues. It is highly likely there will be other things that are more valuable to spend your time on.
When you're internally linking, you should generally try and link to the "canonical" version of a page. If you start linking to non-canonical variants, Google can choose to ignore your canonical tag and start ranking the 'wrong' page.
Make sure all of your important pages are accessible by normal, clickable links. Sounds silly, right? However, relying on functionality such as dropdown boxes as the only way to access pages makes it unlikely search engines will be able to discover them easily, or at all.
Just because a page is crawled and discovered by Google does not mean it will be indexed and appear in search.
The 'correct' title tag length is not a number of characters. It's simply the length required to uniquely and succinctly describe the page content.
When declaring hreflang tags to deal with internationalisation, they must be reciprocal. That means if the English version of a page says "the German version is here", that German version must also have a tag that refers back to the English version. If the relationship is not reciprocal, the hreflang tag will be ignored.
Largest Contentful Paint (LCP) is a new metric in the Lighthouse performance report. It marks the point during page load when the primary (or "largest") content has loaded and is visible to the user. LCP is an important complement to First Contentful Paint (FCP), which only captures the very beginning of the loading experience. LCP provides a signal to developers about how quickly a user is actually able to see the content of a page. An LCP score below 2.5 seconds is considered 'Good'.
One of the most common mistakes I see on site migrations that causes a loss in search traffic is not redirecting non-canonical URLs. Non-canonical URLs (such as those with tracking parameters in) need redirecting too - especially if they are getting links or traffic!
Do unlinked mentions of your brand help? Google recently replied with "usually not" - but interestingly, Google have a 2014 ranking patent (see image) that describes consideration of unlinked mentions. My opinion is they are mainly used for discovery, entity identification, and perhaps some sentiment analysis. Here's episode 62 of the podcast discussing unlinked mentions.
Your CSS can affect how much PageRank passes through a link.
Google places many SERP verticals (images, news etc) based on their understanding of searcher intent. In my experience, how high up Google shows "People Also Asked" results will give you a good indication to the breadth of intent in a query. For queries where the "People Also Asked" questions are at the top, it means Google is trying to clarify the intent. If you're trying to rank for these terms, this should be reflected in your content.
If your site has an internal search bar, this can be gold for determining if you need to change your keyword targeting or make more pages. See what people are searching for and make sure you have pages that are optimised for this intent.
By using a 410 instead of a 404 you speed up the process of deindexing the page. The difference between a 404 and a 410 status code is that 404 means “Not Found”, while 410 stands for “Gone”. The signal 410 “Gone” sends is much stronger for search engines and makes it clear that you removed the URL on purpose. That’s why they’ll deindex it faster. via ContentKing
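As a rough sketch of what this looks like server-side (assuming a Flask app, with a hypothetical list of removed slugs - your own stack will differ):

from flask import Flask, abort

app = Flask(__name__)

GONE = {"discontinued-product"}   # hypothetical slugs removed on purpose

@app.route("/<slug>/")
def page(slug):
    if slug in GONE:
        abort(410)   # 410 "Gone": deliberate removal, deindexed faster than a 404
    return f"Content for {slug}"

if __name__ == "__main__":
    app.run()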
The svg format is supported in schema for things such as LocalBusiness Image :-)
We now know that the new 'web vitals' metrics, apart from being important for users, are going to be ranking signals in 2021. There is a Chrome extension that allows you to see all 3 web vitals on any page you land on - and it gives you an easy traffic light system! Listen to episode 62 of the podcast on the new web vitals.
If you want Google to quickly index a change on a page such as an optimised title or correcting a mistake, you can use Google Search Console. Go to URL Inspection -> Enter URL -> Request Indexing and you'll be put in a queue to have a visit from Google ASAP!
PPC and SEO should work together in so many ways. If you have PPC data then use it for your SEO efforts! Buying traffic for keywords can be a huge benefit, allowing you to "scout them out" and see if they convert before you put huge amounts of effort into ranking for them only to find out they don't convert for you!
Loads of chat about nofollow and guest posts recently, so I will share my experience as a tip. The truth is, if you're smart with it, guest posting (even paid) will definitely get sites to rank. I have affiliate sites ranking from pretty much nothing but paid guest posts (gasp, don't tell on me!). However, I wouldn't use this as a tactic for a 'real' business that needs to be around for the long term. The net is closing and there are better things to invest in for the long term - and that is what SEO is, a long-term game. The difference is, if an affiliate site gets a penalty or drops in rankings, it's easy to just take the money and move on; it's certainly not the same with most businesses and brands with equity!
CLS is "Cumulative Layout Shift" and is a metric Google has picked to measure good user experience. CLS measures how much a page "moves around" as it loads. We've all seen it, as a page loads we go to click on something and then it suddenly moves! In 2021, CLS will be incorporated as a ranking factor - meaning if your site does this, it will negatively affect your rankings.
Where you have the same content on two different URLs and the different URL does not serve a purpose (it's not a parameter that sorts or filters, for example) you should be using a 301 redirect, not a canonical tag, to combine them. As an example, if you have both the www and non-www version of your site, pick one and use a 301 permanent redirect to it. I saw an example yesterday where both versions were accessible but the SEO had opted to implement canonical tags instead of redirects. Canonical tags are hints and can be ignored, especially when there are inconsistent signals, such as external links you can't control!
In Magento, the "Auto-redirect to Base URL" option will automatically redirect users to your base URL (e.g. from www to the non-www) - however! The default redirect setup is a 302 (temporary) redirect, make sure you change the value to a 301 (permanent) redirect!
If you're dealing with a large site, trying to optimise titles and meta descriptions manually is going to be inefficient and give low return for your efforts. At the very least, I would start with an 'optimised' template that can be automatically generated. You can either focus on optimising the few most important pages manually or on other higher priority tasks. There are also some really cool options to use trained models to generate good meta descriptions now!
Inlinks is a tool I have started experimenting with made by Dixon Jones. One thing it does that I really like is that it will identify all of the 'entities' within your content and compare this to the content and entities of the top-ranking pages in your niche. It's a really fast and objective way to see how your topic coverage aligns with competitors and ultimately user expectations. More info: https://inlinks.net/?fpr=mark30
A basic for business owners that I see most businesses miss. There is a "Posts" feature within Google My Business that allows you to post COVID-19 support updates, offers and events directly to the SERP. It's an easy way to control the search result, expand the real estate you are taking up and control the message you want to deliver at that moment!
Getting hreflang tags correct can be tricky. If you're using on-page hreflang tags, you can audit them with Screaming Frog by selecting ‘Crawl’ and ‘Store’ Hreflang under ‘Config > Spider > Crawl’. This will help you quickly identify where you have issues!
Now that Google has been linking directly to text fragments within featured snippets, it's possible to set up tracking in Google Analytics to see if clicks are coming from featured snippets and where on the page they are going to! A brilliant idea that Brodie Clark wrote up after discussions with a handful of other SEOs.
This is the anatomy of a web address. All parts of it can have an impact on ranking. To name a few:
Protocol: Google prefers secure vs non-secure.
Sub-domain: I won't start the sub-domain vs sub-folder debate 😬 Google claim they are treated equally, but I have seen "confusion" many times.
Domain name: Exact match domains can trigger Google into thinking a generic search term has navigational intent, so EMDs can punch above their weight sometimes.
Top-level domain: Many factors, such as ccTLDs (country-code TLDs), can make it easier to rank in a specific geographic location.
Keyword Difficulty (KD) is a proprietary metric that is calculated differently by many different tool vendors, so be very careful if you are going to make any decisions based on this metric! Here is a fantastic example that Chris Ridley shared earlier in the week where SEMrush and Ahrefs rated the same keyword as "Very Hard" and "Very Easy". Personally, I completely ignore KD metrics when making SEO plans.
We've covered how #Shopify has the bad habit of linking internally to non-canonical URLs and that canonical tags are only a hint, so this can lead to problems. How to demonstrate that? Sitebulb has an awesome feature when you connect it to Google Analytics - it can report on when canonicalised URLs (i.e. the 'wrong' URLs) are receiving organic traffic. It's an objective way to demonstrate that the internal linking is causing problems with ranking!
Google will index and rank content that is hidden, such as in tabs, absolutely fine. The only difference is that content that is immediately hidden when a page loads won't be shown in snippets by Google. Google used to specifically tell us that content hidden behind things like tabs would be de-valued, as the logic was "it can't be that important" if it's immediately hidden from the user. Since the mobile-first update they have been very clear that this is no longer the case. In a mobile-first world, Google "realises that screen real estate is at a premium" - meaning the best user experiences will sometimes include content that is initially hidden. Despite this, the myth has stuck around - even this year, in April, John Mueller was working hard to dispel it. It really isn't something you need to worry about! Here's Google confirming in April 2020 that hidden content is not de-valued.
As long as your code functions, it does not need to be W3C compliant - this is not a "ranking factor" as you see on some audits. To quote Google: "As long as it can be rendered and SD extracted: validation pretty much doesn’t matter."
Regex (regular expression) is now supported in Google Search Console. This means you can get much quicker and more specific data pulled on things such as brand vs non-brand search queries 🎉🎉
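For instance (the brand terms and queries below are just placeholders), the kind of pattern you'd paste into GSC's "Custom (regex)" query filter to split brand from non-brand might look like the one tested in this little Python sketch:

import re

# GSC uses RE2 syntax; this simple alternation is valid in both RE2 and Python's re
brand_pattern = re.compile(r"candour|with ?candour", re.IGNORECASE)

queries = ["candour seo agency", "seo agency norwich", "withcandour podcast"]
for q in queries:
    label = "brand" if brand_pattern.search(q) else "non-brand"
    print(f"{q} -> {label}")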
Big site? You'll get more pages indexed with lots of small sitemaps, rather than 50k URL sitemaps. Don't know why - it shouldn't be that way - but that just appears to be the case. Here's some more info on the alternative approach to XML sitemaps.
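A minimal sketch of the idea - splitting a big URL list into smaller sitemap files (10,000 URLs each here, instead of the 50,000 maximum). The URL list and file names are placeholders, and you'd still reference the files from a sitemap index:

CHUNK_SIZE = 10_000
urls = [f"https://example.com/product-{i}/" for i in range(250_000)]  # placeholder URLs

for n, start in enumerate(range(0, len(urls), CHUNK_SIZE), start=1):
    chunk = urls[start:start + CHUNK_SIZE]
    with open(f"sitemap-{n}.xml", "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        f.writelines(f"  <url><loc>{u}</loc></url>\n" for u in chunk)
        f.write("</urlset>\n")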
"Did the user adjust or reformulate their query?" is something Bing specifically listed as a ranking factor and I have a high degree of confidence Google does the same. Reformulating a query gives a strong signal that the searcher's results do not match intent. Therefore, especially on "broader" search terms it is useful to have a first-mover advantage, looking at "people also searched" and "people also asked" data to try and see how you can improve your content to catch the formulations - and most likely - in the longterm, the ranking for the original query, which will likely change.
If you get a manual action for links, remember that when you have this penalty "removed" you will not go back to the position you were in previously. This is because the links that you had, which were deemed manipulative will either have to be removed or discounted. This means that trust, popularity, link equity, whatever you want to call it - will need to be replaced. This is another reason why using manipulative link building is not to be seen as an "investment" like a lot of SEO is.
Domains with existing links that have been there for a while are hugely valuable. This is going to sound basic - but make sure you have some kind of alert setup for when your domain is going to expire. Just this week, we saw Google lose one of their Blogspot domains to a dropcatcher - it will be ransomed back to them at a huge price because they don't have a trademark. Don't let this happen to you! Little Warden can help monitor your domain to avoid nasty surprises.
If a newspaper is trying to sell you online advertorial links "with SEO value" then they are potentially putting you at risk. This question came up on today's webinar: "Are backlinks from sponsored content (eg we pay for editorial piece in online newspaper that says sponsored on top of article and in url) Worth less than backlinks from an organic piece (that didn’t have sponsored on article heading)?" Sponsored links should be marked with "nofollow" tags (or rel="sponsored") and not pass any PageRank. Of course, if you want to break Google's Guidelines, that is fine - but having followed links on a "SPONSORED" section of a newspaper is an easy way to at best, get those links discounted.
Google still uses "non-supported" schema types to understand page content better. So if you're adding schema to your page it is best to do a "full" job, rather than just using the list of schema types where Google can generate a special result. Here is a video confirmation of this from Google's John Mueller. (h/t to Shay Ohayon for making me aware of this!)
Don't disavow links just because they have low Domain Authority (DA) or Trust Flow (TF), etc.
Using an estimated global "click-through rate" is not that helpful because, depending on the search intention and therefore SERP layout, the CTR for position no1 in Google will vary between 13.7% and 46.9% according to a study by SISTRIX.
If you're building a new site, you should be getting SEO specialists involved right at the start. You can get so much additional value from mitigating potential issues (prevention is cheaper than cure) and spotting additional opportunities such as schema or things you didn't consider from keyword research. SEO is not something you "do at the end", it is a continual process.
"SEO is an art, not a science" is a really interesting quote from John Mueller from Google from episode 3 of their 'Search Off The Record Podcast'. John is referring to the way he likes to think of Googe ranking pages (which he states isn't a model forhow it's happening) as a neural network, meaning there are many "paths" that you can take to get the same outcome. This means a successful combination of factors for one site, replicated for another site, may not produce the same result and/or there are multiple paths, through multiple combinations of factors that can produce the same result. The important takeaway is that things that might be considered a "ranking factor" are not always possible to analyse in insolation in regard to "how big of a ranking factor is that?". A lot of the time, a successful SEO strategy will be just doing "what is right" for your site in a big picture view, rather than trying to justify, quantify and cost a specific factor.
It's important to accept that all "link building" carries some kind of risk. The only way you can guarantee links is if you place or pay for them, which carries the risk of breaking Google's guidelines - or, at the other end of the scale, you create content and do outreach, which is not guaranteed to get you links, so you risk wasting time and effort. There are no guarantees in marketing!
Whether a link is 'counted' by Google is a decision made in the context of the rest of your link profile. Google has stated that if you have "on the fence" links that they can't classify, a good link profile may mean you get the benefit of the doubt on those links, while a bad link profile may mean all those links are discounted too. Much like if someone you trust and someone you don't trust both tell you a hard-to-believe story, you are more likely to give the trustworthy person the benefit of the doubt!
Google Web Stories are an AMP format which is another avenue to explore to get more search traffic. If you're running a Wordpress site, Google recently released an official plugin to help you make them. Google Web Stories: https://developers.google.com/search/docs/guides/enable-web-stories
There is no more or less "trust" in any particular TLD you buy. For instance, a .UK domain is not 'trusted' by Google more than a .TK domain*. Yes, some domains are used more for spam (and there may be a perception problem) but you start on an equal footing algorithmically. *I am referring to a concept of trust, ignoring the fact that ccTLDs, for instance, may have an advantage over gTLDs for geographic reasons.
The 'News' filter that is now present in Google Search Console will only report clicks that happen within the "News" tab of Google Search. It won't report on news clicks that happen in the "All" (Top Stories) results. I discuss this on episode 71 of the Search With Candour podcast.
Spend longer in the ideation phase of your Digital PR campaigns. I have seen too many link building and PR campaigns fail because "okay" ideas got off the drawing board. A "great" concept is 100x easier to get links for than a "good" concept. Invite criticism, ask questions, and talk through what else you can do with ideas.
To do logfile analysis, you'll likely need to combine both server logs and logs from your CDN, which can be daunting. Luckily for you, Suganthan Mohanadasan has produced the most impressive guide to this I have ever seen.
You can use Google's Indexing API to "directly notify Google when pages are added or removed. This allows Google to schedule pages for a fresh crawl, which can lead to higher quality user traffic." Google says only to use it on Job Ads / Streaming pages, but it actually works on any page. Here's the quickstart guide to the Indexing API.
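For context, here's a hedged sketch of what a call looks like using a service account with the google-auth Python library - the key file path and URL are placeholders, and you should check Google's quickstart for the current scopes and quota rules:

from google.oauth2 import service_account
from google.auth.transport.requests import AuthorizedSession

SCOPES = ["https://www.googleapis.com/auth/indexing"]
ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

credentials = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES)   # placeholder key file
session = AuthorizedSession(credentials)

response = session.post(ENDPOINT, json={
    "url": "https://example.com/jobs/seo-executive/",  # placeholder URL
    "type": "URL_UPDATED",
})
print(response.status_code, response.json())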
You can improve your CLS score on infinite load pages by removing any footer with content in it that continues to get pushed down. If you think about it, users won't be able to access it anyway, so it's likely just frustrating them!
Page speed is not a huge ranking factor at the moment. Page speed is super important for loads of other reasons, but you're not going to directly lose rankings because a page takes 5s instead of 3s to load.
If you are having to merge or change URLs, including pages that rank or show things like featured snippets, it is always a good idea to copy over verbatim the paragraphs that are ranking when you set up the redirect, to try and maintain those positions. I have just moved a page that was ranking no1 with a featured snippet to another URL and maintained the snippet.
If you're stuck in a rut for content ideas, using a tool like BuzzSumo can quickly show you which content is popular and being shared around a topic. It's a great way to kickstart your ideation process!
It's possible to serve different web experiences based on user-agents, meaning if your site has specific issues with bots, you can serve a "bot-friendly" version to Google, Facebook and the like.
The $ sign works as an end-of-URL anchor in terms of pattern matching for robots.txt. For example, if you wanted bots not to crawl car.php but you did want them to crawl all of the model pages such as car.php?model=1 and model=2, then you would use the rule:
User-agent: *
Disallow: /car.php$
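To see why that works, here is a tiny illustration (not a full robots.txt parser) that converts the Disallow pattern into a regex the way wildcard-aware crawlers treat it and tests a few paths:

import re

def robots_pattern_to_regex(pattern: str) -> re.Pattern:
    # '*' matches any run of characters; a trailing '$' anchors the end of the URL
    regex = re.escape(pattern).replace(r"\*", ".*")
    if regex.endswith(r"\$"):
        regex = regex[:-2] + "$"
    return re.compile("^" + regex)

rule = robots_pattern_to_regex("/car.php$")
for path in ["/car.php", "/car.php?model=1", "/car.php?model=2"]:
    print(path, "->", "blocked" if rule.match(path) else "allowed")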
Many automated SEO audit tools will return "duplicate content" issues for URLs that correctly use hreflang tags. If, for instance, you have almost identical pages for an English (UK) and English (AU) page that correctly use hreflang tags, you don't need to worry about "duplicate content". One of many examples where automated tools can give false positives!
I'm going to say it as a reminder because it's the second time in the last 30 days I have dealt with this! When you're pushing a new site live, do a quick scan to make sure you haven't left noindex tags or robots.txt disallow rules in place that block crawling and indexing - otherwise, it's not going to be a great start to a site launch!
"Crawl budget" (when SEOs talk about it) is a concept that refers to the number of pages Google may crawl on your website. It's normally only a thing that sites with lots of pages need to worry about - but if you've got a site that is hundreds of thousands of pages, it is often beneficial to think about optimising where you are sending robots.
If you're doing a crawl with Screaming Frog, it's a good idea to do at least one of them with the Spider set to "HTML only". This will give you a much closer view of what Googlebot is seeing when it looks at these pages before they are rendered by Caffeine. The option is in Configuration > Spider > Rendering
W3C validation is not a ranking factor. If you're being told to look at W3C validation "for SEO", this should be a red flag. W3C validation is useful to avoid errors - and of course, if your HTML is utterly broken, that can cause issues - but strict validation itself is not going to affect rankings.
Fetch & Render is a much better way to see how search engines understand your pages than relying on cache commands. If you want to fetch and render a page that you don't have Google Search Console access to, Sitebulb has a single-page Fetch & Render tool which is a really fast way to check how a page will look to a search engine, based on a specified user agent!
If you're running multiple locales over sub-folders, it's really helpful to set up separate properties within Google Search Console. This gives you a super quick way to get insights into each locale, without having to worry about filtering data - a huge time saver!
Ranking fluctuations are normal. Even changing nothing on a site, you can see positions rise and fall a couple of spots on a weekly or even daily basis. There are a lot of moving parts, thousands of algorithm tweaks over the year, changes in the link graph that is powering you and your competitor's sites, competitors changing things. Don't be too quick to assign action or inaction to these small changes. Drastic changes or trends you can see over months are what you need to act upon.
Don't fret if you see different rankings on the same search term on two different computers. Even on an identical computer, location, IP, time, signed in (or not), it is possible to see the same site ranking in two different positions. Google's infrastructure to serve results is vast, shifting and there is no one instance of "the index" - just a norm that everything is synchronising towards.
Based on the Core Web Vitals, which we know are going to be a direct ranking factor in 2021, Google Chrome 85 will start marking "fast" pages in its UI if the page provides a good experience based on Core Web Vitals. Chromium blog announcement: https://blog.chromium.org/2020/08/highlighting-great-user-experiences-on.html, Core Web Vitals as ranking factors: https://withcandour.co.uk/blog/episode-63-core-web-vitals-as-ranking-factors-and-discovery-ads
If your Screaming Frog crawl is taking ages and running out of memory because it's getting stuck down a rabbit hole, you can live edit the exclude list to get the spider back on track. Take note of where it got 'stuck' as it will be a great place to circle back to with log/tech analysis (h/t Nick Wilson / Fabrizio Ballarini)
Your sitemap should not include links to pages that 301/302 redirect, non-canonical pages, or pages you don't want crawled or indexed (such as robots.txt-excluded or noindex-tagged).
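If you want to sanity-check this yourself, here's a rough Python sketch (it assumes the third-party requests library, a standard XML sitemap, and a made-up example.com URL) that flags sitemap entries which redirect or send a noindex header. It doesn't check meta robots tags in the HTML or canonical mismatches, so treat it as a starting point rather than a full audit:

import requests
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def audit_sitemap(sitemap_url):
    # Pull every <loc> entry out of the sitemap
    xml = requests.get(sitemap_url, timeout=10).text
    urls = [loc.text for loc in ET.fromstring(xml).iter(f"{SITEMAP_NS}loc")]
    for url in urls:
        resp = requests.get(url, allow_redirects=False, timeout=10)
        if resp.status_code in (301, 302, 307, 308):
            print(f"{url} redirects ({resp.status_code}) to {resp.headers.get('Location')}")
        elif "noindex" in resp.headers.get("X-Robots-Tag", "").lower():
            print(f"{url} sends a noindex X-Robots-Tag header")

audit_sitemap("https://www.example.com/sitemap.xml")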
You can instantly import all URLs from a sitemap file into Google Sheets using: =IMPORTXML("https://www.mysite/sitemap.xml", "//*[local-name()='url']/*[local-name()='loc']"). This blinder was courtesy of Steph Whatley.
If you need to quickly deploy schema but can't get access to the codebase, your CMS doesn't support the type you're after, or you have some other problem, it is possible to add JSON-LD schema via Google Tag Manager.
Even if structured data doesn't give you a rich result, it will help search engines understand the page better.
Adding a blog post to 2 or more categories does not "give you duplicate content". This is something I saw given as advice today and worth expanding on. Most content management systems (like Wordpress) allow you to create blog posts that have a URL that is agnostic of its category - or at least, set a "primary" category. This means, whichever category it is linked to from, it only has 1 accessible URL. One URL = no duplicate content.
Even if your e-commerce site doesn't use categories in product URLs, it is good to have a "primary" category assigned in the backend; this will allow you to generate breadcrumbs on the page when a user lands directly on a product page from the SERP without completing a journey on the site first.
When writing content, it sometimes pays to reformulate headers as questions, rather than relying on the user to be "in the flow" of the content to understand the context. This allows parts of information to be self-contained, easily scannable - and often reflects how people phrase search terms, meaning you're more likely to get a featured snippet result. (This is also backed up by SEMrush's Featured Snippet study, that found Q&A sections regularly managed to get into snippets)
When trying to figure out which angle to take with content, I will always look at how the site is currently ranking. If the site has a good backlink profile and already has some good rankings, a content strategy that focuses on the "breadth" of content can produce quick results. In these cases, sometimes just creating pages can get you ranking and traffic, and you can then circle around and improve them later. Where this isn't the case, I tend to focus on fewer, big "one-off" pieces to try and win more links to get us into this situation.
You can't always determine what the best content format is just by what Google is currently showing, you have to use common sense! I have seen companies decide not to make video content because Google was not showing video results for their search terms, only to have competitors do it and then take those top spots. Just because Google is not showing a specific format of results does not always mean that it won't, it can just mean that the right content in that format isn't there yet. Video is definitely a format where there is still lots of opportunity to land-grab in many SERPs.
Don't let the perceived "weight" of ranking factors dictate your strategy. "Is SEO 90% links and 10% everything else?" is a question I had last week. Links are still the main driver of rankings for most competitive terms, however, it would not be a good strategy to put 90% of your time into building links and 10% into everything else. Leaving aside the fact that technical flaws could diminish any chance whatsoever of ranking, let's talk about content. If you can't honestly look at a piece of content you have that you want to rank and say "this is definitely better than our competitor's", then why are you expecting Google to rank it? You'll spend vastly more on link building to get it to rank, and then when it does rank, those links won't keep coming in unless you continue to invest in building them. Spending more time making sure the content is right means your link building will be much easier and a better return - and the links will keep on coming after the effort stops.
If you're doing outreach, it's okay to contact some key websites prior to putting any work into the content. Ask them if they are interested in the idea, get their feedback on what they might be interested in. You can work together to make something mutually beneficial and save a campaign from otherwise flopping!
You can't fight intent and you'll waste a lot of money doing so if you try. Let's take the insurance industry as an example. For many insurance terms, comparison type websites take the top positions. This is reflective of user intent: they want comparison sites to be top, not lots of single vendors they need to browse through one at a time. In situations like this, doing keyword research to find other, more niche terms as a way in - and simply making sure you are visible on the sites that are ranking there - is a completely valid SEO strategy, otherwise you can be banging your head against a wall chasing terms you "want" and making very little progress.
Getting a 'green light' on Yoast, getting a 100% Site Score on SEMrush, or whichever tools you use does not equal SEO success. Don't use these things as metrics, use them to help you uncover blind spots and feed things into your plan for prioritisation.
Don't use the rel="canonical" tag on paginated page sets to point page 2,3,4 etc back to page 1. If you have a view all page, you can do this, otherwise, these canonical tags should be self-referential. This is one of the most common misuses of canonical tags I see and it means they commonly get ignored. Other top mistakes: https://webmasters.googleblog.com/2013/04/5-common-mistakes-with-relcanonical.html
Be careful if you are letting dynamic content that users can generate be indexed. For instance, the Indeed site (which is a bit 'naughty') allows their internal search pages to become indexed to try and capture longtail searches. Unfortunately, it means things like this happen 😂
When using structured data, there are "required" and "recommended" fields. Missing recommended fields will get you "Warnings", but the structured data will still be valid and work fine. Missing "required" fields will result in "Error" warnings that stop the structured data from working at all. Make sure you're clear with your developers about which fields are "required" so they are included, and try to make the information accessible to get those extra "recommended" fields in too!
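As an illustration, a tiny check like the Python sketch below can catch missing "required" properties before the testing tools do. The required list here is just an example - check Google's documentation for the actual required/recommended properties of the type you're using:

def missing_required(schema, required_fields):
    # Return any "required" properties absent from the schema dict
    return [field for field in required_fields if field not in schema]

product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Blue widget",
}
# Example check only - the real required list depends on the schema type
print(missing_required(product, ["name", "image", "offers"]))  # ['image', 'offers']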
Whenever you're embedding a video of yours, even if it's from YouTube, it is worth including Video schema to help search engines understand the content, provide additional meta information and improve your chances of being visible in the SERPs.
If you have a very small website, say 15 pages, the photo shows what an "in-depth technical SEO audit" is like for you. I've had people asking me to do technical audits on sites as small as 8 pages and I have said no. You won't get much value from them, any SEO expert worth their salt should be able to spend 30 minutes and give you a shopping list of what you need to do, it's not complicated. Apart from a few edge cases, you'll be much better off focusing on other priorities.
Hitting "view source" is not a reliable way to see if a page has a noindex tag, I have seen a few people confused by this. Firstly, noindex tags may be added in by Javascript "after the fact" so you need to view the rendered DOM (F12 in Chrome), not the source. Secondly, it is also possible to deliver a noindex instruction via HTTP headers with X-Robots-Tag. You can use a browser extension that makes all of these easy to detect. Chrome extension: https://chrome.google.com/webstore/detail/seerobots/hnljoiodjfgpnddiekagpbblnjedcnfp
You can declare an hreflang "twice" for a single URL and it is valid. For instance, you can say a URL is both EN-GB and just EN (English language). This may sound redundant, but there are reasons, such as products may only be available in specific countries. Hreflang can get complicated! :)
Your URL structure is not your site structure. You can structure a website with internal links any way you like, independent of URLs.
For your most important pages, ask yourself "how easy is it for a user to find a link to this page?". Without any technical knowledge, this will give you a good idea if your internal linking is good. Do they have to go through a menu, then sub-menu to find that page? Is it hidden away in a footer? Or is it a clear link on a popular page such as your homepage?
Google now supports shippingDetails markup that you can add to your Product/Offer structured data. This will allow those without a Merchant Centre shopping feed to display shipping details directly in the SERPs - which can be important for buying decisions. More info: https://webmasters.googleblog.com/2020/09/new-schemaorg-support-for-retailer.html
Google sometimes words things very carefully, so it is worth scratching below the surface when making decisions about your site. For instance, Google made a statement that there is no specific SEO effect from whether your website has adverts on it or not. However, if you have ads that are obstructive of content or negatively impact UX (as measured by Core Web Vitals), this could have a negative impact - so don't take some statements at face value!
In Screaming Frog Configuration > robots.txt you can set "Ignore robots.txt but report status". This can be helpful to see everything you are blocking with your robots.txt file and see if you have any 'spider traps' on the site.
If you need to experiment with robots.txt rules, Merkle has a really nice, free online tool where you can simulate rules on made-up URLs. Just select "Editor" mode instead of live and you can enter any made up URL to quickly test how your rulesets will work! More info: https://technicalseo.com/tools/robots-txt/
Google currently has two separate issues with indexing, with some sites having pages dropped completely. Google says webmasters don't need to take any action - but there is something you can do. If you find a page has dropped, using the "fetch and request indexing" in Google Search Console will get you back in within a few minutes.
There's a really neat report on Sitebulb for duplicate content that allows you to go past the usual "finding duplicate titles" to indicate clashes. The report separates out pages that have identical HTML, very similar content, page titles, and h1s. It's really helpful for non-standard content management systems where sometimes duplicate content can sneak in under different titles.
If you need to remove results from Google quickly, you can also use the "remove URL" tool within Google Search Console. It is important to note that this tool only temporarily blocks the page from appearing, it is still indexed, so you will still need to add a NoIndex tag to the page to remove it permanently!
If you have conflicting robots.txt rules of the same length and specificity, Google will choose the least restrictive rule, the one that allows it to crawl.
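To make the tie-break concrete, here is a small Python sketch of that logic (it ignores wildcards and everything else a real robots.txt parser does, so it's purely illustrative):

def is_allowed(path, rules):
    # rules: list of (directive, pattern) tuples, e.g. ("Disallow", "/page")
    matches = [(directive, pattern) for directive, pattern in rules
               if path.startswith(pattern)]
    if not matches:
        return True  # nothing matches, so crawling is allowed
    longest = max(len(pattern) for _, pattern in matches)
    winners = {directive for directive, pattern in matches if len(pattern) == longest}
    # Equal length and specificity: the least restrictive rule (Allow) wins
    return "Allow" in winners

rules = [("Allow", "/page"), ("Disallow", "/page")]
print(is_allowed("/page", rules))  # True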
You can include href links within your FAQ structured data to make clickable links appear within the SERP.
Canonical tags can help manage issues with filtered pages, but if the majority of your crawl is canonicalised pages, or you're crawling 100,000s of these pages, either reworking the internal link structure or using robots.txt to block crawling of those pages entirely may be better solutions.
Be careful that security features that restrict bot visits don't impact search engines. For instance, we recently experimented with Imunify360 on cPanel, which said the default configuration would whitelist Google. As you can see from this snippet, this was definitely not the case!
Ongoing monitoring aside, technical SEO recommendations from audits are mostly finite in nature. Yes, technology changes and new things become possible (see Core Web Vitals), but for the most part it is not necessary to have a "technical audit" on a monthly basis. I puzzlingly see this offered as a service by some SEO agencies. Do your audit, make your priority changes and move on to your content/outreach. Yes, you might need monitoring for tech changes, yes, it may be worth doing again in 12 months, but don't sink endless money into something with finite returns.
You don't need to try and optimise for misspellings! When you misspell something, Google will almost always replace your actual search with what it thinks you're looking for - or - try and rank what it thinks you're looking for. Here's a great test for you: try searching for "Facebook" but press 1 key to the right with every letter (GSVRNPPL).
Google has 'passage indexing'. To clear up some confusion here, Google is still indexing entire pages but can independently assess individual passages from multiple pages as the best result, rather than the page as a whole. My prediction is we'll start seeing a broader and more diverse range of 'smaller' domains for specific, longtail queries. This is even more reason to make sure you're researching and collating all of those specific questions customers are asking and getting them on your site! More discussion on passage indexing on episode 83 of the podcast.
Looking to hire? Using Jobs schema means you can get your job ads directly into Google's Jobs feature, rather than relying on people trying to find your website. Here is Google's documentation on jobs schema.
If you do a rebrand and you have a lot of old customers who are used to the old name and still search for it, it's likely that Google will take a while to "catch up" with this. This means, despite the rebrand and migration, Google may choose to overwrite your title tags! More info from Google's Gary Illyes here.
March 2021 is Google's line in the sand for being a "mobile only" rather than a "mobile first" index. This means that any content or markup not available on a mobile version of your site from this date, will be ignored. Everyone has mobile-friendly sites now, right?
Knowing which of your links are in the 'raw' HTML and which are created by Javascript is absolutely vital. Sitebulb has released a BRILLIANT new feature in v4.4 that will automatically tell you if links on your site are created or even modified by Javascript. A brilliant timesaver that stops you from having to do a diff in a spreadsheet. Great work!
It has been a requirement for some time now to include the timezone in your event schema start/end times. So instead of something like "2025-07-21T19:00" it will look something like "2025-07-21T19:00+01:00". This wasn't always the case and I have come across many sites with schema that is now broken. Valid event schema means you can get rich snippets in Google like the Norwich Science Festival!
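If you're generating event schema yourself, something like this Python sketch (standard library only, Python 3.9+ for zoneinfo; the event details are made up) will give you start times with the offset included:

import json
from datetime import datetime
from zoneinfo import ZoneInfo

start = datetime(2025, 7, 21, 19, 0, tzinfo=ZoneInfo("Europe/London"))

event = {
    "@context": "https://schema.org",
    "@type": "Event",
    "name": "Example evening talk",
    "startDate": start.isoformat(),  # "2025-07-21T19:00:00+01:00" - offset included
    "location": {"@type": "Place", "name": "Example venue"},
}
print(json.dumps(event, indent=2))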
If you have a new site built that complies with cookie consent options and doesn't automatically set analytics cookies, you might find you have a big "drop" in traffic. I've seen people panic over a perceived loss of organic traffic when the root cause is this change in tracking. Don't forget you can use Google Search Console to directly see "clicks" from Google to confirm if organic traffic has risen/dropped/stayed the same.
Search engines like Google are not 100% reliant on alt text to understand images. I've mentioned the importance of unique, good photography before - but you can now try out Google's Vision API for free to get an idea of what Google sees in your photos. Try Google's Vision API yourself!
Honestly, meta descriptions are low priority. 70% of the time Google will just replace what you've written, they don't directly affect rankings - so they are only really something you need to consider putting effort in, once the page is ranking.
Some people still need to hear this: Running Google Ads does not directly impact your organic rankings. It doesn't. Nope! 🚫
If you think you should rank for a specific key phrase (or intent), there is one simple question I like to ask businesses starting SEO: "Which one page on your website should rank for this query?". If you can't quickly tell me which one page should rank, then how are search engines supposed to work it out? Does this page actually answer the query? Is the answer kind of spread over several pages? The answer most of the time won't be "the homepage".
Google has spoken about using TTR (Time To Result) as an internal metric for how well they are doing. That means, how long it takes the user to satisfy their search query. Assuming your page meets the intent of the searcher, a good question can be "How can we improve the time to result? How can we make it easier or faster for the user to find the answer?" Some common ideas are:
It's a great question to explore things outside of "ranking factors". The worst possible outcome is you make things better for your users!
If "content and links" is your SEO plan for 2021 and beyond, you're going to find yourself fighting a losing battle. As Google focuses on measuring "expertise, authority and trust" your long-term SEO plans need to embrace the paradigm shift of Google recognising "entities". A great example of what is coming is from earlier this year when Google was granted a patent called "Speaker Identification", which allows them to identify a speaker by using speech recognition, such as on YouTube videos. If a user can easily recognise a "celebrity expert" talking about something on your site, your YouTube, your podcast - why shouldn't search engines? It makes perfect sense this would go into the equation.
All pages should have canonical tags. If the page is the canonical version, the tag should be self-referential. This means if your page is linked to with query strings (e.g. from a marketing campaign) you have the best chance of link signals being consolidated and improving ranking.
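Here's a rough way to spot-check this in Python (assuming the requests library and a simple regex rather than a full HTML parser, so treat it as a sketch): it confirms the canonical points back at the URL itself once query strings and fragments are stripped:

import re
import requests
from urllib.parse import urlsplit, urlunsplit

def canonical_is_self_referential(url):
    html = requests.get(url, timeout=10).text
    # Assumes rel appears before href within the canonical link tag
    match = re.search(r'rel=["\']canonical["\'][^>]*href=["\']([^"\']+)', html, re.I)
    if not match:
        return False  # no canonical tag found at all
    expected = urlunsplit(urlsplit(url)._replace(query="", fragment=""))
    return match.group(1).rstrip("/") == expected.rstrip("/")

print(canonical_is_self_referential("https://www.example.com/page?utm_source=newsletter"))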
Block crawling with robots.txt or use a canonical when dealing with filtered navigation? It can be a hard call. Where the filters are not directly crawlable, canonicals tend to be the right answer. If you have multiple filters (and combinations of them) that are crawlable, it is likely better to block crawling entirely, or the majority of it, with robots.txt, as the permutations quickly go into the hundreds of thousands or millions, which will cause issues in itself. Of course, it's best to fix the underlying issue, but that's not always possible!
Cumulative Layout Shift (CLS) is one of the Core Web Vitals that is going to be included as a ranking factor as of May 2021 by Google. Explaining the measurement of CLS ("You have a 0.7 CLS") can be hard, especially to get buy in. Here is a fantastic free tool that will generate an animated GIF of your cumulative layout shift, which is really helpful to visualise what is going on!
As of May 2021, non-AMP pages will be eligible to be included in Google "Top Stories" results as long as they adhere to the news publisher guidelines!
Looking at Google's auto-complete can give you some solid guidance on search terms you should target and what people are searching for. Most of these systems are powered by the frequency and regularity of searches... Which does mean they can be gamed. Want to test this? Go to Amazon and search for "cooking with " right now...
If you have some critical issue which forces you to take down pages (such as ransomware), 302 redirects can be really helpful. Using 302 (temporary) redirects on internal pages will keep them "preserved" within Google's index, so as soon as you can get the proper content back on, you can remove the redirect and things will resolve faster than if you let pages go 404 or 5xx error.
Why is that content a PDF? We're mobile-first now and PDFs generally offer a poor experience on mobile devices. Yes, Google renders PDFs as HTML and links inside them "count", but if you're intending that content to be read on your website, I would consider making it an actual web page and keeping the user within your website to read it. You'll get better engagement, you can do more with internal linking and generally, you'll rank better. PDFs have their place (if it must be a download, or for getting content hosted on external sites) - but in my experience, they exist because of "old school" thinking around nice looking brochures.
Changing a site's underlying infrastructure like servers, IPs, you name it, can change how fast and often Googlebot crawls said site. That's because it actually detects that something changed, which prompts it to relearn how fast and often it can crawl.
You've got your XML sitemap file, but do you have an Image sitemap? There is loads of traffic locked up in Google Image search that can be a less competitive way to get good traffic. Image sitemaps help you get as much coverage as possible. More info here.
🚨New🚨 and rather cool (to me at least) crawl stats report in Google Search Console. Have a look, it's hidden away in the "Settings" menu on the left. Loads of interesting information, even the "purpose" of the Googlebot visit!
If you run a tool like Screaming Frog and you get blocked or rate-limited from your own site, you should check that it isn't doing the same to Google. While some 'smarter' systems will verify if a Googlebot user-agent is actually from Google, I have encountered quite a few sites that end up blocking Google's crawlers!
Don't fall into the trap of just picking the highest volume keyword phrase out of a set of variations to target. I'm not even talking about competitor difficulty, I am talking about intent. Unless Google is showing an identical or near-identical set of results for the variations in the search phrase, it's likely they have searchers looking for a different thing. For example, "personalised" and "customised" are two quite different things!
Optimising your CTR is a great way to get more traffic from search engines. When measuring CTR, you need to do this on a page-by-page basis, taking into account the ranking. I saw comments last week about "poor CTR" on a screenshot I posted from Google Search Console that included all pages/key terms. If you have lots of position 10, 15, 20s etc for many different keywords, these will still get impressions - but - naturally no, or very few clicks. This is exactly as you expect, it doesn't mean your CTR is actually "poor". Don't judge CTR on a single, out of context number!
When you're trying to forecast SEO and set targets, it is usually a good idea to base that on 'unbranded' traffic. One of the easy ways to do this is to export your Google Search Console data and then remove any branded and non-set terms. You'll then know the number of unbranded clicks you have vs branded, so you can start to make some better predictions on growth.
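A quick pandas sketch of that filtering step (assuming you've exported the Queries report from Google Search Console as queries.csv - the brand list and column names are just examples, so adjust them to match your own export):

import pandas as pd

BRAND_TERMS = ["acme", "acme widgets"]  # include misspellings and variations
QUERY_COL, CLICKS_COL = "Top queries", "Clicks"  # default GSC export headers

df = pd.read_csv("queries.csv")
branded = df[QUERY_COL].str.contains("|".join(BRAND_TERMS), case=False, na=False)

print("Branded clicks:  ", df.loc[branded, CLICKS_COL].sum())
print("Unbranded clicks:", df.loc[~branded, CLICKS_COL].sum())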
While it is possible, I would strongly advise against specifying your canonical tags via your XML sitemap. Your sitemap only lists your canonical pages anyway, this means Google and other search engines still have to work out which are the other duplicate, non-canonical versions of pages. Declare your canonical tags either on-page or through HTTP headers.
Google has confirmed that when it comes to using Core Web Vitals as a ranking factor in May 2021, it is the mobile scores that will be used, not the desktop scores. While this is what many of us have suspected, it is good to have confirmation it isn't an average or mix of the scores in some way.
Google treats "out of stock" e-commerce pages as 'Soft 404s', meaning they are less likely to appear in search results.
If you're promoting a Zoom webinar or similar, do so on a URL that you control on your domain and redirect to the Zoom-controlled signup page. That way, once the webinar is done, you can remove the redirect to retain the link equity and relevance around the subject of the webinar and use this page (maybe to host a recording of the webinar or transcript).
If you have a ccTLD (e.g. .co.uk, .fr, .de) and you want to target multiple countries, you will find it harder if you just use subfolders, as your top-level domain is already country-coded. Either use separate ccTLDs or move to a gTLD and use subfolders. Usual hreflang rules apply.
Google specifies some ccTLDs (country-code Top-Level Domains) that it treats as gTLDs (generic Top-Level Domains). These are: .ad .as .bz .cc .cd .co .dj .fm .io .la .me .ms .nu .sc .sr .su .tv .tk .ws
H1/Title optimisation: H1s normally function as the "title on the page" for users, as the actual title is usually truncated in a tiny tab at the top of the browser - they won't look at it. While title tags need to be optimised to keep them short so they appear in SERPs, it's an opportunity to be more concise with H1s. This opportunity lets you target slightly different (but same intent) search phrases and make the page a bit more user-friendly. TL;DR: Just duplicating the title tag in the H1 can be a missed opportunity!
As part of their rendering, Google will look at things like how big and where the text is on your page. Therefore, if you did something like make all of your h1s and h2s the same size as the paragraph text via CSS, it is likely their importance would be reduced. The same applies in reverse - even if it's not an h1 or h2, content styled to be more prominent is a hint it is more important.
Majestic now has a great link graph visualiser tool. It's a really useful way to quickly identify new link opportunities in neighbourhoods of relevant sites that may be 3 or 4 "hops" away. Previously, this would have been easy to miss, as you're normally working from "1 hop" text-based exports.
Although it's currently required to be in things like Top Stories, having an AMP format version of a page is not a ranking factor - in fact - unless it is the canonical or only version of a page, the AMP version isn't even indexed!
You can vastly improve the success rate of your content and outreach by getting your audience involved in the planning stages, rather than just trying to create content in a vacuum.
Kristina Azarenko has released a very nice Chrome extension for SEOs that pulls together many features of other plugins, such as checking schema, meta robots, canonicals and much more.
Start how you mean to go on in 2021 with your SEO. As we discussed in our 2020 roundup of SEO, fundamentals are still incredibly important. What are the fundamentals? Lily Ray summed it up so well with "If it's good for users, you should treat it as a ranking factor".
Google Question Hub is a resource you can use to find questions that users are searching for, where Google believes there is no "good" answer on the web yet. Rather than enter the answer in Question Hub, you could make a page about it yourself!
Noindex tags and others can be inserted via javascript with Google Tag Manager. This is useful to know in case you can't get access to developers or edit code directly. It does mean there will be a delay between Google processing the HTML and not seeing the tags and the final rendering of the Javascript - but it will work! In this instance, a page would be dropped from the index after javascript is processed.
Leaving testimonials for industry-specific tools that you like and use can be a really easy way to get some quality links to your site.
If you're doing outreach to journalists, be aware of what their day looks like. Meetings happen early to decide which stories are going ahead, so if you do your outreach at 3pm, it's more likely to get lost in the noise. Get in first thing in the morning, and I try to avoid Mondays/Fridays for most projects.
Domain Authority is a made-up, compound metric by Moz. Google does not use Domain Authority for ranking. Google has even said they "don't have a metric for measuring overall domain authority". While DA and other metrics, such as Majestic's Trust Flow (TF), can be helpful to make decisions, you should be careful they are not used as a source of truth or success.
If you have Javascript-reliant content pages, it isn't just as simple as "making them work" with Google, there are several options such as server-side rendering, dynamic rendering, pre-rendering, and rehydration which are all different and useful for different purposes. We'll cover them all separately in upcoming tips!
Dynamic rendering is good for indexable, public Javascript-generated content that changes rapidly, or content that uses JavaScript features that aren't supported by the crawlers you care about. It involves the user getting the standard client-side rendered version of a page and crawlers being delivered a version where the server renders the Javascript first.
More on types of Javascript rendering: Server rendering generates the full HTML for a page on the server in response to navigation. This avoids additional round-trips for data fetching and templating on the client, since it’s handled before the browser gets a response. Server rendering generally produces a fast First Contentful Paint (FCP). Running page logic and rendering on the server makes it possible to avoid sending lots of JavaScript to the client, which helps achieve a fast Time to Interactive (TTI).
The new 'Google News' report in the "Performance" section of Google Search Console only covers news clicks from the Google News app or the Google News subdomain. It does not report on clicks from "Top Stories" or if you click on the "News" vertical in search (that is still reported in "Search Results"). Image from Search Engine Land.
Looking for crawl anomalies in your Google Search Console report? As an update to tip #205, they have now been removed from Google Search Console, so you won't find them! The Coverage report has been upgraded and you'll now see more granular detail on crawl errors. You can get some more information on episode 94 of the podcast.
If you have completed a site migration and want to see if old URLs are left in the index, you should use Google Search Console to do this. Do not use the Google site: operator. The site: operator can show pages that Google knows about but that are not included in their search index!
Searching for a text string and searching for an entity (with an identical 'text string') are two different things in Google. To provide an example, in the below video you will see me search for "Pimoroni Ltd", which will return their shop homepage at number 1. In the second search, you will see me start to search for Pimoroni Ltd, but instead click the suggested entity in the dropdown list. This still looks like it searches for "Pimoroni Ltd", but it is actually an entity-based search. You can see the difference (in this case, I believe a bug) in how Google fails to show their homepage, only showing a "returns" page 4th, after Facebook and Amazon.
A common mistake I see websites make is using Q&A schema when they should be using FAQ schema. Q&A schema is for pages that ask one question and where multiple users can submit answers, something like Quora. If you're just writing a list of questions and answers on the page as a site owner, you should use FAQ schema. The image below shows the valid and invalid uses of Q&A schema (QAPage):
The Web Stories format is indexable and self-hosted, appears in the regular search results, and will also get surfaced in Google Discover and even in Google Images with a special icon to show it's a Web Story. More info can be found here.
Despite what scammers may try and tell you, "text to html ratio" is not a thing that will prevent you ranking in Google. To quote John Mueller (from Google) on it: "We don't use anything like text to code when it comes to Google search. We especially pick up the visible content on the page and we use that. Some pages have a lot more HTML, some pages have a lot less HTML. That's more a matter of your kind of design preferences, how you set things up on your site."
Another reason to ensure your e-commerce site includes Product schema is that Google is introducing a "price drop" feature in the SERP. If you include a specific price (not a range) in your Product structured data, Google will now automatically calculate when a price has dropped (based on the running historical average of your product's pricing). In their example, Google show this as 'Price: $200 (typically $300)'. Currently only available in the US, in English, on both desktop and mobile devices. I got this update from the brilliant Sitebulb schema change tracker.
Knowing if organic traffic came from your "Google My Business" (local) listing can be really helpful. It is helpful to add a UTM tag onto any link you put in Google My Business, so you can easily track where the traffic came from in Google Analytics. Here is a tool to generate your own UTM link codes.
Apple Maps strips parameters such as UTM codes, making it hard to track organic clicks. One way around this is to use a different URL with Apple Maps, with a canonical tag back to your "regular" location page. Within this page, you can also fire a virtual pageview within GA that inserts your UTM code if you want to keep everything neat.
When trying to forecast your SEO, you need to factor in that your rankings are not a static target that exist in a vacuum. Sites that rank for competitive terms are rarely there by chance - they are doing SEO. This means if you do cruise past them in the rankings, they're unlikely just to smile and wave, they'll also increase their efforts. This is why those last few positions are so hard fought and it's almost impossible to predict where you'll rank.
Python is an accessible way to automate, reduce errors, and complete otherwise impossible tasks within SEO. In memory of the wonderful Hamlet Batista, a mentor to many in the SEO industry for Python, including myself, this wonderful "SEO Pythonistas" resource has been setup to preserve the undeniable and profound impact he had on the industry and many individuals in it.
A common source of confusion is comparing organic traffic in Google Analytics with Google Search Console. The numbers are not comparable, they are different metrics! In Google Analytics, 'Users' are individual humans and 'Sessions' are the visits those humans make, but in Google Search Console you have 'Clicks', which are just that: clicks from the search engine result page. These numbers won't (and shouldn't) match up, but the most accurate representation of how you're doing from a search point of view is from Google Search Console. You lose fidelity as soon as you get into Google Analytics, especially with various cookie/tracking blocking.
Internationalisation is not just about language and culture, it's about technology too. You may run a lab test and get 3 'green lights' on your Core Web Vitals and believe everything is fine. However, when you log in to Google Search Console, you see you're in the red because the majority of users are perhaps from a country where the average internet connection is much slower. In cases like this, it may be that you literally have to build a 'lighter' version of your site to serve these regions adequately. This is all part of internationalisation!
Setup monitoring for things like domain expiration, please. Just do it. Your domain is what stores all of the 'equity' you have earned over the years from links, it is one of the most important ranking factors. What many people don't realise is that "drop catching" is a multi-million £ business. If you let your domain drop, someone will very likely grab it, and if they will sell it back to you, you'll have to pay through the nose. There are some incredibly cheap solutions to prevent problems like these such as Little Warden. I had a lovely chat with its founder, Dom Hodgson last week where we went into depth about this issue.
You can use a 503 HTTP result code if you are temporarily closing the functionality of your site - for example, encouraging users to take a specific action that involves replacing or redirecting your normal content, which could otherwise damage your rankings. Returning a 503 HTTP result code tells search engines to ignore the current content and to come back again a bit later. Here is some more info from Google.
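For illustration, here is a minimal Python sketch of a maintenance responder using only the standard library - in practice you would normally return the 503 at the web server or load balancer level, but the idea is the same: a 503 status plus a Retry-After header:

from http.server import BaseHTTPRequestHandler, HTTPServer

class MaintenanceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(503)                   # service (temporarily) unavailable
        self.send_header("Retry-After", "86400")  # suggest coming back in 24 hours
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(b"<h1>We'll be back soon - please check again later.</h1>")

if __name__ == "__main__":
    HTTPServer(("", 8080), MaintenanceHandler).serve_forever()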
Is it a ranking factor? Thinking about individual ranking factors in isolation can be misleading, but if you want to consider if something is or could be a ranking factor, here are 3 questions you can ask yourself:
Ranking can be complicated - but also simpler than we make it.
'Direct' traffic in Google Analytics is not only people that 'directly' type in your URL, it is anything that Google Analytics, for whatever reason, cannot assign a traffic source for. Some studies have shown that up to 60% of direct traffic is actually organic. This is really worth keeping in mind if you are assigning value to traffic based on Google Analytics data. Here's a study about direct traffic.
It's possible to get Google Search Console to give render previews of specific parts of your page using regular jump links. Super helpful tip brought to my attention by Saijo George from the tl;dr marketing newsletter.
The Core Web Vitals report in Google Search Console is based on field data (the CrUX report) - an aggregate of real users' browser data. This means there can be days/weeks of time lag for issues to get flagged. While it's a great insight into generally how your site is doing, you should set up automatic "lab" monitoring, such as Lighthouse, to find issues fast.
There is a lot of focus on Core Web Vitals at the moment, especially the May deadline when they become a ranking factor. Reddico have released a free tool that lets you measure CWV by individual keywords for the top 10 ranking sites. Super useful stuff!
Redirect rules generally need to be left in place as long as possible, which can mean big lists build up, potentially adding latency. By "big lists", I mean you might have 5000 lines in a redirect file, all listing "URL A redirects to URL B". It's worth reviewing redirects occasionally with a developer to see if you can process them more efficiently, such as with a rule set to speed things up. This means you might be able to have 10 rules which say something like "If a URL has this pattern, alter it and redirect it here".
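As a toy example of what 'finding a pattern' looks like, this Python sketch (with made-up URLs) counts how many one-to-one redirects could collapse into a single /old-blog/* rule - the kind of finding you would then hand to a developer to implement as a server-level rewrite:

import re

redirects = {
    "/old-blog/seo-tips": "/blog/seo-tips",
    "/old-blog/schema-guide": "/blog/schema-guide",
    "/about-us.html": "/about",
}

pattern = re.compile(r"^/old-blog/(.+)$")
collapsible = {src: dst for src, dst in redirects.items()
               if (m := pattern.match(src)) and dst == f"/blog/{m.group(1)}"}

print(f"{len(collapsible)} of {len(redirects)} redirects could become one rule:")
print("  /old-blog/(.*) -> /blog/$1")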
It's worth having a good understanding of risk if you're going to break Google's guidelines. While no sites are permanently banned from Google, receiving a manual action (AKA a 'penalty') will normally take months to resolve. You need to think about the eventuality of when you will be caught and what the long-term plan is.
Having great links is important, but one thing I have anecdotally observed is that new links (even ones that are not 'authoritative') seem to help keep a site ranking well on SERPs that change regularly. Logically, it makes sense that new links signal the content is still useful/relevant, and I've seen several projects now where we've secured number 1, it has slipped after a few months, and it is won back with very minor new links. While it could be competitor noise/activity, the fact the links are very minor makes me doubt this, and the existence of the new links seems to be the only prerequisite. This is definitely a more anecdotal tip, but after observing hundreds of sites for many years it's something I would be happy to stick my name against!
302 or 307 redirect? Strictly, a 307 is a more specific "temporary" redirect, while 302 is a bit more ambiguous. However, 302s have been around a lot longer so you can be sure how search engines will handle them. To be safe, I still tend to use 302 redirects, although I wouldn't expect any major difference.
If your site uses HTTP Strict Transport Security (HSTS) to force https, your browser will remember this. Why is that important? It means a browser like Chrome will automatically insert a 307 (temporary) redirect if you try and enter an http URL. These redirects don't actually exist on the server (the site may be using a 301, for instance). It's important to know this so you don't get confused by phantom 307s showing up in browser extensions.
Part of seizing SEO opportunity is risk mitigation. If you're planning a seismic change to your site, such as combining some ccTLDs together into a gTLD or making big changes to a template, it can be wise to do this in stages. Even a perfectly planned migration can sometimes go wrong and a well-meaning template can have unwanted side effects. So, start with migrating a single country and see how that goes or migrate that one category to the new template then wait, measure, assess, and make your decision there.
You can find what tactics your competitors or their SEO agency are using right now by looking at "new" links. Most backlink tools will have a newly discovered backlinks filter. I love running this report on Majestic to instantly see if a client's competitors have had any spikes of recent attention and what they are working on with content and links.
A common reason for "not mobile-friendly warnings" such as "clickable elements too close" is Google has not been able to access the CSS and Javascript of a page to render it properly. If you're getting these warnings, it is worth running the page through the Google Search Console renderer to see how things look.
Want to know how close you are to winning a featured result or rich snippet? Here's a little hack: Run the search again and minus off the domain that ranked to see who is next in the "queue". You can minus off multiple sites at once. This used to work with the "site:" operator, which Google stopped - but it works like this now!
Time spent on a page is not a ranking factor in my opinion. Google has talked about "Time To Result" as one of their main internal metrics for success. Time to Result reflects how long it took a searcher to find the answer they are looking for or fulfil their intent. They want this number to be as low as possible, demonstrating a frictionless experience. Think about your own searches when you want to find that specific bit of information: would you like it clearly highlighted on a page or would you like to dig through 2,000 words to find it? What does that mean for your time on page? Time on page is important when a user wants to spend time reading something in detail; it's not a generic metric you should be using without context to content. Please take these random "UX factors" with a pinch of salt!
I rarely pay attention to what tools label as "toxic links". Firstly, many of them are just plain wrong, with lots of false positives. Secondly, they are only guessing at what Google considers "toxic", they have no actual way of knowing. Thirdly, unless there are specific cases, blindly disavowing what these tools deem as "toxic" can actually harm your rankings if Google itself was actually counting these links.
If you're setting up automated crawling of your site as Googlebot, it is beneficial to make the origin of these crawls the US. Googlebot is normally crawling from US-located IP addresses, so also crawling from this location can catch edge cases, such as automatic geo-IP redirects (which you hopefully don't have).
Working on site performance can be hit and miss in terms of things you think might help, but then don't. This is compounded by the time lag of development cycles to get things live and test them. It's possible to use Local Overrides within Chrome to test the potential impact of speed changes locally, before getting them in the pipeline to go live. This can save a huge amount of time and make sure you get it right the first time around. Here's a guide for you!
If you're struggling to get links from journalists, providing them with assets to go along with the story (photos, images, data vis, etc) that are licensed under 'use and remix' Creative Commons can ensure that you are attributed for their usage! More info on Creative Commons here.
Most log file analysers will offer an option similar to Screaming Frog's "Verify Bots". This will allow you to remove "fake" search engine bots from your logs, such as if you ran a crawl yourself as Googlebot or if your SEO monitoring system does.
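The verification itself is the standard 'double reverse DNS' check, which you can sketch in a few lines of Python (the IPs below are examples - the first is a published Googlebot address, the second is from a documentation range):

import socket

def is_real_googlebot(ip):
    try:
        host = socket.gethostbyaddr(ip)[0]  # reverse DNS lookup
    except socket.herror:
        return False
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    # Forward-confirm: the hostname must resolve back to the same IP
    return ip in socket.gethostbyname_ex(host)[2]

print(is_real_googlebot("66.249.66.1"))   # genuine Googlebot -> True
print(is_real_googlebot("203.0.113.50"))  # not Googlebot -> False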
If you're trying to get your internal team up to speed with SEO training but not sure where to start mapping out requirements, Aleyda Solís put together this amazing resource that maps out the different areas of SEO and provides links to learn more.
You don't always have to redirect all the URLs during a site migration. What on earth do I mean? I'm currently working on a site that has 500,000 pages indexed but only gets around 2,000(!) visitors per month from search. The majority of these pages have no links and no landing page traffic. The impact of your migration should exceed the migration cost, otherwise, what is the point? Migrate value, don't migrate every URL just because a guide said so.
Google won't index things after '#' in a URL, with the exception of hashbangs (#!). Although the hashbang scheme was deprecated years ago as part of their AJAX crawling, you can still see these URLs being indexed today, which can cause duplicate content issues - don't get caught out thinking they will be ignored! Link to the Google documentation here.
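When you're deduplicating URL lists from crawls or analytics exports, it's worth normalising fragments away so you don't treat /page#reviews and /page as separate pages - a quick Python sketch:

from urllib.parse import urldefrag

urls = [
    "https://www.example.com/page",
    "https://www.example.com/page#reviews",
    "https://www.example.com/page#specs",
]
unique = {urldefrag(u).url for u in urls}
print(unique)  # {'https://www.example.com/page'}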
While the raw number of links to a website can give you an indication of how popular it is, it is always worth looking at the ratio of links vs links from unique domains. This will quickly tell you if the backlink profile is made up of many unique mentions or just lots of sitewide type links.
If you're using search volume numbers for a keyword, it is a good idea to look at when that data was taken and see how it compares on Google Trends. Otherwise your forecasting/reporting can be very far off!
Avoid stock photography if you want to rank in Google Images. Image results contain unique images, not the same image over and over.
The majority of topical intent can change over time from: consumer not being aware of a problem, to searching for solutions, to comparing specific solutions, to the final stage of comparing brands. Having this strategic understanding of where the market is, is fundamental to your SEO and content approach. One way you can track this is monitoring the change in People Also Ask suggestions over a period of months and years and classifying them. Hugely useful insight!
If you need that 'hook' for your story, digital PR, or link building, using FOI (Freedom of Information) requests in the UK can be really powerful. The advantage is that this information won't be publicly available, so can be a massive asset to get featured by journalists. They can take a while to come back, but rabbits can be pulled from hats! In my opinion, it is a really simple thing that is drastically underused.
Clickthrough rate is not part of Google's "core" ranking algorithm. However, a spike in clickthrough rate will often get a site to temporarily rank better, as Google tries to model demand. I have never seen this effect last.
Google can index multiple filetypes, HTML, PDF, Word documents... Even... Lotus files! 😂
After you have completed a site migration, you should also update any internal links to the new locations. This isn't as hard as it sounds, because you already have the [from]/[to] URL data from the redirects, so a developer can help you do what is essentially a "find and replace" job with the content on the site.
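A rough Python sketch of that 'find and replace' job, reusing the redirect map (the file name, column names and stored-HTML assumption are all made up for illustration - a developer will usually do this directly in the CMS database or templates):

import csv

with open("redirect_map.csv") as f:  # columns: from_url, to_url
    mapping = {row["from_url"]: row["to_url"] for row in csv.DictReader(f)}

def rewrite_links(html):
    # Replace longest URLs first so /old/page-two isn't clipped by /old/page
    for old, new in sorted(mapping.items(), key=lambda kv: -len(kv[0])):
        html = html.replace(f'href="{old}"', f'href="{new}"')
    return html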
Sometimes it isn't possible to rank number 1 for keywords that you might, at first glance, think it would be possible to rank for. Let's take the keyword "headphones" as an example. Imagine you had a premium product: £300 wireless headphones - absolutely the best on the market. That's brilliant! But it doesn't necessarily mean you'll be able to rank number 1 for "headphones". Why? Because it is likely that the majority of people that search for "headphones" want to see a range of options, or they are generally after basic or entry-level products.
URLs in tweets that show in SERPs are counted as impressions in Google Search Console.
In case you missed it, it was announced the Google Page Experience algorithm will also affect desktop results, not just the previously announced mobile search.
You can now add direct apply markup to JobPosting schema. This property gives you an optional way to indicate whether your job listing offers a direct apply experience right from Google Search. Here is Google's information on JobPosting schema.
Google has announced what they are calling the "link spam update" which impacts links that "involve payment, sponsorship, or are otherwise commercial in nature." including guest posts. More info on the "link spam update" can be found here.