Mark Williams-Cook will be talking about:
Pagination problems: There is a significant pagination bug present in the WP Engine system, and Google has updated its best practice advice regarding pagination.
Date published abuse: Google reaffirms best practice for articles using datePublished and dateModified, as some SEOs cite concerns that Google may not have a grip on this, with publishers abusing it for rankings.
Google My Business fraud reporting: In case you missed it, Google My Business has launched a new form which enables you to report competitors or fraudulent listings within Google My Business.
Show note links:
Google's post on dating articles: https://webmasters.googleblog.com/2019/03/help-google-search-know-best-date-for.html
BrightLocal's guide to reporting Google My Business spam: https://www.brightlocal.com/2019/02/28/how-to-report-google-my-business-spam/
MC: Welcome to episode 2 of the 'Search with Candour' podcast, recorded on Friday the 22nd of March 2019. I've got some new equipment, so hopefully we'll be able to hear a slight improvement in audio quality. My name is Mark Williams-Cook and I'm going to take about 15 minutes to discuss the latest search news and hopefully make your lives a bit easier.
This episode I'm going to cover some interesting developments with SEO for pagination - there are actually two bits of news to talk about there, with problems on sites hosted on WP Engine and a surprise announcement by Google around the signals they're using to handle pagination. I'm also going to talk about web page dating - it makes it sound like Tinder for web pages, but there's been a recent post on the Google Webmaster blog around how they handle dates on pages, and that's interesting because it's something that drastically affects how pages rank. There's also one last thing I want to catch up on if you've missed it, which is a new form for 'Google My Business' which allows you to basically grass up your competitors who are breaking the rules, or report fraudulent listings!
Pagination
Okay, so starting with pagination. Hopefully everyone's comfortable with what pagination is; whether you're a webmaster, an SEO, or just generally working with websites, you'll have seen it: where you have maybe products or lists of articles and you've got so many they need to be split over multiple pages, and at the bottom of the page we've all seen those links where you have numbered lists (1, 2, 3, 4) and a previous and next link. Google has given advice before on best practice for handling pagination. They've actually shared research that said if latency isn't an issue - that means if page load time isn't an issue - users do prefer a view-all experience. There are still lots of situations where we have to break these view-all lists down over multiple pages, but that's their preference for search engines, and they've said it's what users prefer too. That makes sense to me because when I see a paginated list, I don't know about you, one of the first things I normally do is try to increase the number of results I can see at once, so I can see as many as possible on one page. Pagination does create some challenges for SEO, mainly because it creates a set of pages that are very similar: they change in content and they provide important deep links to other products and pages - the things we want search engines to discover - but they tend to have similar page titles, and they tend to be quite thin in content if taken as standalone pages.
Since 2011, part of Google's documentation has recommended using what they call rel="next" and rel="prev" attributes, which help the bot understand that it's looking at a set of paginated pages. This has actually become a standard feature in many content management systems, and it has certainly made its way into many technical SEO audits over the years. It certainly has for the ones we've done; it's been an obvious easy win because it came directly from Google. However, yesterday, on the 21st of March, Google dropped something of a bombshell from the Google Webmasters Twitter account, saying:
Spring cleaning!
As we evaluated our indexing signals, we decided to retire rel=prev/next.
Studies show that users love single-page content, aim for that when possible, but multi-part is also fine for Google Search. Know and do what's best for *your* users! #springiscoming
— Google Webmasters (@googlewmc) 21 March 2019
John Mueller from Google actually clarified some follow-up questions, saying:
We noticed that we weren't using rel-next/prev in indexing for a number of years now, so we thought we might as well remove the docs :).
— 🍌 John 🍌 (@JohnMu) 21 March 2019
And actually, the pages that recommended using those attributes have just vanished from Google's documentation - they're not redirected anywhere, they're just gone! This caused a fair bit of surprise in the SEO community, because it's something Google has said over the years that we should be doing, and now they've come out and said "oh, actually we haven't used that for a few years". Google did say "Googlebot is smart enough to find your next page by looking at the links on the page, we don't need an explicitly labeled previous/next signal", but there are other good reasons why you'd still want to leave those on your page. It's important to remember that rel=prev/next is actually a web standard, so it's not just a Google thing. This announcement prompted some other questions from the SEO community, to which Frédéric Dubut, the web ranking and quality PM at Bing, tweeted:
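For anyone who hasn't used it, the markup in question is just a pair of link elements in the page head. As an illustrative sketch - the URLs here are placeholders - this is what page 2 of a three-page series would carry:

```html
<!-- On https://example.com/products?page=2 (placeholder URL) -->
<!-- rel="prev" and rel="next" point at the neighbouring pages in the series -->
<link rel="prev" href="https://example.com/products?page=1">
<link rel="next" href="https://example.com/products?page=3">
```

As discussed above, Google now ignores these for indexing, but other consumers of the web standard may still read them.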
We're using rel prev/next (like most markup) as hints for page discovery and site structure understanding. At this point we're not merging pages together in the index based on these and we're not using prev/next in the ranking model. https://t.co/ZwbSZkn3Jf
— Frédéric Dubut (@CoperniX) 21 March 2019
If you don't have them and you're in a very Google-centric country like the UK, the priority of adding them is probably lower than it was before. While we're still on pagination - I don't think I've ever had this much to say in one day about pagination! - we're going to talk about a major pagination/SEO issue with WordPress sites hosted on WP Engine. WP Engine is the WordPress 'digital experience platform', as they describe themselves; they're a big deal - they serve more than 90,000 customers globally, and this year they reported annual recurring revenue of $132 million, so about £100 million. I caught this article on a blog called Beanstalk Internet Marketing, and they describe the issue: on WP Engine sites, pagination works fine by default until you get up to page nine. For users and bots, clicking on page 1, 2, 3, 4 and so on works fine, but once you get past page nine and into double digits, the bots (so Googlebot, Bingbot etc.), not users, are redirected to the home page. This means when a user clicks on page 10 or 11 they'll land on that page, but when a search engine crawls those pages it'll be redirected back to the home page. That's quite an important issue, because it's going to impact how search engines discover content, especially on larger sites.
Beanstalk did speak to WP Engine, who recognised it was an issue and explained that it happens because of a setting on their platform called 'redirect bots'; it can be turned off, but by default it is enabled. As of the 13th of March, WP Engine has added a front-facing control so users can not only find out about the pagination redirection discussed, but also turn the feature on and off themselves.
I imagine there are probably lots of people listening, and lots of companies running sites on WP Engine, that haven't necessarily logged in in the last few weeks - so if you haven't, go and do that now!
Web Page Dating
On to web page dating! On the 11th of March, Google wrote a blog post called 'Help Google Search know the best date for your web page', in which they wrote: "Sometimes, Google shows dates next to listings in its search results. In this post, we’ll answer some commonly-asked questions webmasters have about how these dates are determined and provide some best practices to help improve their accuracy". On how dates are determined:
“Google shows the date of a page when its automated systems determine that it would be relevant to do so, such as for pages that can be time-sensitive, including news content:
“Google determines a date using a variety of factors, including but not limited to: any prominent date listed on the page itself or dates provided by the publisher through structured markup.
“Google doesn’t depend on one single factor because all of them can be prone to issues. Publishers may not always provide a clear visible date. Sometimes, structured data may be lacking or may not be adjusted to the correct time zone. That’s why our systems look at several factors to come up with what we consider to be our best estimate of when a page was published or significantly updated”
Then they go on to how to specify the date on a page: "to help Google to pick the right date, site owners and publishers should" show a clear, visible date and provide structured data, including on AMP pages - that's Accelerated Mobile Pages. This blog post - and I've linked to it in the show notes at search.withcandour.co.uk - goes on to list some other specific best practices around publishing dates, including 'show when a page has been updated'. If you update a page significantly, also update the visible date, and the time if you display that. If desired, you can show two dates - when a page was originally published and when it was updated - just do so in a way that's visually clear to your readers. If showing both dates, it's also highly recommended to use datePublished and dateModified for AMP and non-AMP pages to make it easier for algorithms to recognise.
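As a sketch of the structured data being discussed - this JSON-LD snippet is illustrative only, with placeholder headline and dates - the datePublished and dateModified properties sit on the Article markup like so:

```html
<!-- Illustrative Article markup; headline and dates are placeholders -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example article headline",
  "datePublished": "2019-03-11T08:00:00+00:00",
  "dateModified": "2019-03-20T09:30:00+00:00"
}
</script>
```

Note the timezone designator on each date - the Google post specifically calls out structured data that isn't adjusted to the correct time zone as a common problem.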
Now, I thought this news was particularly interesting, and I don't think I was alone in this. Barry Adams, an SEO who works with some major news outlets - so this is particularly relevant for him - tweeted:
Implicit admission from Google that they suck at detecting an article's actual publication date: https://t.co/RgMKtAeG6Q
News publishers have been abusing Google's weakness in this area for many years. 😈
— Barry Adams 🧩 (@badams) 11 March 2019
I think it's fair to say that at least the second half of what Barry said is, in my experience, true even outside of news sites: I've seen some very dubious quality old pages ranking primarily on what seems to be some on-page schema showing a last-modified date. Google seems to be adding dates in their search results on a lot of articles, even on what I would consider evergreen content where the date isn't necessarily that important, and SEOs have definitely been picking up on this; it's certainly worked its way into the toolbox of things to exploit, for those kinds of people that want to do those kinds of things. My guess would be that using the dateModified attribute, rather than creating a brand-new page, allows the SEO to leverage the existing page authority, existing links, and historic popularity of that URL - whatever you want to call it. I've seen this happen myself in a few niches that are seasonal: when that time of year comes around again and you know the searches are going to spike, rather than creating new pages, I've seen companies making what I would say, in fairness, were not significant rewrites of the content, just as an excuse to add dateModified schema and an updated date onto the content.
Based purely on that, it appears that Google sort of puts them back to the top of the search results. I've seen articles from 2008/09 now appearing at the top of the search results when I've done a search and specified to Google that I only want to see articles from the last 12 months - because I'm telling them that freshness is important. The guidelines Google has published do specifically state that the update to the content should be significant, not a small addition, to justify the inclusion of a dateModified. I'm not aware of any tests or direct experiences people have had with exactly how drastic that content change needs to be to meet that measure, but we've certainly seen some solid demonstrations from Google that they'll ignore these kinds of webmaster-led signals - things like dateModified - where they contradict what's actually on the page.
I recently posted an example of this with canonical tags, where Google had confirmed that if they determine pages with canonical tags to be non-equivalent, they will just ignore the canonical tag. I wrote a blog post recently about a test of this exact thing, and exactly what Google said would happen happened: we tried to get a page to rank for two sets of search terms using the canonical tag, and Google just completely ignored the canonical tag on the page. The reason I've highlighted this post from Google is that, in my experience, when there are things that are easily exploitable by SEOs that most people don't know about and aren't doing, Google tends not to be overly vocal about them - which is understandable, because they don't want to publicise an exploitable trick that more people will then use. Then there are situations like this one, where I think there is a mechanic SEOs can exploit to maybe get rankings they don't deserve, the cat's kind of out of the bag, and people are doing it quite a lot. Google currently doesn't have a robust way to deal with it, so what we tend to see is blog posts like this, where we get reaffirmations of best practice to try and shepherd people into playing by the rules.
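For reference, the canonical tag being discussed is also just a link element in the head. In this illustrative sketch - the URLs are placeholders - a variant page declares another URL as the preferred version, which Google treats as a hint it can ignore if it decides the two pages aren't near-equivalent:

```html
<!-- On https://example.com/red-widgets-cheap (placeholder URL) -->
<!-- Declares the preferred URL for indexing; a hint, not a directive -->
<link rel="canonical" href="https://example.com/red-widgets">
```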
In fairness, I do suspect this means Google is, as always, closing the net to catch people doing this, but as it stands, from what I'm seeing, it is possible for people to misuse this dateModified markup to get rankings they don't deserve.
Google My Business news!
Lastly we're going to close with some great news for businesses that have been having issues on the ‘Google My Business’ platform.
The 'Google My Business' platform is what powers the business information boxes that pop up when you do Google searches for brand names, and the information about businesses you see in the local map boxes when you do a search. It's a really powerful tool - it can help you get in front of customers really quickly - and of course, because of those two things, it's heavily abused: all the way from people stuffing keywords into their business names to try and rank, to people creating completely fictitious businesses that are essentially a front for something a bit more sinister - just straight-up fraud, basically. Until now, if you had problems with competitors breaking the rules on 'Google My Business', or with fraudulent spam listings, you'd have to go and create a post on the official 'Google My Business' forum and essentially hope one of the users there could come to your aid.
I know a lot of people that have had problems with business listings who may also have missed this launch, which is why I wanted to highlight it: Google has launched what they're quite clinically calling the 'Business Redressal Complaint Form' - again, it's linked in the show notes at search.withcandour.co.uk - and it's basically an easy-to-use form to grass up people that are breaking the rules. There's a good step-by-step guide I found on a site called BrightLocal, by Jamie Pittman. He explains that you essentially fill out the form, making sure you carefully read the guidelines linked to within it, as a Google staffer is going to manually review your request and judge your complaint against those guidelines. You need to make sure what you're reporting actually breaks the guidelines, and you can specify why it's misleading or fraudulent. You need to enter your own information even if you're a local marketing consultant or an agency representing other businesses: you'll need to enter your name and email address, and you have to select the fraudulent content in question - specifying whether it's the title, the address, the phone number, or the website, for example, that's fraudulent. It does appear that if there are multiple things wrong with a listing, you have to submit multiple forms. You've also got the opportunity to explain in detail why the content is malicious or fraudulent, and Jamie says he can't stress enough how important this level of detail is.
There was a recent webinar with Google Gold Product Expert Ben Fisher, who explained the importance of writing clearly, professionally and respectfully, and giving the absolute most amount of detail possible in relation to the guidelines. This makes the Google staffer's job easier and basically means your report is more likely to go through, be taken seriously, and be actioned. So that's there for you - use it responsibly.
Okay that's everything for this week, you can get the show notes and links to everything we've talked about on search.withcandour.co.uk. The next episode of Search With Candour is going out on Monday 1st April.
I'm Mark Williams Cook and I hope you'll listen again.