
Episode 120: DirectApply schema, Facebook tracking, GoDaddy listings and Google responding to fresh queries

What's in this episode?

In this episode, you will hear Mark Williams-Cook talking about:

DirectApply schema: The new property for JobPosting schema and new editorial guidelines from Google.

Facebook tracking: How iOS 14 tracking prevention is affecting advertisers in the real world.

GoDaddy listings: Google teams up with GoDaddy to enhance e-commerce inventory.

Googling fresh queries: A look at how Google responds to and ranks for changes in news and demand.

Show notes

DirectApply in JobPosting schema https://developers.google.com/search/blog/2021/07/job-posting-updates

Facebook Users Said No to Tracking. Now Advertisers are Panicking https://www.bloomberg.com/news/articles/2021-07-14/facebook-fb-advertisers-impacted-by-apple-aapl-privacy-ios-14-changes

Ref: ep 98 - “iOS14, ATT and Google passage ranking” https://withcandour.co.uk/blog/episode-98-ios14-att-and-google-passage-ranking

Transcription

MC: Welcome to episode 120 of the Search with Candour podcast, recorded on Friday, the 16th of July, 2021. My name is Mark Williams-Cook, and today we're going to be talking about directApply in JobPosting schema, the impact we're seeing of iOS 14 on iPhones and tracking, GoDaddy's e-commerce rollout with Google, and actually some more algorithmic stuff with Google about how they handle fresh and critical content. So loads in this episode for you. Before we kick off, I would love to tell you this podcast is sponsored by the wonderful people at Sitebulb. Sitebulb, if you haven't heard of it, is a desktop-based SEO auditing tool for Windows and Mac. It's been around for a while now, and it's really had a big impact on the SEO community. It's an incredible tool for auditing your sites. They've recently released their newest version, which includes a really nice ability to check your Core Web Vitals at scale.

So before this, there were a few other ways you could do this, including using command line interfaces to go through all of your pages, but Sitebulb takes care of all of this for you. Of course, we've got the page experience update still rolling out at the moment, so it's a really good time to do this. With Sitebulb you can just set it crawling your site, and it'll do these lab tests on every single page, group the results together for you, and give you the feedback you need. If you're listening to this podcast, there's a special offer for Search with Candour listeners. If you go to Sitebulb.com/SWC, that's Sitebulb.com/SWC, you'll get an extended 60-day trial of the software. No need to put in your credit card details or anything like that, so you're completely free to give it a go and see if you like it, which I'm sure you will. Sitebulb.com/SWC.

Okay, we're going to kick off talking about JobPosting schema, and this is a really cool little find that only surfaced three days ago. I found it through Carl Hendy from the wonderful agency Reddico, who highlighted this on Twitter. There is a post about this, which we'll link to in the show notes, which you'll find at search.withcandour.co.uk. It's on the Google Search Central blog, titled 'Updating job posting guidelines to improve quality of results for job seekers'. I'll go through the post, but first, just to give you the headlines: there is a new directApply property which you can add to JobPosting schema. Essentially, if you have the ability on your site to directly apply for the job, i.e., fill out an application, you can add this property to verify that this is a URL where you're able to do that, and get a special listing in Google. There's also some helpful advice here from Google about job listings in general and getting your job listing pages indexed and ranked in Google.

So I'll read the post to you and then go through the details of the directApply property. "Searching for a job can be a time-consuming process, and the outcome of the application may be life-changing. That's why providing job seekers with authentic, fresh and trustworthy content when they come to Google Search is our top priority. To better understand our users' perspective, we asked tens of thousands of job seekers around the world to tell us more about their experience when applying to jobs online. Based on this feedback, we were able to identify common themes to help improve the quality of our results. Today we're announcing a new structured data property and new editorial content policy." So the new structured data property is what I just spoke about, this directApply property. The directApply property is an optional way to share whether your job listing offers a direct apply experience. Google defines a direct apply experience in terms of the user actions required to apply for the job, which means the user is offered a short and straightforward application process on your page.

You would likely offer a direct apply experience if your site provides one of the following: one, the user completes the application process on your site; or two, once arriving at your page from Google, the user doesn't have to click on apply and provide user information more than once to complete the application process. So fairly straightforward, but very, very useful if you are listing jobs on your site and using that schema. Then there are the details of the new editorial content policy. To ensure users can understand your content and easily apply for the job, Google is adding a new editorial content policy for job postings on Google Search. The new editorial content policy includes guidance around obstructive text and images, excessive or distracting ads, or content that doesn't add any value to the job posting. Job listings should also follow basic grammar rules, such as proper capitalisation.
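
To make that concrete, here's a minimal sketch of what JobPosting markup with the new directApply property could look like, built in Python purely for illustration. Every listing value below is a hypothetical placeholder, not anything from Google's post:

```python
import json

# Minimal sketch of JobPosting markup using the new directApply property.
# Every value here is a hypothetical placeholder, not a real listing.
job_posting = {
    "@context": "https://schema.org/",
    "@type": "JobPosting",
    "title": "Software Engineer",              # hypothetical role
    "description": "<p>Build and maintain our booking platform.</p>",
    "datePosted": "2021-07-14",
    "validThrough": "2021-08-14T00:00",        # when applications close
    "hiringOrganization": {
        "@type": "Organization",
        "name": "Example Ltd",                 # hypothetical employer
        "sameAs": "https://www.example.com",
    },
    "jobLocation": {
        "@type": "Place",
        "address": {
            "@type": "PostalAddress",
            "addressLocality": "Norwich",
            "addressCountry": "GB",
        },
    },
    # The new property: True signals that the page offers the short,
    # direct application experience described in the post.
    "directApply": True,
}

# Prints the JSON-LD payload you would embed in a
# <script type="application/ld+json"> tag on the job page.
print(json.dumps(job_posting, indent=2))
```

Note that directApply is a boolean, so you would only set it to true when the page genuinely meets one of the two conditions above.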

Google says this will help them improve the quality of their results and develop new functionality within the product. To provide sufficient time for implementation, the new editorial content policy will go live on October the 1st, 2021. They've also given us this really nice overview of the feedback they got from job seekers. Based on their research findings, you can improve job seekers' trust by addressing the following aspects of your site. One, verify there are no scammy or spammy job posts on your site; these are job posts that don't represent a real job opportunity. Make sure that you only mark up pages with a single and actionable job opportunity. Two, ensure a good user experience. According to their users, sites with poor user experiences are those that ask for user information when it's not necessary, have poor quality pages, for example, excessive or obstructive ads, and/or have a complex application process, for example, one that leads through many redirects. Poor user experience also reduces application completion rate.

Three, remove expired job posts. Don't leave a job post open if it is no longer accepting new applications. There's a little bit more text to that, but I think that one's fairly straightforward. Four, make sure the job's posting date is genuine. Users use freshness as a signal to assess whether a position accepts new applicants, their chances of getting hired, the attractiveness of the position and more. Don't mask old jobs as new ones, and don't update the datePosted property if there's no change to the job post. That's quite an interesting one. And lastly, don't include wrong or misleading information in the job post or the markup. This includes incorrect salary, location, working hours, employment type, or other specific job details. So a lot of these would be fairly common sense, you'd hope, but we've all seen job sites do this with really old expired job ads and sites that are crammed with adverts as well.
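
Points three and four lend themselves to an automated check. Here's a rough sketch of the kind of sanity check a job site could run over its own listings; the previous* and contentHash fields are invented purely for this example:

```python
from datetime import date, datetime

def check_job_post(post: dict, today: date) -> list:
    """Flag the two freshness problems from the guidance above:
    expired posts left open, and datePosted bumped with no real change."""
    problems = []

    # Expired posts: a validThrough date in the past means the listing
    # should be closed or unpublished, not left marked up as open.
    valid_through = datetime.fromisoformat(post["validThrough"]).date()
    if valid_through < today:
        problems.append("expired: validThrough is in the past")

    # Genuine posting dates: if datePosted moved forward but the content
    # itself is unchanged, the job is being masked as new.
    # contentHash / previous* fields are invented for this sketch.
    if (post["datePosted"] != post["previousDatePosted"]
            and post["contentHash"] == post["previousContentHash"]):
        problems.append("datePosted updated with no change to the post")

    return problems

# A hypothetical listing that fails both checks:
stale_post = {
    "validThrough": "2021-06-30T00:00",
    "datePosted": "2021-07-14",
    "previousDatePosted": "2021-02-01",
    "contentHash": "abc123",
    "previousContentHash": "abc123",
}
print(check_job_post(stale_post, today=date(2021, 7, 16)))
```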

What's interesting about these guidelines is that they mean Google is obviously now looking at ways they can algorithmically discount these sites. As they said, you've got a little bit of time, until the beginning of October, to sort things out if you think any of these apply to you. We'll probably set up some monitoring on some job sites to see if we can spot any impact when these new editorial content policies go live, and then I can do a follow-up maybe in November, or near the end of the year. But really interesting information: new schema, new editorial guidelines for job listings.

Something we've spoken about before on this podcast, especially in relation to pay-per-click, is iOS 14. Way back in episode 98, we did an episode entitled iOS 14, ATT and Google passage ranking. You'll find a link to that episode in the show notes. We discussed the forthcoming changes that were planned in iOS 14, Apple's iPhone operating system software, which was going to start asking users if they would give each individual app permission to track them. And we were talking about the potential ramifications of this, because apart from individual apps, which obviously use that data, the real standout was Facebook, because Facebook is used a lot from people's mobile devices; it's very, very common for people to use Facebook on their mobile.

And apart from being able to track a user's interaction with adverts, to see if they're buying, or generating the lead, or doing whatever the advertiser wants them to do, this behavioural information about individual users, aggregated into groups, is applied at a very deep level when it comes to Facebook features like lookalike audiences. That's a way for advertisers to effectively build new groups of people, based on various algorithms Facebook has, to expand their marketing, basically. And it's very effective, actually; lots of people, even with limited skill sets in digital marketing, have managed to find success with Facebook doing this. Now we have seen these privacy changes roll out. We were originally talking about up to 90% of people opting out. It doesn't look quite that bad, but there is an article this month on Bloomberg, which, again, I will link to in the show notes, that really goes to show the impact this is having on advertisers.

The article is called 'Facebook Users Said No to Tracking. Now Advertisers Are Panicking'. People give iOS apps permission to track their behaviour just 25% of the time, according to research in the article. I'll just pull out one paragraph that summarises this, which says: "Facebook advertisers in particular have noticed an impact in the last month. Media buyers who run Facebook ad campaigns on behalf of clients said Facebook is no longer able to reliably see how many sales its clients are making, so it's harder to figure out which Facebook ads are working. Losing this data also impacts Facebook's ability to show a business's products to potential new customers. It also makes it more difficult to retarget people with ads that show users items they've looked at online but may have not purchased. A Facebook spokesman declined to share what percentage of its users have accepted the company's tracking prompt, but roughly 75% of the world's iPhone users have downloaded the newest operating system, according to Branch."

The further data they give in the article estimates that only one in four people is actually saying, yes, you can track me. So I'd be interested if anyone else has got any feedback on this. I actually had a conversation about this today, not by design. I was talking to another agency owner in the UK, and he was saying how this has caused him problems with his agency, just because clients, rightfully so, are now concerned that it looks like their Facebook campaign performance has dropped off a cliff, when actually what's happened is that a lot of these sales are now going unattributed, so we can't actually prove they're from Facebook. And as that paragraph points out, it then becomes difficult to do your job as a digital marketer, part of which is getting the greatest efficacy out of the spend and working out which ads and which targeting groups are working and which aren't, because if you don't have the data, or you're missing a large chunk of it at least, that becomes very, very challenging.
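
As a back-of-the-envelope illustration of why reported performance can fall off a cliff while real sales don't, using the roughly one-in-four opt-in figure from the article and otherwise made-up numbers:

```python
# Illustrative arithmetic only: why reporting can "drop off a cliff"
# without actual sales changing.
actual_sales = 1000        # hypothetical real conversions from iOS users
att_opt_in_rate = 0.25     # ~1 in 4 users allow tracking, per the article

attributed = actual_sales * att_opt_in_rate
unattributed = actual_sales - attributed

print(f"Sales the platform can attribute: {attributed:.0f}")     # 250
print(f"Sales that now look unattributed: {unattributed:.0f}")   # 750
# The campaign appears ~75% worse in reporting, even though real
# performance is unchanged.
```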

So, those changes have rolled out, and while we were talking about various potential fixes that were in the works, it doesn't look like any of that has happened. I'll be really interested to see how this pans out. The Bloomberg article, linked as I said, talks about the potential impact on Facebook's revenue and obviously share price. And this is because the direct impact will be that if advertisers can't get the same value from their spend, the same return, they will spend less, because they'll have to just go for the easier ads with the tighter targeting. So we could see a little shrinkage in the market there, because I think it's hard to overestimate the impact that Facebook as a platform has on the digital marketing world. I'm really interested to see how that plays out, and to see what creative solutions people come up with.

At this point of the show, I would like to introduce our sponsor, Wix, who have this update for you. URL customisation on Wix is now available on product pages. You can now customise URL path prefixes, or even create a flat URL structure if that floats your boat. Plus, Wix automatically takes care of creating 301 redirects for all impacted URLs. Full rollout coming soon. Also, fresh off the press: bot log reports. Get an easy understanding of how bots are crawling your site, without any complicated setup, right inside of Wix. There's so much more you can do with Wix. You can now add dynamic structured data, upload redirects in bulk, including error notifications and warnings, and fully customise meta tags and the robots.txt file. You can get instant indexing of your homepage on Google, while a direct partnership with Google My Business lets you manage new and existing business listings right from the Wix dashboard. Visit Wix.com/SEO to learn more.

I haven't seen this mentioned elsewhere, but I thought it was interesting. I caught this announcement three days ago, on July the 13th: there is a new integration for GoDaddy merchants, which means they can show up across Google. We've covered this kind of thing a couple of times now, when we've talked about Shopify and about Google opening up shopping listings for free organically, and we've talked about the reasons they're doing that in terms of competing with Amazon and trying desperately to expand their inventory, so that when people have shopping intent, and maybe they're not after one specific product, they don't just go straight to somewhere like Amazon and start shopping. Google want you to stay there, and they want you to have this in-search experience where they're curating all of the different brands and products across all of these different merchants, organising them for you, comparing them for you, and ideally giving you a way to check out directly through Google. That's definitely the experience they're aiming for.

Whether it's good or not for the merchants in the end is another matter, and there's a whole discussion about merchants owning their customer and the value of them knowing your brand. But Google is going to play this as: this is what the user wants, because it's a faster, frictionless experience for the user; you don't have to worry about site speed or site security, everything's there. The downside is that it will make consumers more brand agnostic, or at least agnostic towards the actual merchant brands, and it might increase affinity with Google, as we've seen happen with Amazon. But this is what's happening. So there is a post, this time on Google's blog The Keyword, and of course we'll link to it at search.withcandour.co.uk. It's entitled 'In just a few clicks, GoDaddy merchants can show up across Google'.

And Google have written: "Shoppers get the most choice when they can easily discover businesses and their unique products." So, what we just spoke about really. "And when those products get discovered, businesses can connect with more customers. We see it as a win-win, which is why we're working hard to make commerce more open online. One way we're doing this is by teaming up with e-commerce platforms like GoDaddy. Starting today we welcome GoDaddy online store customers to more easily integrate their product inventory across Google at no additional cost. This means that GoDaddy merchants can now get discovered across search, shopping, image search and YouTube in just a few clicks. With this integration," and that links off to GoDaddy's site, "GoDaddy merchants can upload their products to Google, create free listings and ad campaigns," and I had a quick look at that: for those that qualify, Google will actually do a matched spend, "and review performance metrics, all without leaving GoDaddy's online store. By teaming up with platforms like GoDaddy, we're able to help even more businesses make connections with shoppers who are eager to discover new brands."

So, if you've got a GoDaddy store, it's a no-brainer that you'd want to do that. As I said, there's definitely a downside to this for merchants, I think. However, if you don't do this, you will be at a disadvantage. So there is that; it's essentially a prisoner's dilemma. If nobody did it, then that would be fine, but the fact is there's an advantage to doing this thing, so some people will do it, which of course then pushes everyone to take part. So this is happening again: more expansion of this Google inventory for shopping.

This I found super interesting. Google published a video; it's already had almost 30,000 views, and it was only published four days ago, on July 12th. It's called 'How Google Search reacts in critical moments', and it's about how the algorithm reacts when, essentially, fast news happens and results need to change quickly. So I'll let you listen to the video. It's only a couple of minutes long, and then we'll have a quick chat about it.

Whenever there's a significant event, we want people to be able to find reliable information as quickly as possible. To do that, our search systems are designed to recognise if a query is trending and surface fresh content in real-time, particularly during a crisis. So if something major is happening near you, we can quickly give you helpful and reliable results. Critical context, like the time and location of searches, helps us determine if something is a trending search, like when news is breaking. For instance, let's say you start searching for fire near me, and a lot of other people in your area are looking up the same thing. When our systems detect this uptick in similar searches and can also see there's a lot of fresh content available regarding fires, we're able to recognise this local fire as a trending search. This means results like Top Stories, which link you to local or international news, will be found towards the top of the page for searches about the fire.

Same with SOS alerts, which make essential information more visible, so you can easily find things like maps of the affected area, emergency phone numbers, or donation opportunities. Of course, when it comes to providing real-time updates in critical moments, ensuring the quality of the results we're surfacing is extremely important. It's why we weigh signals of authoritativeness more heavily during situations like these. Designing our systems to respond to all these factors is how we're able to provide up-to-the-minute results that are relevant to your search in nearly real-time, connecting you to reliable information when you need it most.

So there's actually quite a lot to unpack there. There are some things most of us won't find that surprising, like how Google uses trending data, the locality of trending searches, and the amount of new content appearing, to work out what kind of results they should be showing. We've spoken about this before in several different strands. We've also talked about how the intent behind a query shifts. One of the examples we commonly use is Halloween: close to Halloween, we tend to see e-commerce sites ranking, and at other points in the year we tend to see sites that give more of the history of Halloween, because the actual intent behind that search changes. And we see something similar with this concept that a certain query deserves freshness, which is that if I google a specific term that's chronologically sensitive, I would ideally like to see more up-to-date news.
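
Google hasn't published the mechanics of this detection, but a toy sketch of the two signals described in the video, an uptick in similar searches plus a burst of fresh content, might look like this, with completely invented thresholds:

```python
def looks_trending(searches_this_hour: int,
                   baseline_per_hour: float,
                   fresh_docs_this_hour: int,
                   spike_factor: float = 5.0,
                   min_fresh_docs: int = 20) -> bool:
    """Toy combination of the two signals from the video: an uptick in
    similar searches plus a burst of fresh content. Thresholds invented."""
    query_spike = searches_this_hour > spike_factor * baseline_per_hour
    fresh_burst = fresh_docs_this_hour >= min_fresh_docs
    return query_spike and fresh_burst

# e.g. "fire near me" in one area: 400 searches against a baseline of 10,
# with 35 fresh articles published, so it's treated as trending and
# something like Top Stories gets surfaced.
print(looks_trending(400, 10.0, 35))  # True
```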

And as we heard in the video, the most obvious and common example is news; what's trending is normally news, so that content being fresh is important. The more interesting part of this is when we get more unpredictable trends in searches, which might be, for instance, a certain subject coming up which then changes which sites should be ranking for that query, because people are doing a search, or there's a trend of searches, and they're maybe not looking for what is currently the top result. And I have some casual observations I want to share with you about this. One is the very commonly brought up topic of click-through rate in searches. Lots of people believe that click-through rate is part of Google's core algorithm, and that the higher the percentage of people clicking on your result when they google something, the better. You can see the logic as to why that might be true, but maybe when you think about it a bit deeper, you start to realise there are a lot of problems with that.

Obviously, a couple of different people at Google, Paul Haahr and Gary Illyes off the top of my head, have stated that that's just plain not the case, and in Paul's instance, in a great talk he did, he went into why using that data in a core algorithm can be trickier than it might first seem. But, on the other hand, we have seen instances where people have done fairly decent tests and demonstrated that they've been able to manipulate how sites rank for specific queries by fudging the click-through rate, by forcing it to be higher. And this, I think, sits in these layers that sit on top of Google's core algorithm. The interesting thing about these ranking changes we've seen when people have manipulated click-through rate has been that, in all the cases I've seen at least, the changes have been temporary.

So they have managed to improve the ranking of a specific page for a query, but it's only lasted, from what I've seen, a matter of days, and as soon as that trend dies out, the page goes back to where it was. And I think this is one of those bits that sits on top of Google's core algorithm, which is what they're talking about here. They still have that core ranking of, okay, these are the sites we think should rank for this query, but then we know there are additional layers that manipulate the actual results; for example, featured snippets are generated after the rankings are processed. To me, that's where this would fit in. So Google's saying, well, okay, this is how I think the sites should be ranked, but everyone seems to be interested in this at the moment, so just for now, we'll surface this, because we think this is in line with the trend.
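
One speculative way to picture that layering is a core score plus a temporary, interest-driven boost that decays over time. This is only a sketch of the behaviour observed, not anything Google has described; the formula and numbers are invented:

```python
def blended_score(base_score: float, trend_boost: float,
                  hours_since_spike: float,
                  half_life_hours: float = 48.0) -> float:
    """Speculative sketch: a core ranking score plus a temporary,
    interest-driven boost that halves every half_life_hours, so a page
    drifts back to its base position as the trend dies out."""
    decay = 0.5 ** (hours_since_spike / half_life_hours)
    return base_score + trend_boost * decay

# Fresh spike: the page gets the full boost while interest is high...
print(round(blended_score(0.40, 0.30, hours_since_spike=0), 3))    # 0.7
# ...which is mostly gone six days later, matching the pattern of
# rankings reverting after a matter of days.
print(round(blended_score(0.40, 0.30, hours_since_spike=144), 3))  # 0.438
```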

And certainly, as we talked about before when we had Lily Ray on, and as they say in the video, for specific critical information they lean towards more authoritative websites. Again, Google has been circling around this kind of expertise, authority and trust, which is a whole set of different metrics that Google is using to judge that, both on and off page. The other, I guess, casual observation I wanted to bring up, and this is a super casual one, is that I was doing some link analysis work with someone this month, and one of the datasets we were looking at was Majestic's Trust Flow and Citation Flow metrics, which they calculate for pages. I was commenting on how there was a fairly strong correlation in the dataset we were looking at between how pages ranked and Majestic's Trust Flow, but it wasn't the case with Citation Flow.

This meant that, as we were looking at these results, the sites were basically ranked in order of Majestic's Trust Flow, but the Citation Flow was all over the place. The super simple way to think about this is that Trust Flow looks more at how authoritative the site is, i.e., how trusted it is, whereas Citation Flow looks more at just the frequency with which it's talked about and cited. So, number of links versus what you might call authority, right? And I was commenting on how this used to be, maybe not completely the other way around, but definitely several years ago I would have seen a much stronger correlation with the citation metric rather than the trust metric.

And going hand in hand with this, we were looking into some specific cases of websites that we knew were buying links, and pretty rubbish ones, to be honest. The interesting thing we saw was that these sites buying poor, rubbish links, which is obviously against Google's guidelines, were seeing very fast ranking improvements, but they were also temporary. So we were seeing them jump up for the specific terms they bought links for, and then essentially they would go back to where they were after a couple of weeks. And we were talking about the possibility of a similar type of signal being used for links, in that you've obviously got this big link graph that Google is looking at and running various calculations on, stuff like PageRank.

And it would make sense to me that if a whole bunch of new links appeared to a specific page, mentioning it for a topic, this is a similar signal to the ones we've just been talking about, right? In terms of, oh, hey, lots of people are searching for this, or lots of people are clicking on this; there seems to be interest here. So it makes sense to me that Google might adjust that dial temporarily, saying, oh, hey, we've suddenly found all of these new links. We haven't looked into them yet, but let's just bump this ranking up a little bit, because there's a fair chance it might be important. But what happens then, I would guess, is that as those links fit into Google's link graph, and by that I mean all the sites and pages that link to those pages are re-crawled, Google runs that calculation and can say, well, actually, we can see these are pretty crap links, right? They haven't got links from any other good sites, or they're all from unrelated sites, or they're links that look dodgy.

And Google's line on what they do with such links is that they ignore them. So they don't pass any equity, if you want to call it that, over those links, and thus the rankings go back to what they were. But then this leaves us with, I guess not a final question, but another question: well, we've heard about people using disavow without a penalty. So they haven't got a manual penalty from Google, but they've disavowed what they thought were a chunk of bad links, and they've seen a ranking improvement. Why would that make any sense if the statement that Google simply ignores bad links is true? I think that statement is true, but it's not the whole story. I have heard Google comment on this before, which is that you may have, say, a poor backlink profile, a whole bunch of links where Google says, okay, well, this is definitely paid, or this is definitely a bought link, or I don't like this link for whatever reason; Google doesn't like those links and thinks they're spam.

Those links are not going to directly, negatively impact your rankings, but there will also be maybe this bell curve, this chunk of links in the middle, that Google, or other search engines, may be on the fence about, in that they don't look great, but the search engine isn't really sure whether they're paid. Having lots of bad links might then reflect poorly, or change the probability, that the rest of those on-the-fence links are good or bad links. So it makes sense that if a search engine works out you definitely have a large proportion of dodgy, bought, paid, spam links, then for the ones that are on the fence there's a fair chance that they also aren't good links, so it may discount those links as well.

Whereas in the opposite case, if your link profile is super good and you're only getting links from good sites, and it doesn't look like you've been spamming or bought any links, if you then have these links on the fence, I think it's more likely that they will be counted and you will get credit for them. That fills several gaps for us. It means the statement that Google ignores bad links, i.e., doesn't give you direct penalties or demote your site based on these individual links, unless you get a manual action of course, is true. It explains why, in certain cases, we've seen people who have disavowed chunks of bad links then see an uplift afterwards. And it makes sense in terms of even a human model: if somebody you know tells lots of lies and they tell you a story that's hard to believe, you're likely to discount that as a lie too. Whereas if somebody you know is always telling the truth and they tell you a story that's hard to believe, you will probably give them the benefit of the doubt.
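
To put that on-the-fence idea into a toy model, and again this is purely speculative rather than anything Google has confirmed, imagine that the share of borderline links that get credited scales with how clean the already-judged part of the profile looks:

```python
def counted_borderline_links(known_good: int, known_bad: int,
                             borderline: int) -> float:
    """Purely speculative sketch: the share of on-the-fence links that
    get credited scales with how clean the judged part of the profile is.
    The formula is invented for illustration."""
    judged = known_good + known_bad
    profile_trust = known_good / judged if judged else 0.5
    return borderline * profile_trust

# Clean profile: borderline links mostly get the benefit of the doubt.
print(counted_borderline_links(known_good=90, known_bad=10, borderline=40))  # 36.0
# Spammy profile: the same borderline links are mostly discounted,
# which would also explain the post-disavow uplift described above.
print(counted_borderline_links(known_good=10, known_bad=90, borderline=40))  # 4.0
```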

So I just thought that was a really interesting concept to go over, especially as Google has started talking about this and opened up a tiny bit more; there's obviously no technical detail in there about how they're treating queries that deserve freshness, especially in these critical moments. And, of course, Google has made a bunch of statements and improvements around really important results, like the stuff with COVID and marking what's been fact-checked and what hasn't. So lots of improvements there, and lots of things to think about in terms of the range of signals Google might use, and especially what we can do with links. That's all we've got time for in this episode. I really hope you've enjoyed it and that you'll tune in again next week. We'll be back on Monday the 26th of July. As usual, if you are enjoying the podcast, please do subscribe, tell a friend, all of those lovely things. And, of course, as usual, I hope you have a lovely week.
