Does Google use user behavior metrics for ranking
Does user behavior impact rankings? What has the Department of Justice’s antitrust trial involving Google uncovered about user behavior and rankings?
This week, Wix’s Mordy Oberstein and Crystal Carter investigate how much Google directly utilizes user behavior to impact rankings and what that should tell us as SEOs.
SEO Content Strategist at Search Engine Journal, Shelley Walsh, joins the show to cover the Department of Justice antitrust lawsuit against Google and what was revealed there about rankings and user behavior.
The verdict is in as we pass judgment on user behavior’s impact on Google rankings on this episode of the SERP’s Up SEO Podcast!
Episode 71
|
January 24, 2024 | 49 MIN
This week’s guests
Shelley Walsh
Shelley Walsh is a digital consultant with over 20 years of creative, marketing, and tech experience. In the last 14 years, she has been published extensively across industry publications and has spoken at many events, online and offline. She works with SEJ as their SEO content strategist and as an SEO content consultant for brands in the UK and US to help them achieve measurable results. Shelley also produces The Pioneers, a series of interviews with influential people about SEO and the early days of the industry.
Notes
Hosts, Guests, & Featured People:
Danny Sullivan, Google's Search Liaison
Resources:
Ex-Google Search head worried his team was ‘too involved with ads’
What Pandu Nayak Taught Me About SEO
Why SEOs should watch content trends carefully
News:
Google ‘Circle to Search’: The New AI-Powered Search Gesture For Android
Google Ranking Teetering In & Out Over Weekends - Bug Or Edge Of Quality?
Transcript
Mordy Oberstein:
It's the new wave of SEO podcasting. Welcome to SERP's Up. Aloha, mahalo for joining the SERP's Up Podcast to get some groovy insights around what's happening in SEO. I'm Mordy Oberstein, the head of SEO brand here at Wix. And I'm joined by the incredible, the absolutely unparalleled, uncompromising, but in a good way, head of SEO communications, Crystal Carter.
Crystal Carter:
Hello, internet people. I hope everyone's having a fantastic web-based, digital, listening-to-a-podcast day. I hope the sun is shining and that you have nice snacks wherever you're listening.
Mordy Oberstein:
Well, if you enjoy rain, I hope that it's raining, and you're cozy under a blanket.
Crystal Carter:
Some people do enjoy rain.
Mordy Oberstein:
Some people do.
Crystal Carter:
Some people do. My kid likes rain. It's raining, and he's like, "Yay." And I'm like, "Let's go home." And he's like, "No, it's fun." And I'm like-
Mordy Oberstein:
Splash-in-puddle time.
Crystal Carter:
Yeah, that's true. That's true. This is it. This is it. You got to take the rough with the smooth.
Mordy Oberstein:
Without thinking it through like, "Oh, now my shoes are soaked."
Crystal Carter:
Indeed, indeed. People don't gauge how deep the puddle is. That's also a risk, that sometimes you think, "Oh, that's a little," and then it's like a lake. It's like a chasm. It's like some sort of abyss, and you've just plunged yourself into it.
Mordy Oberstein:
So I don't mind, that's fine. Let the kids get wet, whatever. But then when they get muddy and the mud is stuck to the shoes, that's my limit. That's when I'm done. Now I've got to clean that thing. Forget it.
Crystal Carter:
It's a lot of work. It's a lot of work.
Mordy Oberstein:
The children you have. The SERP's Up Podcast is brought to you by Wix, where you can not only subscribe to our monthly newsletter Searchlight over at wix.com/seo/learn/newsletter, but where you can also grow your website if you're a lawyer. Insert many lawyer jokes here. Yep, LegalZoom and Wix have a partnership to help lawyers build their websites so that you can focus on what you do best. You might be asking yourself, "What does this have to do with me?" For this audience, probably not much, to be honest with you. But it's good to know anyway because it's got a lot to do with this week's topic, and that's user behavior and SEO. What does that have to do with lawyers? Well, we're going to explore the complex history and recent revelations around user behavior for SEO from Google's DOJ trial. Hence the LegalZoom thing. Now you get it? Now it makes sense.
Back to the intro. We're going to explore the complex history and recent revelations of Google's DOJ trial and maybe set the record straight about how user behavior impacts rankings, such as, what has Google said about user behavior and how it impacts rankings? What's the role of machine learning in user behavior and rankings? Have the revelations from Google's DOJ trial changed how we should look at user behavior and rankings? What has Google said about the leaks from the DOJ trial around user behavior and rankings? To help us sort out the complications of user behavior and rankings, Search Engine Journal's own Shelley Walsh weighs in. We'll also have a look at Google's PAA box and what it says about behavior. And of course, we have the snappiest of SEO news for you, plus who you should be following on social media for more SEO awesomeness.
So tuck in your shirt, comb your hair, and be on your best behavior as a hard look at what's changed about user behavior and SEO is coming over for dinner tonight on this, the 71st episode of SERP's Up. Whoo, boy, I'm not feeling good, and that hurt my throat. Breaking the fourth wall.
Crystal Carter:
Breaking the fourth wall.
Mordy Oberstein:
I should have got a cup of tea. I don't do tea.
Crystal Carter:
I feel like, though, breaking the fourth wall is what this trial has been about. It's been really, really interesting.
Mordy Oberstein:
Oh, boy.
Crystal Carter:
There's a lot of juicy morsels coming out in the documents, and people have been watching it very, very carefully. And I think that it's some of the best information we've had-
Mordy Oberstein:
In a long time. Some of the coolest information, yeah.
Crystal Carter:
From Google about exactly some of the inner workings of how they do what they do.
Mordy Oberstein:
That said, it's a little bit of a hotbed of SEO controversy. So we hope, I hope at least, that as we go through this, we do not upset anybody on either side of this argument, but offer what's going on, offer our takes. But it's SEO at the end of the day, and it's just a healthy look at what's happening out there. So I guess we should take a look back at the history of all of this because ... Wait, even further, if this is your first gander at user behavior metrics, just so it's clear what we're talking about, we're talking about things like behaviors that users do on Google, like they click on a result, or how long the user stays on a page, like time on page, how many pages the user may have visited during that session. All of this is tracked by Google Analytics, or Google otherwise knows all of this.
Or maybe you went to a page and you said, "Wait a second, I don't want that." And you went back to Google and clicked on something else. What was the last click of that session on the SERP? These are all behavior metrics that Google is monitoring, and the question is, does that impact the ranking? So really crazy case, a certain URL or a certain result on Google gets a whole bunch of clicks over and over and over again. Does Google say, "Hey, I guess we got it right, let's keep this ranking really high?" That would be user behavior playing itself out for rankings. And Google has said for a long time, "No, we don't look at things like that," because it sounds really simple, but it is really complicated. Because maybe, for example, you clicked on a result from Google, you spent one second on the page and you bounced back to the SERP. Does that mean it's bad?
Or maybe you were Googling, "Is it raining outside?" Saw the answer is yes on the webpage. By the way, completely ignoring that Google tells you that on the SERP itself. Well, let's just pretend that Google didn't. You go to the webpage, it says, "It is raining outside," and you go back to do another search. That's called pogo sticking, by the way, in case you didn't know. Maybe that's good. Maybe that means the page was great. It told me the information really quickly. So user behavior metrics get really complicated really quickly, and they're messy, which is why Google's saying, "No, we don't use them." But SEOs have been saying for a long, long time, "No, no, no, Google has this data. They've got to be using this in some way." Bing says they do, which further puts that thumb on the scale, and maybe Google really is, despite the fact that they say no, we're not.
Now, where it gets really complicated is that Google has said, "Well, we kind of do indirectly." So for example, Google trains machine learning properties like RankBrain on, or with rather, user behavior. If a user clicks on a page, and let's use the really simple case everyone always uses: a recipe. You go to the web page, and the recipe has no image. You bounce and you go to a page with an image. The machine learning is now trained on the fact that if it's a recipe, it needs to have an image. We're not going to rank you. I'm oversimplifying it, but that would be user behavior impacting rank, albeit indirectly.
Crystal Carter:
So one of the things that was interesting in this document, there was a document that was shared from the files that was an internal document about search, and it was like, "It's a 2016 Q4 search all hands." And one of the things they said was something around that response. So they said in this document, "We don't understand documents. We fake it." This is slide five in the document, and it says-
Mordy Oberstein:
So a little bit of context. So Google is being sued by the Department of Justice. I'm not getting into any of that. All we're talking about is some of the testimony, I don't know if there were leaks or that it's been published, whatever. There has been information, or statements, from that trial about how search works that have been covered by Search Engine Land, SE Roundtable and SEJ and so forth. So we're talking about a leak that comes from a 2016 Google PowerPoint. Sorry, now we have context.
Crystal Carter:
So in the PowerPoint it says, "We don't understand the documents. We fake it. Today, our ability to understand documents directly is minimal." I should also caveat that this is from 2016, and in internet terms, between 2016 and 2023 is a vast amount of time.
Mordy Oberstein:
To give you context on that, RankBrain came out in 2015. That was Google's first machine learning property. So you're talking a year after RankBrain. That's a long time ago.
Crystal Carter:
And between that, we've had tons of other algorithms that have come up, BERT, MUM and ELMo, I think as well, in there.
Mordy Oberstein:
And then there's also MUM, and MUM's not really integrated yet, but only a little bit. So there's been a whole bunch of really advanced machine learning properties.
Crystal Carter:
But in 2016 they said, "We watched how people react to documents and memorized their responses." And they said, "We look at people." So they said, "Beyond basic stuff, we hardly look at documents. We look at people. If a document gets a positive reaction, we figure it's good. If the reaction is negative, it's probably bad. Grossly simplified, this is the source of Google's magic." And I think the other thing that's important to remember is that between that time and now, Google's got a lot more data from mobile users, from Chrome users, for instance. So Chrome users, they'll have more documents, or sorry, more information about how users are using things when they're logged into lots of Google properties and when they're logged into lots of Google elements. So I think that that's important to think about as well. But one of the things that Danny Sullivan talked about at Brighton was that they've always paid attention to users, right?
Mordy Oberstein:
So yeah, that's the point: Google has said for a long time they've always used user behavior metrics in this way to train machine learning. So yes, it does. But what they're saying is that it creates a profile around the content and around the intent of the user. So they're not saying the clicks, or whatever user behavior metric they're looking at (and Google actually, because of the DOJ leaks, has shown some of the metrics they might be using, like first click, last click, scrolling, that kind of thing), they're not using it to say, "This page that got clicked on will rank." What they'll do is they'll say, "What was it about this page that seemed to entice the click that the user wants? And would that apply to other similar pages? If yes, this is a new profile for this kind of query and this kind of page."
So it's creating a profile as opposed to saying the particular page will rank. So nothing's actually changed from the way Google's looking at this. So when these leaks have come out, and there's been a whole bunch of them around user behavior, Google's attitude has been like, "Yeah, that's nothing new. We've been saying this the whole time." And SEOs have been like, well, maybe not.
Crystal Carter:
Right. What I think is also interesting is that in this document they said, "We have to design interactions that also allow us to learn from users." And I think that this is interesting because we see a lot more filters on the search as well. So I think that I pay a lot of attention to SERP features, and certainly from a tech SEO point of view, being able to trigger SERP features and being able to get content to be more visible in SERP features is something that is very often a technical requirement. And that I find really, really interesting because those interactions can guide your content, your content creation, and the interactions that they create when they add more people also ask. When they add a filter, for example, when they add things like that, that gives us the guide to the kinds of signals and the kind of information that Google's looking for from users and for content.
Mordy Oberstein:
Ironically to your point, I'll try to pull it up here because I'm trying to remember exactly the phraseology, but that was, I believe, another leak from, I think it was Ben Gomes who said this. I don't remember exactly who it was, but there was a specific request to have more filters added to the SERP in order to incentivize users going to more result pages so that there'd be more ad impressions.
Crystal Carter:
So that's been very interesting about the whole thing. And I think that with regards to ads and search, if you talk to Google folks, they'll tell you that never the twain shall meet, the teams don't overlap that much between discussions of tactics and things like that. But having worked with a digital marketing team and a digital marketing agency where we did ads and where we also did search, the experience is different, but also you very often do see some overlap in terms of things.
One of the things that comes to mind at the top of my head is Title Tag-ageddon, if anyone else remembers that; it's when Google started changing all the title tags. And everyone was like, "They're changing my title tags. What are we doing?" Well, in the ad world, they've been doing that with dynamic search ads for years. So you would give them a URL, and then they would write the title tag for the ad. And they were able to do that for years before Title Tag-ageddon. And so I think it's really interesting when we see that and when they're also talking about how the two things are interplaying.
Mordy Oberstein:
So AJ Kohn has a nice piece going through some of the various outcomes or leaks from the trial, and one of his recommendations was, "Well, make sure you get the click with how you structure your title tags," which plays into your point before. If they're rewriting them, how do you do that? If Google's focusing on user behavior, getting that click from the title tag, or from proper implementation of the title tag from a CRO point of view, would seemingly be, for lack of a better word, a ranking factor, which I guess is where the pain point or the controversy really emerges. You have SEOs who have seen a lot of the talk around Google, and we'll link in the show notes to what Danny Goodwin from Search Engine Land covers extensively in multiple pieces. You can have a look at what's actually come out and what they're saying specifically around user behavior that we haven't covered here, because it's a lot.
But SEOs are looking at all of these statements that Google is saying about user behavior, which again, Google is saying, "Old, not new, we're talking about training machine learning properties to create content profiling." But SEOs are saying, "Well, wait a second, maybe this actually means that we need to be focusing more on actual user behavior metrics. We've been right this whole time. We told you Google was using user behavior metrics, and we've been right the whole time."
So for example, Cyrus Shepard, who has covered this really well on Twitter, in a recent Search Engine Journal piece predicting 2024 SEO trends, wrote, "Well, I hate to say it, but in light of evidence pouring out from the US versus Google antitrust trial, it's become surprisingly clear how much Google relies on user behavior data to shape actual web rankings." And he goes on. On the other side, just to present both sides of the argument before I weigh in, people like Glenn Gabe and even Barry Schwartz are saying nothing's really changed. Barry wrote, "The association between observed user behavior and search result quality is tenuous. We need lots of traffic to draw conclusions, and individual examples are difficult to interpret." And that's Barry quoting, I believe, Danny Sullivan, Google Search liaison.
Crystal Carter:
What it does for me is it brings up the old bounce rate argument. There's some people that are like, "Bounce rate doesn't matter. It's not important."
Mordy Oberstein:
That's the whole thing.
Crystal Carter:
There's some people who are like, "No, it's really important." And I think that all of those user engagement things, they do impact. And for instance, so site speed, site speed is something that people say, "Oh, it's a ranking factor. It's not a ranking factor." But a slow site is terrible for users. If I go to a website and it doesn't load, then I'm not going to go back. I'm not going to hang around. I don't want to sit there and wait for your page to load. I have other things to do. So I think that it's something that, I don't know, it's telling you things you already know, but I think it's also just making it more explicit, I guess.
Mordy Oberstein:
So I'll say two things on that. One is exactly that. From a practical point of view, you want users to click to your website, you want users to be happy on your website and stay on your website. So from a practical point of view, nothing should fundamentally change, even if Google is using user behavior metrics, because that's what you're trying to do anyway, isn't it? Getting the user to click on the website and stay on the website and go to more pages on the website? So I'm like, "All right, great." I will say if you were to ask me, even to this day, if I really think that user behavior metrics are being used by Google in some of the ways, the very direct ways that some SEOs are saying, my personal opinion, and this is just me and many way smarter SEOs are going to argue with me, is no.
And I'll tell you why I think the answer is no. I think it's a chicken and the egg problem for me personally. So let's say you have 10 results that rank on Google. The top result is going to get the most clicks anyway. So now you'll say, "Fine, Google will look at last click." You went to the first result and said, "No, that's not great." And you scrolled down the page and you went to the eighth result. So now Google will rank the eighth result number one. Again, I'm oversimplifying this. What if the 30th result's really the best? What user behavior is Google going to use for that? Because no one's going to page three. No one's even going to page two. So you'll say, "Okay, Mordy, Google will run algorithm updates, unofficial updates where they do lots of testing, and they'll take something that was ranking number 30. And they'll test it at number five, and then they'll see if it gets clicks."
That's nice, but as someone who's been watching these trends super carefully for the better part of 10 years, most reversals like that last maybe, I don't know, two, three weeks at most. It's not enough time to train anything on anything, especially for most queries where the search volumes aren't astronomical. So you really don't have that data. So in which case, if you're going to use user behavior metrics, they're only going to help you with confirmation bias of what you already have in the top 10 to reshuffle the top 10. And even with the ranking shifts or the tests that Google's doing, I don't see how that's a big enough dataset, per what Barry quoted from Danny before, to actually make an accurate decision. So why would they do that?
Crystal Carter:
So here's the thing. When I hear that, the thing that occurs to me is the importance of content diversity, and I've been talking about a lot of multimedia content at different conferences and stuff this year, partially because of this, because I think that Google puts more interactive elements on the page for people to interact with. So let's say you've got that one that's on ranking number 30. If the one that's ranking number 30 has a video, and I know there's been some stuff about video, we'll see how that all pans out in the next little while.
Mordy Oberstein:
No comment right now.
Crystal Carter:
Right. Or if it has an image or if it has some additional elements like, say, review schema where it's got little stars or something like that. Or let's say it's part of an eCommerce thing and it's got different filters, then Google's going to have more signals in order to measure that piece of content than it will on another piece of content that's maybe just text, for instance.
Mordy Oberstein:
That's true. And if you're using user behavior metrics, then sure, then that's certainly possible. But to apply that across the web, even with that dataset, to me, it still seems very, very limited because you still don't really know. Imagine all the pages that don't have a video that are there, so that wouldn't fall into that. And I think it just becomes very, very complicated to produce a lot of data. What is possible is what we said before, and what Google's been saying before the whole time, that I can look at what users are doing in the top 10 results, and I could say, "Hey, what's the profile here? What did the user really want? What did they get? What did they like, what didn't they like?"
And then I can apply that profile to pages that are ranking number 30 and say, "Hey, does this actually meet the profile? It does. Maybe let's move that up and see how it goes." So I think that what Google's been saying the whole time of using it to create a content profile for machine learning makes total sense because that does tell people what's ranking number 30.
Crystal Carter:
Right. And I think this also goes back to the real importance of content quality and keywords in a sophisticated sense. The real importance of content quality is that if you've ever run anything with machine learning, you know it's garbage in, garbage out. You have to give it good data. So in terms of Google understanding your content, you have to give it good data that shows you know what you're talking about. So the quality of your content should reflect something that machines can actually get into. That means making sure you have a coherent piece of content that makes sense from start to finish, making sure that you have headings that are well-indicated to show what is the content priority, making sure that you've got technical things in the background to also add more elements, entities, et cetera, all of that sort of stuff. The real goal is to give them enough information about your content in order to make an informed decision about how people might interact with it. And the more signals that they have, then the easier it is for the bots to figure it out.
Mordy Oberstein:
Now, leaving aside your opinion of user behavior metrics, and again, I gave you my take, there's many takes, make your own decision. I don't want to put my thumb too heavy on the scale. I don't want to upset anybody. One person who has been doing a great job covering not just the DOJ trial but ranking factors, what is a ranking factor and what's not a ranking factor, for a good while now has been Search Engine Journal's Shelley Walsh. So we asked her, "Hey, you recently covered Danny Sullivan saying that Google's longstanding advice about how to approach things," like we said, "hasn't changed much, but some are saying that the DOJ's lawsuit against Google changed the way they see the role of user behavior in the algorithm. What's your take?" So here's Shelley's take.
Shelley Walsh:
What the antitrust lawsuit unearthed was that Google is measuring user interactions in four ways: mouse movements, clicks, scrolls, and whether the user enters a new query. This highlights how much emphasis Google does put on user signals. And why does Google do this? Because they know that the user is central to everything. So this is exactly what Google has been saying long-term, make pages for users and not search engines. The secret of SEO and certain visibility on Google is about building everything around the user. So if you can understand that SEO is structured around making it as frictionless as possible for a user to do what you want them to, then you understand the fundamental concept of how SEO works. And if you understand how a website works, how a search engine works and how users behave online, then connecting the user to the action is common sense.
It's not complicated, but it's really hard to do this well. What underlines everything is a need to understand the intent of the user and then to provide the best experience. As I said in the article I wrote about Danny Sullivan, this is based on how easily the page can be accessed, how directly the page answers the query, how comprehensively the page answers the query, and how intuitively the page is structured. This isn't to say that if you build a page, they will come. Domain, brand authority, topical relevance, user signals and links all have an influence on ranking, which is all reflected in the antitrust lawsuit. So there's no one-size-fits-all approach to SEO, and this is what makes it such an exciting, challenging space to work in.
Think about what is a user's motivation and make it as easy as possible for them to consume that information. And if you're thinking about all the things I just said, then you are naturally going to encourage a reader to click on your link in the SERP, scroll down your page to read more and less likely to search again around the same query because they are satisfied. And this is all aligned with what the antitrust lawsuit has highlighted.
I think there are many SEOs out there that have understood this for a long time, but it hasn't been widely discussed. So rather contrary to this being a new approach, actually it's just confirmed what a lot of SEOs knew all along. So what Danny has said with make pages for users, not search engines, is aligned with how good SEO has been practiced for years. But what I will say is in this age of rapid change, now more than ever it's definitely time to focus on what's timeless, and that is putting the user front and center. So I hope this really answers the question: what has been suggested to have changed about user signals hasn't really changed at all.
Mordy Oberstein:
Thank you so much, Shelley. Make sure you give Shelley a follow over on Twitter at The Shelley Walsh, that's the S-H-E-L-L-E-Y, Walsh, W-A-L-S-H, link in the show notes. It's interesting. It's just funny to me to watch SEOs. I kind of feel a struggle with this whole thing. On the one hand, Danny's saying one thing, the DOJ leaks maybe imply another thing. Is it really confirming what SEOs knew the whole time? Or is Google accurate in saying that nothing's really changed? But her point that, to a certain extent, it just doesn't matter is spot-on in a way. You want to do these things anyway, like we said.
Crystal Carter:
Yeah, I think people are hoping it would be juicier. I feel like we were hoping it would be like, "Yeah, what you really got to do is this or this."
Mordy Oberstein:
That's the God's honest truth.
Crystal Carter:
I think people were really hoping it would be a bit juicier, but I think that, yeah, I said it tells you what you already knew before, and she said the same thing.
Mordy Oberstein:
That's what I mean. I feel like if you were already in the camp that Google's using user behavior metrics, looking at clicks and time spent on page and pogo sticking and whatever, you're like, "Oh yeah, this is the proof right here." And if you are already in the camp like myself, if they're not using it, you're like, "Well, I don't see anything that's really indicative of saying that they are using it in the way that you're saying." So at the end of the day, nothing's changed.
Crystal Carter:
I think it's just the folks that struggle with the, this is a ranking factor and this is not a ranking factor and all of that sort of stuff. Chillax with that.
Mordy Oberstein:
Never. No.
Crystal Carter:
Pay attention to how users interact with your content. Pay attention to what's on the SERP because that also affects how users interact with your content. Pay attention to the quality of your content because that's what Google's paying attention to. Because I think one of the other things, there's a great article on SE Roundtable from Barry Schwartz that lays out the three pillars of ranking according to the DOJ documents: the body, what the document says about itself, that's not a surprise; the anchors, what the web says about the document, that's links from other websites, links within the document, links within the web page itself; and user interactions, what users say about the document based on their interactions. So these can include clicks, attention on a result, swipes on carousels, entering a new query.
Again, I talk about media a lot, but one of the great things about having multimedia on your webpages is there's more parts of the webpage for people to interact with on the SERP, for instance. So if you have a recipe, for instance, there's the reviews, there are the images, there's videos, and then there's the video tab, and then there's different elements of that as well. So if Google's seeing that lots of people are interacting with your content in lots of different ways, then that's going to give them more signals, and that all works really well.
I think also it brings to mind the real importance of long-tail with that sort of thing, because with long-tail queries, if it's that specific, users are more likely to actually click on it, because it's so specific, because it's so niche to exactly what they need. And so you're less likely to have competition with something that's been ranking for a very long time and has a lot of user data within Google's dataset. You're more likely to be connecting with something that has a new set of user data, particularly if it's niche and trending and all that sort of stuff, so you want to be-
Mordy Oberstein:
Anyway, if you're trying to target something niche, you need to be targeted and specific, otherwise you're not going to get the click.
Crystal Carter:
Yes, these are things we knew before.
Mordy Oberstein:
That's how it works, regardless of what Google's doing with it, which I think is nothing. Not nothing, I don't think it's direct. Sorry. Anyway, I don't want to stir the pot anymore on this. I feel like I've stirred the pot on this very ... If you're not in the SEO space, this is a very controversial topic, and I'm sure there are a bunch of SEOs muttering, "Mordy, you're wrong. You're just so wrong. You're dead wrong. I'm going to find you."
Crystal Carter:
Going to get you, get you, get you, get you.
Mordy Oberstein:
One way or another, we need to pivot. And what I wanted to do was take a look at how Google understands the concept of user behavior. So what I did was I went and I Googled "user behavior metrics DOJ trial," and I wanted to look at the PAA box. So guess what we're going to do now?
Crystal Carter:
Fun with people also ask?
Mordy Oberstein:
Fun with people also ask. We're going to have some fun. I know, Barry, fun with People Also Ask. So I Googled "user behavior metrics DOJ trial," and it's on topic because that's what we were talking about before. And what I got back was the PAA box, the four questions from Google that you can expand to get an answer and a link to a website. The People Also Ask box's four initial questions were: one, what are the Filip factors? I don't know who Filip is, and I'm not sure why it's spelled wrong, because they spell it with an F, but Philip is spelled P-H. Two, why Google is facing the DOJ in the first major tech monopoly trial. Three, what is the Google monopoly trial? And four, how is Google a monopoly company? I'll do one. Okay, wait.
Crystal Carter:
Can I be the thimble?
Mordy Oberstein:
Exactly. Yeah, because I want to be the car. I like the car. That's a kickass car, by the way.
Crystal Carter:
It's very good. That's a good car.
Mordy Oberstein:
I hope they never change that in the new sets. Maybe they have. Great, okay.
Crystal Carter:
You can get different sets, I think. We've got a Star Wars one. You can be the Millennium Falcon.
Mordy Oberstein:
Oh, that's cool. We have another one with dinosaurs, I think, somewhere.
Crystal Carter:
I want a dinosaur Monopoly.
Mordy Oberstein:
I could be making that up, by the way. Anyway, I ran another query, because it was like something was missing from the first one. Let's play a game: what's missing from the People Also Ask box? I ran another one, "SEO implications Google DOJ trial," and the four questions I got back from that People Also Ask were: why is Google being taken to court? What is the federal case against Google? What were the results of the Google trial? And what practices is Google being accused of? What's missing?
Crystal Carter:
Anything to do with SEO.
Mordy Oberstein:
Yes, nothing with SEO. And in the first one, user behavior metrics, DOJ trial, there was nothing about user behavior metrics.
Crystal Carter:
Right.
Mordy Oberstein:
Fascinating.
Crystal Carter:
Right?
Mordy Oberstein:
Because now we're entering the entity zone. Cue the Twilight Zone music. What I think is going on here is that you have a larger entity, the DOJ trial, and a very, very niche entity, in one case SEO and in the other case user behavior metrics. And Google's like, "Forget that sub-entity, that's too niche. Let's just go with what we think you're really here for, which is the main entity, the DOJ trial. Here are four questions about the DOJ trial."
Crystal Carter:
Right. The other thing is, sometimes I see this where they've got CEO instead of SEO, where they think you've misspelled it. I also wonder how much they know exactly about it. But what I found is that "why is Google in trouble with the US Department of Justice" and "why is it bad" are coming from Search Engine Land. Even though they're not giving you a direct reflection of your query in terms of keywords, they're giving you a source with an entity that they know to be related to SEO.
Mordy Oberstein:
Well, that's interesting. I wonder if we Googled it without the SEO or just Googled that question if you would get Search Engine Land showing up there.
Crystal Carter:
So I took out SEO, just "implications of Google DOJ trial," and then I've got Time showing up in the PAA, and still Search Engine Land is showing up in there. Which is interesting. But also, Google knows that I just searched that. That wasn't a fresh search, that was a follow-on search, so Google has some more data on what I've been searching.
Mordy Oberstein:
They are loving Search Engine Land for all these kinds of queries, it looks like.
Crystal Carter:
But I think also there's been a lot of SEOs who've been covering it, for instance, Covino.
Mordy Oberstein:
I know. It's one of the biggest places that it's been covered. But it is interesting, because I wonder what's going on in the background. What you're seeing on the SERP itself has no actual connection to SEO; you're not seeing the term SEO reflected in the four questions that Google's offering. But what's shown on the SERP is not the full knowledge graph. Just because someone, for example, doesn't have a knowledge panel doesn't mean they're not in the knowledge graph; it just means Google's not showing a knowledge panel for that person, let's say. It could be that in the background, Google knows there's an SEO subtopic or subset around the DOJ trial, and it's saying, "Search Engine Land is very much a related entity to the trial because of the whole SEO connection." So Google might not show that in the actual questions per se in the PAA box, but within the results it's showing, the reason Search Engine Land is doing so well within the PAA box for these kinds of queries could be the entity connections being made.
Crystal Carter:
And I think also, to come back to the user behavior conversation, SEOs have been very, very active, very, very fascinated, and very, very engaged with this particular trial, because there have been so many leaks and so much information about search that we've never seen before, directly from Google. And going back to user behavior metrics, they used to talk about nofollow links, including nofollow links on social, but Google has said, "We pay attention to social signals. If we're seeing that a piece of content is getting a lot of traction on social, then we will take that into account when we're thinking about whether or not a piece of content is good," because they also crawl and index social media accounts.
Mordy Oberstein:
For sure. Or crawling, for sure.
Crystal Carter:
So that goes back to the user behavior thing. It also goes back to what we were saying about the three pillars. According to the DOJ trial, one of them is what the web is saying about a document. Again, if they know that SEOs are talking about it and they know that Search Engine Land has been covering this a lot, then they'll use that when deciding whether or not to show it in the PAA.
Mordy Oberstein:
Now, if you want to keep up on what's going on with the DOJ trial, you might want to head over to Search Engine Land where you'll often find Barry Schwartz covering the SEO news. But not the DOJ stuff, that's really been Danny Goodwin. Maybe we'll have Danny Goodwin in the news this week and not Barry? Nah. Maybe we'll have Danny also possibly, but definitely Barry. So here's this week's Snappy SEO News.
Snappy News. Snappy News. Snappy News. Three articles for you this week, one from each of the major SEO news publications. This one from Search Engine Journal's Matt Southern: Google Circle to Search, the New AI-Powered Search Gesture for Android. So Google has launched, on certain Android devices, what I think is a very unique, very cool way of accessing search. I'll read what Matt wrote here: "A new gesture-based search method, Circle to Search. Google has developed a new feature for Android called Circle to Search that will change how people interact with content on their phones. With Circle to Search, users can," listen to this, "circle, highlight, or tap text, images, or videos within apps to instantly search for related information without switching between applications." So basically, you're on whatever app, say the Google app, and a picture shows up of someone wearing really cool boots that you want.
You circle the boots, and Google shows up in an overlay with places where you can buy those boots. It's very cool. It's only on certain Android devices. It's launching, they say, globally on January 31st on the newest Pixel 8 phones, the Pixel 8 Pro, and Samsung's recently released Galaxy S24 series devices. Now, what I think is interesting about this is something I've been talking about for a long time: the way we search, even with SERP features like knowledge panels, knowledge cards, People Also Ask, and whatever other search features, plus the 10 blue links, is not really, I feel, how people want to actually engage with search.
They really want to be able to explore, go down different rabbit holes, go down different pathways, come out of those rabbit holes and go down a different one really, really seamlessly. And things like this, things like the way Google has shown or demoed its Gemini product and how it shows custom results for different types of information queries points to that idea that people do want a more immersive way of searching. I think keeping your eye on these sort of trends is very important.
Next up, from he who is Barry Schwartz over at Search Engine Roundtable: Google Ranking Teetering In and Out Over Weekends, Bug or Edge of Quality. There's been chatter around the SEO industry that certain TLDs have been coming in and out of the rankings each and every weekend. So TLDs meaning dot-com, dot-org, but in this case your more obscure ones, like dot-media, dot-clinic, or whatever it is. Folks have seen in their Search Console data that their websites are losing rankings and traffic and impressions, whatever metric we want to look at, each and every weekend.
Google said there's no real bug going on here. However, as Barry points out in the article, it may have to do with the sites being what's called on the edge of indexing. That sounds so dramatic: I'm on the edge of indexing. Meaning they're on the edge of quality, where Google's not sure whether or not to keep the page in the index, and therefore in the rankings. The pattern does align with some volatility out there in the rankings that's being picked up by SEO weather tools.
So on the one hand you have a lot of SEOs saying, "Hey, it looks like there's some kind of bug going on here." Google's saying, "No, no, no. No bug here, but it could be that you just don't have the quality that you need." It is super, super weird. What would have to happen is you'd have to see if the same pattern is happening across all the other kinds of TLDs, your more traditional TLDs, your dot-orgs, dot-coms, dot-co-dot-uks, whatever it is, all your usual, typical TLDs. If the same thing is happening, then it's just happening across the web, and it's the edge of quality, the edge of indexing. If not, if it's just those more obscure TLDs, then it is some sort of issue. We don't know yet officially. So maybe weigh in out there on social media. You can weigh in on the forums Barry links to in the article on Search Engine Roundtable. Link to that in the show notes.
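The check described above, comparing weekend versus weekday performance per TLD, is easy to run yourself against your own data. Here's a small sketch using only the standard library. The rows, site names, and numbers are made up; in practice you'd feed it a (date, site, impressions) export from Search Console or your rank tracker.

```python
# Sketch: do weekend impression dips hit only obscure TLDs, or all of them?
# The sample rows below are invented; substitute a real Search Console export.
from datetime import date
from collections import defaultdict

rows = [
    (date(2024, 1, 19), "example.media", 900),  # Friday
    (date(2024, 1, 20), "example.media", 300),  # Saturday
    (date(2024, 1, 21), "example.media", 280),  # Sunday
    (date(2024, 1, 19), "example.com", 1000),
    (date(2024, 1, 20), "example.com", 950),
    (date(2024, 1, 21), "example.com", 940),
]

def tld(site: str) -> str:
    """Crude TLD extraction: everything after the last dot."""
    return site.rsplit(".", 1)[-1]

def weekend_ratio(rows):
    """Per TLD: average weekend impressions / average weekday impressions."""
    buckets = defaultdict(lambda: {"weekday": [], "weekend": []})
    for d, site, impressions in rows:
        key = "weekend" if d.weekday() >= 5 else "weekday"  # Sat=5, Sun=6
        buckets[tld(site)][key].append(impressions)
    return {
        t: (sum(b["weekend"]) / len(b["weekend"])) / (sum(b["weekday"]) / len(b["weekday"]))
        for t, b in buckets.items()
    }

ratios = weekend_ratio(rows)
# A ratio well below 1.0 for .media but near 1.0 for .com would match the
# "obscure TLDs dip on weekends" pattern Barry describes.
```

If the ratios sag across every TLD, the weekend pattern is web-wide; if only the obscure ones sag, that points at the bug-or-edge-of-quality question the article raises.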
Anyway, last up from, again, Barry Schwartz, but this time on Search Engine Land. Barry here, Barry there, Barry is everywhere. "New research study asks if Google Search is getting worse." The study came out of Germany, I believe, and took a look at review content, so product review content, best microwaves and reviews of the best microwaves, and whether or not, from Google to Bing to DuckDuckGo, search quality has improved. They scraped the results over the course of an entire year to see whether or not the results are quality, and whether they're overly SEO'd or not.
I found the results and the study overall interesting. A few takeaways from my end. They say that results are getting worse, but when they show the top spammy review websites, there are not very many in the list. They list four of them, I think, in their chart: two of them are Amazon, one of them is YouTube, and one of them is an actual review spam website. So I thought that was a little bit weird. They do say that it's a whack-a-mole kind of thing: when Google has these updates, it does do a good job of getting rid of this affiliate review spam content from the search results, but it comes back. I'm not so sure about that. Glenn Gabe has shown over time that a lot of these websites get killed off.
Glenn also pointed out, by the way, that the study was finished before a lot of the more recent updates had officially come out. One point that I thought was interesting was where they talk about the level of content quality in the top-ranking result pages for these product review queries. They say, by the way, that the quality of the content is not so great. I find that interesting. Now, the study conflates SEO with bad SEO. Meaning, there's SEO that's what we do, the legitimate, great, good stuff that helps websites target their content, helps websites be found, and helps websites speak to users and search engines in a, quote, unquote, better way, and then there's link spam, review spam, spam-spam-spam SEO.
So the study doesn't make that distinction, which irks me a little bit, but it does talk about the level of quality of the content at the top of the SERP not being so great, and I think that is an issue we should talk about and discuss. We have discussed it, so I'm not going to go into it at too great a length here. I talk about it in my article on the SEO Hub, where I talk about the future of content, and we also spoke a little bit about it on our recent podcast with Jason Dodge and Nigel Stevens about emerging content trends. I'll link to those articles and podcasts in the show notes.
But I do think that sometimes, in the SEO world, the gap between our idea of what good content is and what good content actually is is a problem. I do think there's also an incentive problem, which I speak about in the article and which we talked about on the podcast: is Google giving enough incentive for content writers and SEOs to abandon the typical content quality models and go to a little bit more of a nuanced model? What I mean is, let's say you're reviewing the best microwaves. SEOs will be like, "All right, first thing we do is write what is a microwave?" Maybe get rid of that and be a little bit more nuanced in the content.
I think it's a little bit of SEOs needing to adapt, and I think it's also a little bit of Google needing to show results that are not doing that a little bit more, to cue the content and SEO industries that, "Hey, you don't need to do that anymore." Because it's a little bit difficult to say, "Hey, let's not do that anymore," when we still see it in the results. Why would I stop doing that if Google is ranking it?
So it's up to everybody to do a little bit of a better job in offering a little bit better of a content experience, in my personal humble opinion. Now, I will address this here very quickly. There has been controversy around the search results being lower quality overall, which is why this article is getting so much coverage: it's part of a larger narrative. I will say, I think that we, and when I say we I mean the entire web, underestimate the paradigm shift that has happened with content. Again, I speak about this in my article, and we spoke of it on the recent podcast interview with Nigel and with Jason, but I do think we underestimate just how much the content sands have shifted and how difficult it is for a search engine like Google to adapt and realign with user expectations and with what the content picture out there actually is.
So I do think, yes, you're going to see some not-great results near the top of the SERP at certain times, or an over-reliance, we'll say, on forums or this or that, but I think it's all par for the course. It's not easy to adjust. It's not easy to realign results with what's happening out there on the web at the same time. And it will take Google a little bit of time to perhaps get it right, but I do think Google will fundamentally get it right. I'm a believer. I'm an optimist. As much as it may surprise you, I am an optimist. And with that, that is this week's Snappy News. Thanks again, Barry. We appreciate you. You're the man. You're the man, Barry.
Crystal Carter:
Barry's great, and Barry listens every week, and we really appreciate you listening every week. He's just tweeted us something that I haven't told Mordy about, and I'm curious as to exactly what he's asking.
Mordy Oberstein:
I saw that pop up on my phone while we were recording. I saw something like, "Oh yeah, Mordy Oberstein and Crystal on the web. So awkward."
Crystal Carter:
What are you talking about?
Mordy Oberstein:
Himself. He's talking about himself.
Crystal Carter:
We love you anyway, Barry. It's all good.
Mordy Oberstein:
But now we mentioned the L word. We're making him really feel awkward now.
Crystal Carter:
Oh, we adore you, Barry. We think you're so fantastic. You are-
Mordy Oberstein:
I do.
Crystal Carter:
... just an absolute gem and a gift.
Mordy Oberstein:
I have a little stuffed pillow of Barry at night with a string on it. It reads SEO news titles with grammatical errors in them to me. Anyway, that's trolling Barry just a bit on the official Wix SEO podcast, which is a whole new level of trolling, to be honest with you. By the way, if you're not following Barry Schwartz, make sure you follow Barry Schwartz on whatever social media platform you consume, @rustybrick on Twitter.
Crystal Carter:
Who isn't following Barry Schwartz? Follow Barry.
Mordy Oberstein:
I don't know. Sometimes I see Twitter people with a million followers and Barry only has 250,000. So those people.
Crystal Carter:
You should follow Barry.
Mordy Oberstein:
Obviously, but you know who else you should follow? This speaks to Barry, because Barry often quotes him. I'll give you a hint: he's often quoted by Barry, he has two Twitter accounts, and he loves Star Trek. He'll often go on Star Trek cruises.
Crystal Carter:
Oh yes, yes, I know exactly who you're talking about now. It is the one, the only Danny Sullivan.
Mordy Oberstein:
Danny Sullivan, what? We're not just recommending you follow Danny's personal account, which you should follow; we're also recommending you follow Danny's official account as Google's Search Liaison over at SearchLiaison, don't ask me how to spell liaison, over on Twitter. I do know how to spell liaison. It used to be in my title: L-I-A-I-S-O-N, liaison. I know French.
Crystal Carter:
Were you in the spelling bee?
Mordy Oberstein:
No, no. I know how to spell one complicated word. I'm the worst at spelling.
Crystal Carter:
I did a spelling bee, and I remember I got ukulele.
Mordy Oberstein:
That's a hard one.
Crystal Carter:
I was like, "That's Hawaiian." And they were like, "You still have to spell it. It's in the dictionary." And I was like, "Oh, no." So that didn't work out well for me.
Mordy Oberstein:
I've been to spelling bees, not for long though. "Mr. Oberstein, have a seat. You're embarrassing the entire school." Anyway, make sure you follow Danny because he's officially tweeting official stuff about official Google ongoing updates and all sorts of changes over on the SearchLiaison account. It's a super important account to follow.
Crystal Carter:
I think he spoke at Brighton in November, and it was fantastic, and he spent a lot of time trying to explain why he has to do so much explaining about search documentation and how they're really thinking about making things clear when they can. And they're really responsive, so he's really responsive to that stuff. If people message him and they're like, "This doesn't make sense," he tries to make it more clear and tries to answer questions as much as he can. Super nice guy and absolutely follow him for really, really interesting information about search.
Mordy Oberstein:
Your SEO life will live long and prosper if you follow Danny's account.
Crystal Carter:
Indeed.
Mordy Oberstein:
Indeed. You know what's lived long and prospered? This podcast episode.
Crystal Carter:
It did. It was good. I enjoyed it.
Mordy Oberstein:
That's a very Vulcan answer. "It was good. I enjoyed it." Green-blooded. I got my kids into Star Trek recently.
Crystal Carter:
Yeah?
Mordy Oberstein:
Yeah.
Crystal Carter:
Resistance is futile.
Mordy Oberstein:
Their future partners and spouses, whatever, will blame me. You have me to blame for the Star Trek.
Crystal Carter:
It's fine. They'll just meet a nice Trekkie. It'll be fine.
Mordy Oberstein:
I hope so, for their sake.
Crystal Carter:
Trekkies are nice people. It's all good.
Mordy Oberstein:
I died alone because my father introduced me to Star Trek.
Crystal Carter:
As long as they watch Galaxy Quest as well.
Mordy Oberstein:
Oh, great movie. Great movie. You know what else is going to be great? The next episode of the podcast. Thanks for joining us on the SERP's Up Podcast. Are you going to miss us? Not to worry, we're back next week with a new episode as we dive into structured data markup do's and don'ts. Look for it wherever you consume your podcasts or on the Wix SEO Learning Hub at wix.com/seo/learn. Looking to learn more about SEO? Check out all the great content and webinars on the Wix SEO Learning Hub at, you guessed it, wix.com/seo/learn. Don't forget to give us a review on iTunes or a rating on Spotify. Until next time, peace, love, and SEO.