Learning to use SEO data studies the right way
What should you know about an SEO study before making conclusions? Why is understanding the methodology of an SEO study important? How do you share complex data with clients and stakeholders?
Wix’s Mordy Oberstein and Crystal Carter, alongside SEO consultant Jess Joyce, untangle the web of data imperfections in SEO studies.
Bring a skeptical eye when reviewing an SEO study. A lot goes into creating a valid study with repeatable, statistically relevant results. Understanding the methodology and the limitations of an SEO study can help you understand what's really being presented.
Join us as we chew on some “crunchy” data perspectives in this episode of the SERP’s Up SEO Podcast.
Episode 112 | November 27, 2024 | 43 MIN
This week’s guests
Jess Joyce
Jess is a Toronto-based SEO (Search Engine Optimization) consultant and web developer who has worked with companies including Mashable, Fast Company, Honda, CIBC, Budweiser, and Pfizer, and with organizations ranging from startups to large agencies, optimizing the web for over 20 years.
Notes
Hosts, Guests, & Featured People:
Resources:
It's New: Daily SEO News Series
Almost Half of GSC Clicks Go to Hidden Terms - A Study by Ahrefs
Google’s Product Reviews Algorithm Update: Winners & Losers
Charting 10 Years of The Google Algorithm
News:
Google Search Console Finally Drops The Page Experience Report
Google site reputation abuse policy now includes first-party involvement or oversight of content
Google’s site reputation abuse policy is a band-aid for a bullet wound
Transcript
Mordy Oberstein:
It's the new wave of SEO podcasting. Welcome to SERP's Up. Aloha. Mahalo for joining this SERP's Up podcast. We're serving up some groovy new insights around what's happening in SEO. I'm Mordy Oberstein, the head of SEO brand here at Wix. And I'm joined by the very deeply thorough, data-driven, data dynamic, data diving head of SEO communications here at Wix, Crystal Carter, who's shaking her head like, what are you even saying at this point? I don't know.
Crystal Carter:
I don't know what he's talking about, man. Like data? Data shmada. I'm winging it, man.
Mordy Oberstein:
I'm just trying to tie the topic into your adjectives.
Crystal Carter:
I'm winging it. Aren't we all?
Mordy Oberstein:
Oh, I've been winging it for 40 years.
Crystal Carter:
Never did me any harm, as they say. No.
Mordy Oberstein:
Cholesterol is too high? Eat more of it. Data.
Crystal Carter:
No, he's not my favorite Next Gen character. You know what? I think over the years, I mean there's Picard obviously, but Riker's definitely higher up there, just because of the idea of Riker-ing over a chair, which I didn't realize until I saw the supercut and I was like, "This is amazing. How did I never see it?"
Mordy Oberstein:
When you're that tall, that's what you would do.
Crystal Carter:
And then once you see it, you can't unsee it. It's perfect. Although I think that the actor who plays Data has had some of the widest range, but then there was also... Is it Wil Wheaton or whatever?
Mordy Oberstein:
Yeah. Oh, now you'd open up a Pandora's box with Wesley Crusher. Okay.
Crystal Carter:
Right. Wesley Crusher, just his second act on the Big Bang Theory. I was not expecting that.
Mordy Oberstein:
Right.
Crystal Carter:
So there's that. But then I guess also Whoopi Goldberg. Whoopi Goldberg has had quite the recent career. So anyway, so yeah, there we go.
Mordy Oberstein:
Wait, how did we get from data into Star Trek?
Crystal Carter:
Data.
Mordy Oberstein:
Oh, Data. Oh, that went right over my head. Wow. Wow. I'm like, I'm all down to talk about Star Trek. I just didn't understand how we got here, Data. Oh my, wow. I really Worfed that.
Crystal Carter:
You know what we should do on this podcast? Make it so.
Mordy Oberstein:
I'm engaged. The SERP's Up podcast is brought to you by Wix Studio, where you can not only subscribe to our monthly newsletter Searchlight and check out our SEO course over on the Wix Studio SEO Learning Hub, but where you can also get tons of deep dives and deep takes across all of the content that the Wix Studio SEO Hub has to offer. Today, we take a hard look at what you should know about an SEO study before drawing any conclusions: why understanding the methodology of a study is huge, what the data mindset is and how to avoid its pitfalls, and why knowing who wrote the study, and why, brings authorship into focus. SEO consultant extraordinaire Jess Joyce will share how she shares complex data with clients and stakeholders.
Plus, we'll pick apart a few SEO studies to see what lessons we learn from them and how we approach them. And of course, we have your snappiest of SEO News and who you should be following on social media for more SEO awesomeness. So like Mike Tyson in a boxing ring, we've got something for you to chew on and it's crunchy, as we help you better understand the SEO data crunchers of the world on this 112th episode of SERP's Up.
Crystal Carter:
That's quite the intro. Quite the intro.
Mordy Oberstein:
Well, now that I did, I was thinking, what do I do with the data thing? Oh, you're chewing on numbers or crunching on numbers. Okay, I'll go Mike Tyson chewing on people's ears.
Crystal Carter:
Yep.
Mordy Oberstein:
I should have gone Data, Star Trek, and it just... whoosh.
Crystal Carter:
Hey, we're going to forge ahead just like Geordi. It'll be fine.
Right.
Mordy Oberstein:
Wow.
Crystal Carter:
Okay. So.
Mordy Oberstein:
I've got some cues for you, but anyway.
Crystal Carter:
Oh, I recognize that reference. Okay. Right. Okay. So resistance is futile when it comes to data studies. Basically, when we're working in SEO, we are inundated with data all the time, client data, market data, all sorts of stuff. But how do you sift through all of it? How do you make sense of everything? I know that this is something that Mordy has worked loads on and has written on extensively, so we're going to get into the discussion very quickly. But just as a primer, I tend to think of these studies in three different categories, and I think that they should really be assessed for their value in these different ways as well. So to my mind, SEO data studies tend to fall into a few categories. You have your case studies, which are looking at outcomes for a particular client or project or instance or something like that.
Then you have your data study that's looking at large scale data. So for instance, when you're looking at the impact of an algorithm over an entire set of keywords, et cetera, et cetera. And then you have your targeted data study, which is a little bit of a combination of the two. And with case studies, these tend to be centered on the client outcomes. And these mean that there's a particular set of circumstances. So if you're trying to find out whether or not this is valuable for your project and whether or not this is relevant to you, it's very important that you understand exactly what the parameters were of the case study, and they can be incredibly valuable. If you're on a similar vertical, and let's say you're working on, I don't know, a project and it's a retail client, and they sell the same kind of-
Mordy Oberstein:
Selling Star Trek T-shirts.
Crystal Carter:
Selling Star Trek T-shirts, right? For Comic-Con or something, right? So let's say that you're that product vertical and there's somebody else who did a case study for the same kind of industry, then you are very likely to take a lot of learnings from that, right? You're facing similar challenges and they can help you figure out how to boldly go where you want to go. Now, it is worth though, bringing in a little bit of skepticism when you're looking at case studies because people don't share everything. They can't. They can't, they couldn't possibly. And they don't share all of the things that go into it. So it could be that they did this project and it was all SEO and nothing else was involved. Or it could be that yes, there was SEO involved and of course it contributed to it, but there was also 20,000 pounds worth of paid put on it per day or something like that.
Or it could have been that there was also a digital PR campaign that they didn't mention. Or it could be that there were various other things behind it. It could be that somebody was on TV. I've seen incredible SEO lifts because somebody was on television and had a big PR spike. So that is something that's worth thinking about. Yes, there's probably some learnings that you can take from a case study, but don't always expect exactly the same results if you implement the same things that they're advocating for, because it's impossible to know everything that went into that particular instance. That is a very, very big, it depends in that particular case. And I can see you nodding your head, Mordy. I don't know if you want to jump in on-
Mordy Oberstein:
No, no, no, no. You're on a roll. Keep going, I'm with you. It's all hockey stick growth in my case study. Okay, okay.
Crystal Carter:
Right. Right. Right. And yeah, they can't show you all of the things. And sometimes one of the big things that can vary, particularly when I look at case studies, is the market. So sometimes in emerging markets you'll see people and they're like, yeah, we did this particular thing and we saw 7000% growth. And then if you tried to do the same thing in an established market like the US or the UK or France, Germany, et cetera, then you might have a challenge making the same kind of impact because there are more competitors in the mix. And so that can be a real challenge as well. It's not to undermine what the folks who are getting those gains have done. Well done, fantastic and well done. But it can sometimes be a challenge to do the same thing in a more established, more competitive market.
So that's something to consider with case studies. So pay attention to them, but don't necessarily expect the same results. Data studies tend to come from folks with large data sets to work from, and within the SEO space, a lot of times this comes from tools teams. So this will come from folks like Ahrefs, STAT, Moz, SE Ranking, Semrush, and lots of folks are doing great data studies here, and they provide a great amount of data for algorithm updates in particular, or when certain SERP features emerge, and they can say, this SERP feature is visible. So I've certainly paid a lot of attention to Dr. Pete, who'll be like, oh, featured snippets have gone up 15% or they've dropped 20%, or things like that. And they're able to give you a broad understanding of what's going on around the web. These are really, really interesting.
The challenge with these studies is that it's important to remember that there's limitations that are based on the tool. So sometimes tools providers will have a certain data set that they're looking at, or they will have certain features that they're able to see that maybe others aren't able to see, or maybe there's a certain feature that they don't see, or maybe there's certain markets that they're not looking at. So for instance, I spend a lot of time looking at Semrush. Semrush doesn't cover every market. It's a great, fantastic, wonderful tool, but it doesn't cover every market. So if they're less strong in your market, then it might be less easy to see what's going on there. And if they don't store data on a region that you're working in, for instance, then the data set might be slightly less relevant for you.
So it could still have an impact, but it might not be a like-for-like relevancy for you. And it doesn't mean that you should ignore the data. It's worth carrying out your own experiments to see if you can replicate the findings. Absolutely. But you need to think about assessing your own data set and coming up with your own set of benchmarks for when things are changing or where there's visibility changes or which competitors tend to have the same impact as you and those sorts of things. It's worth looking at that when you're comparing your own results with large-scale data studies.
Mordy Oberstein:
Just a peek behind the scenes on the big data studies: they are very difficult, and you need to pay attention to what they write about the methodology of the study. Usually they cover it at the beginning, sometimes at the end, sometimes not at all. If it's not there at all, run. What kind of keywords did they use? For example, forget a study for a second. You have the SEO weather tools. So you have MozCast, and let's say you have the Semrush Sensor, and they'll often show different things, and it's because they're built on different types of keywords. Moz uses high search volume keywords on purpose because they think that that simulates being more like your average user on the average day. Whereas Semrush is looking at normalized keywords to make it look like, okay, this is the web as a whole.
Both make sense. They're just both very different. So you have to look at how many keywords, what types of keywords. It's very easy to get lost in the verbiage too. Oh, we looked at 5 million data points. What you might have looked at was not 5 million keywords. You might have looked at one keyword 5 million times, over 5 million days, which is a different kind of data than looking at a hundred thousand keywords over a hundred thousand days.
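To make Mordy's point concrete, here's a purely illustrative sketch. The function name and the numbers are hypothetical, not drawn from any real tool's data set; the point is only that the same "data points" headline can describe very differently shaped samples:

```python
# Illustrative sketch (hypothetical numbers, not from any real study):
# "5 million data points" can mean broad coverage or a very narrow sample.

def sample_shape(num_keywords: int, observations_per_keyword: int) -> dict:
    """Summarize a tracking sample by breadth (keywords) vs. depth (observations)."""
    return {
        "keywords": num_keywords,
        "observations_per_keyword": observations_per_keyword,
        "total_data_points": num_keywords * observations_per_keyword,
    }

# One keyword checked 5 million times...
narrow = sample_shape(num_keywords=1, observations_per_keyword=5_000_000)
# ...vs. 100,000 keywords checked 50 times each: same headline number,
# very different coverage of the SERP landscape.
broad = sample_shape(num_keywords=100_000, observations_per_keyword=50)

assert narrow["total_data_points"] == broad["total_data_points"] == 5_000_000
```

Both samples report "5 million data points," which is why a methodology section that only states the total, without the breakdown, tells you very little.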
Crystal Carter:
Right.
Mordy Oberstein:
It's very confusing. They're not always very transparent about how they built it, sometimes on purpose. And also it's easy to get this data wrong. I cannot tell you, and big kudos to the Semrush team for this, how many times I've done a data pull with the Semrush team, looked at the data, and thought, something's not right here. And I don't even know what. I just know something isn't right. Let's dig in and see: did something go wrong somewhere, or are we actually comfortable with the results? And it's like, oh, actually what we ended up doing was there was an over-emphasis on this type of keyword, or an over-whatever on this. Let's try to adjust the data set and run it again and see. But you need to have a team that's willing to do that and is dedicated to doing that. And somebody like me, or let's say like Patrick Stox or like Dr. Pete, who are working at these tools, who are-
Crystal Carter:
Marcus Tober.
Mordy Oberstein:
SEO people. Marcus Tober, et cetera totally. Who are big data SEO people who could say, no, no, no, I know just looking at this, something's off somewhere. Let's re-figure this thing out.
Crystal Carter:
Yeah, yeah, yeah, yeah, yeah. You definitely have to dig into those. And I think the tricky thing with the big data studies is that very often what people want to hear about is the brand new thing. They want to hear about the brand new thing. It was-
Mordy Oberstein:
It was the hardest thing to pull.
Crystal Carter:
AI overviews. Exactly, exactly. Right? Because you don't have historic data and you've just set up this tool to identify it on the SERP. So for instance, AI overviews is something that's on there, and everybody wants to know what's going on with AI overviews, and there's various different studies that keep coming out and there's various tools that are trying to measure it, but they're just tricky to keep up with and tricky to quantify.
Mordy Oberstein:
We spoke about this on a previous podcast episode, and it was in our recent Searchlight newsletter, make sure you subscribe, but in the newsletter we talked about, yeah, the studies are great, but if you start thinking into it a little bit more, all these different tool providers are trying to catch up with each other. They did a study, we need to do a study. They did a study, we need to do a study. And that's helpful for them, and it's good for everybody. You get data, they get clicks and likes and signups and whatever, but there also is a little bit of a need for deeper data, and don't think that with that data you have the full story. You don't yet. No one has it yet.
Crystal Carter:
Right, right, right, right, right. Entirely. And the methodology point that you brought up is super, super important as well. Understanding, we looked at SERP results in the United States, we looked at the health vertical, we looked at newspaper sites or things like that. Because like you said, sometimes they're like, oh, we looked at 700,000 keywords. Well, for some websites that's just their ranking keywords. I literally had just logged into Semrush and I don't know, sometimes they have URLs that are there, and the first one I clicked, wildlifeorg.com, they rank for 300,000 keywords. So if you said, we assess 300,000 keywords, you're still only assessing one website.
Mordy Oberstein:
You could assess 5 million keywords. But if they're all the same kind of keyword, how to buy socks, how to buy pants, how to buy a shirt, how to buy glasses, how to buy earrings, how to buy a tree, how to buy a light, how to buy a computer. I'm just looking into the things I'm looking at right now.
Crystal Carter:
You're just naming things in the room.
Mordy Oberstein:
It's all going to be the same kind of SERP with the same kind of results.
Crystal Carter:
Right, right, right. So it's really important to think about that. And that's a really important thing when you're thinking about any kind of information. In academic studies, they always have a section that talks about methodology.
Mordy Oberstein:
And it's called, by the way, limitations. That's the name of it. Because every methodology has limitations.
Crystal Carter:
Right. And you have your variables. It's all part of the scientific method. All these studies are using scientific method. So you have your hypothesis, you have your variables, you have your controls, you have your criteria, you have all of that sort of stuff. And it's really important that you're looking at that.
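As an editorial aside: the hypothesis-and-variables framing Crystal describes can be made concrete with a tiny, generic significance check. This is not the methodology of any study mentioned in the episode; the function and the CTR numbers below are made up for illustration, using a standard two-proportion z-test:

```python
import math

def two_proportion_z(clicks_a: int, impressions_a: int,
                     clicks_b: int, impressions_b: int) -> float:
    """Z-statistic comparing two click-through rates, e.g. control vs. variant
    title tags. Uses the pooled-proportion standard error."""
    p_a = clicks_a / impressions_a
    p_b = clicks_b / impressions_b
    p_pool = (clicks_a + clicks_b) / (impressions_a + impressions_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / impressions_a + 1 / impressions_b))
    return (p_b - p_a) / se

# Hypothetical experiment: 4.0% CTR vs. 4.8% CTR on 10,000 impressions each.
z = two_proportion_z(400, 10_000, 480, 10_000)
# |z| > 1.96 means the difference is unlikely to be chance at the 5% level.
assert abs(z) > 1.96
```

The point is not this particular test; it's that a study worth trusting states its hypothesis, its variables, and a criterion like this up front, so readers can check whether a reported "win" clears it.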
Mordy Oberstein:
$1,000 to the SEO tool that, instead of calling their section methodology, calls it what the academics call it: limitations.
Crystal Carter:
Okay. Okay.
Mordy Oberstein:
It's not a good marketing thing. You would never do that. That's why I'm offering a thousand dollars. I'm not saying you should do that, by the way.
Crystal Carter:
I mean, you might have somebody calling you up for a check. I'm just saying.
Mordy Oberstein:
Okay. Yeah.
Crystal Carter:
Okay. All right, so now moving on to my favorite type of study, and I think a lot of SEOs really like these ones. So I'm calling these targeted studies. This is my verbiage, my terminology, but these are my favorite. This is essentially where people are combining the best of a case study setup with the best of a big data study, and they scale them up. So Cyrus Shepard is known for this type of research. He did a big, big case study on title tags. They looked at lots of title tags and variations on them, and they tested all of these title tags and said, "According to the data that we have, in 60% of cases, this is the outcome. In 70% of cases, this is the outcome." That sort of thing.
Lily Ray is known for her studies where she monitors whole cohorts of websites that are affected by the same algorithm updates. So she's been studying winners and losers, and she'll look at those sorts of things, like individual websites, and then she'll do a deep dive into single websites as well. This is something that you look at as well, Mordy, when you're doing your studies. So you'll say, this is the big picture, and this is a case study based on this one particular site. Glenn Gabe is somebody who's great for this as well, where he will share lots of information: the helpful content update seems to be having this impact at scale, and here are a few sites that I've been watching that are in that situation.
The thing that's tricky about some of these ones, and I think they're great, I think they're wonderful, is that if you are monitoring some sites and you're not inside the site, then sometimes you're flying a little bit blind in terms of the study.
Mordy Oberstein:
It's anecdotal or can be anecdotal.
Crystal Carter:
Well, so I had a situation where, at MozCon, I was talking about forums, and all of the data that I could see from all the tools was telling me that there was definitely a trend: around September 2023, loads of forum sites saw a big uptick in terms of traffic. Reddit saw a huge hockey stick, and loads of legacy forums saw this as well. And the numbers I found for one particular website said that it was up 7,000% according to one SEO tool. At MozCon, one of the guys from the website was like, I'm not sure what they're measuring, because I am only seeing two or three times as much traffic, not 7,000% more traffic. We did see an uptick, a big uptick, but not exactly those numbers. So that I think is really, really interesting.
So sometimes with some of the tools, again, depending on the tools that people are using for some of these targeted studies, sometimes there's limitations with regards to that. So when people share information, it's very important that they share what kind of tools they're using. So for instance, Lily Ray very often uses Sistrix, and she'll say, this is what Sistrix is showing. And Glenn Gabe always shares which tool he's using, and sometimes he'll show multiple tools as well. And that's really important to see, because you might have a case where somebody's like, well, that's not what's reflected in my traffic. I know you're looking at this website, but our website actually isn't seeing that. If you actually end up speaking to the SEO, as I did in that case, then you can say, oh, well, the tool is showing me this. Based on the evidence I've got, this is what I've got. But yeah, those are the challenges with those, but they're very, very useful for keeping track of trends.
Mordy Oberstein:
I love those. I love those a lot. And what's great about what Lily and what Glenn do, and Cyrus, is that they're not just tracking five or six sites. Glenn has, I don't know, an index of 400 sites. I'm making up the number. I don't know the exact number he has, but it's a lot. It's a lot of sites. So it's not just a small sample size. But yeah, what they're seeing and what you're seeing could be very, very different. If I'm looking at a pattern within a core update kind of thing, and I find 10 examples of this, and I write up a post highlighting two or three of those examples, the first thing I try to say is: this is just what I'm seeing. This is one sliver of the entire internet, a sliver of a sliver of the internet. You could be seeing something completely different, and you probably are, because there's probably a million things happening at one time.
Crystal Carter:
All the time. I think it's really, really valuable. I think particularly for folks who are trying to do the day-to-day, it can be really, really valuable to have the combination because I think that if you have somebody just giving you loads of data, sometimes I'm like, well, now what? What do I do with all this information? So even if you do have somebody who's like, this is the sliver of what I've seen. Based on this data, what do we actually do now?
Mordy Oberstein:
Exactly.
Crystal Carter:
It's really, really-
Mordy Oberstein:
That's a whole other can of worms.
Crystal Carter:
Yes. It's really, really useful. But I think also data just becomes increasingly important for SEOs, and understanding these studies and being able to report and tell stories with them is really, really important. I think Lily Ray is somebody who in particular, she recently changed her job title. She's head of strategy and research, not just strategy, but research. So that's recognition for all of the studies that she's been doing. And I've seen lots more SEOs who have switched either to being data folks or have added that as part of their role, or are managing teams there as well. So being able to understand these data studies, being able to write your own, being able to assess data coherently is super, super important for SEOs today.
Mordy Oberstein:
I just want to end off with a quick point that Barry Adams once made: having some data is better than having no data. None of the data is perfect. All of it points to a certain direction. It's all, this is the trend, or this is where things are heading. It's very directional. Even setting aside the whole correlation-doesn't-mean-causation discussion, it's still not perfect. It's still not going to be a hundred percent. So take it all in the right framework and the right frame of mind.
Crystal Carter:
Absolutely.
Mordy Oberstein:
And the second thing is data studies are like your family, everyone's got an agenda. So keep in mind who's writing it and why, and that's all I'm going to say.
Crystal Carter:
Okay.
Mordy Oberstein:
Okay, moving on from that spicy little bit. Sometimes we need to share complex data with our own audiences, with our own clients, or with our own teams. How do you go about doing that? Well, here's Jess Joyce on what you should consider when translating complex data into content your audience, whoever that might be, internal, external, whatever, can actually understand.
Jess Joyce:
So you guys have asked what things you should consider when translating complex data into content your audience can understand, whether that be an article or sharing data in a report, et cetera, et cetera, et cetera, and how I go about that. So I think it's the same approach that I take to every piece of content, which is to try to land people with context. I think context is everything that search engines are missing, because when you're Googling something, it can mean so many different things, which I know they've added updates for, which is great, but it can still mean so many other things, and other contexts go into things. So landing things with context really helps. We're starting to see that being a noob about it is probably the wrong approach to take. Answering the "what is" content out of the gate is just not a great idea, but ensuring that you know your audience, so doing your audience checking and such, is really good to do.
Knowing your ICP and all those marketing terms, so that you know who you're talking to and can use the same terminology that they're using, is really helpful. And then also making sure that you use examples in that, whether that be quotes or visualizations or something to engage the user throughout it, and making sure you highlight the key points that you want to nail down with whatever piece of content or data that you're writing, because especially if it's more complex, I don't want to read through 300 pages of a complex data report when I can just get the TL;DR. We tell folks all the time to write out key summaries, the key takeaways, that top stuff that's really helpful. NerdWallet content does that really well. So I think translating that into content that anyone can understand, with summaries or top tips, is really helpful for everyone and not just search engines.
Mordy Oberstein:
Thank you so much, Jess. Make sure to give Jess a follow over on social media; the link to her LinkedIn profile is in the show notes. It's interesting to point out that change in what people are expecting when you're transmitting that content and how you start off. I agree the starting point is a little bit higher level than what it used to be, so do that, which is good for complex stuff.
Crystal Carter:
Yeah, I think also, presumably, you can just link people. If they're not sure, you can just say, you don't know what this is, here's a link. But I think also we have tools that are able to cover that. So I get complicated documents from my insurance company about various different things. If I don't know what a word means, then I just go, "ChatGPT, what does that mean?" And it goes, "It means this." And I go, "Thank you." And then you can move forward.
But I think that you want to spend the bulk of the communication dissecting the complicated ideas. That's what you need to do. And I don't think you necessarily need to explain yourself. You can qualify it, you can certainly explain all your acronyms and things like that, but you can jump right in. And I think that it's really important and really valuable to have that as a skill, because there have been tons of times when I've talked to clients and you give them data and they're like, what is this? And if you're not able to dissect it, if you're not able to explain yourself, then it can become a big challenge for folks.
Mordy Oberstein:
Huge challenge. Now talk is cheap, and I mentioned this on social media before, it's why I talk a lot. If talk was expensive, I wouldn't do as much talking as I currently do.
Crystal Carter:
What?
Mordy Oberstein:
That said, why don't we have a look at some actual SEO studies and talk about why we thought they were effective, so that you can approach SEO studies in a brand new light, with a segment called It's New.
Speaker 4:
Oh, I'm sorry.
Mordy Oberstein:
It's new because it's a new way for you to approach SEO studies.
Crystal Carter:
Yes.
Mordy Oberstein:
Yes. Not because it's a new thing on the SERP, which is usually why we have the segment.
Crystal Carter:
Or a new study.
Mordy Oberstein:
Or a new study. Yeah, these are all old, really old.
Crystal Carter:
This one's from 2022.
Mordy Oberstein:
I went with what I knew was good.
Crystal Carter:
Okay.
Mordy Oberstein:
I mean I could have gone the other direction. Here's a bunch of bad ones, let's bash them. That would've been good radio.
Crystal Carter:
No, no.
Mordy Oberstein:
But I knew you wouldn't appreciate that.
Crystal Carter:
All right, what you got?
Mordy Oberstein:
What do I got? I got one from 2022, but it's a famous SEO study from Patrick Stox over at Ahrefs: Almost Half of Google Search Console Clicks Go to Hidden Terms, a study by Ahrefs. And it's one of these seminal SEO studies where we learn that, wait a second, the GSC data isn't as transparent as you actually think, there's not as much data there for you as you actually think, and it's not all there like you might actually think. And it really changed the minds, or the mindset, of some SEOs' take towards Google Search Console. And the things I liked about it: it gave you practical tips. It wasn't just like, here's big data for big data's sake. There was a purpose. We rely on Search Console to X extent; maybe that's not actually the case. And the author is super trustworthy. Patrick is super-duper... If Patrick says it, as it pertains to SEO, take it. I don't know about Patrick talking about comedy, whether he's an expert there, he's very funny though, but for SEO data, take it, because he's great.
Crystal Carter:
So here's the thing. I'm looking at this study and there's a couple of great trust signals and great high-value signals that they have straight out the gate. So first of all, they have Patrick Stox, who's an established SEO author, and it's got his bio, and he's done some great stuff and he's worked for the team for ages. Then it says it's been reviewed by two other people who are also solid, fantastic folks. So in the first paragraph they explain why you should check out this study, because most SEOs consider Google Search Console their source of truth. What if I told you Google Search Console doesn't tell you all of the keywords that you get traffic from? In fact, the tool doesn't show you a term for nearly half the clicks. That, now I'm hooked. I'm interested. They've established the issue that they're trying to address.
So that's their hypothesis. And then they say these instances of hidden terms account for 46% of all clicks in our study, and our study includes one month of data across 146,741 websites and nearly 9 billion total clicks. So they've told you the data, they've told you the websites, they've told you the scale of it, and it's all completely above the fold. Also, it says it was last updated in November 2022, which is really useful as well. In internet time, November '22 for Google Search Console is probably fine; that probably hasn't changed much between now and then, but it's worth double-checking if there have been any updates around this conversation, if you're interested in this particular topic. It also tells you the parameters of it: basically, this is a combination study.
This is one of those targeted studies where basically they've got big scale data and they've got account data as well, and then you can take from that what you want, but they laid out all their cards on the table. Those are great trust signals. And as somebody with a brain, I can go through and I can go, okay, based on all of this, I will now assess your study and see how relevant this is to my particular situation or how interesting this is, and I can compare and contrast what I see in my Google Search Console with what you've got here.
Mordy Oberstein:
Exactly. So great study. Okay, number two comes from some dude named Mordy, and I chose this one not because I'm narcissistic but because I was looking for three quick examples so I can get on with my day.
Crystal Carter:
Lazy
Mordy Oberstein:
Time efficient.
Crystal Carter:
Don't work harder, work smarter.
Mordy Oberstein:
Right. Hey, that's our newsletter segment: market smarter, not harder. Google's product reviews algorithm update winners and losers. This comes back from 2021; it's an article from SEJ. The reason why I included this one is not because it's a great study and it's fantastic. It's that there's an entire section, "My approach to analyzing the update," where I was very specific: I found five or six websites that showed this pattern. I was very hesitant to write about it to begin with, because that's not a lot of sites. But it was interesting, and I literally wrote somewhere in there, to be clear, what I'm about to share is based on my qualitative analysis. It is not a definitive study based on deep data.
Crystal Carter:
Right.
Mordy Oberstein:
Take it for what it is. So that transparency was great. Even the authorship is so-so.
Crystal Carter:
That Mordy guy, I don't know-
Mordy Oberstein:
I don't know what he is thinking.
Crystal Carter:
Oh my gosh. But yeah, no, I think that that's super important though because it is useful and I think that the other thing is that even though it's a few websites that you're looking at, you've gone into them extensively and that amount of research is something that is valuable and the way that you compare them also gives people the ability to compare themselves to that set of criteria.
Mordy Oberstein:
So it's comparing different websites to each other and their rankings after an update, and why maybe one went up and one went down. And showing, by the way, the correlation between the two websites, the inverse ranking patterns. Like, okay, you could see in some of the screenshots in the article, they're just opposite patterns. Clearly Google was like, we like this one, don't like that one. Get this one in there, get rid of that one. And I spent hours and hours reading through the websites' content to do this. It wasn't like a five-minute thing. Anyway, enough about me and on to Moz and Dr. Pete, who, how could we have a conversation about studies without this doozy of a study from, I don't know what you... 2024. It was updated in 2024. Look at that. Something current and relevant, which is unique for me in general in life.
Anyway, from Dr. Pete, charting 10 years of the Google algorithm, and the reason why I like this study is the absolutely phenomenal use of visuals in a deep data study. The visual became a hook. That's how good the visual was.
Crystal Carter:
Yes.
Mordy Oberstein:
Also, he tried to contextualize the data, why he thinks it was happening, and he's clear it's his opinion, but I always appreciate that. It's what you were saying before: why do I care about this? He's trying to show what this might actually mean for you or mean for looking at the SERP. And it's not very long. Not every deep dive needs to be incredibly long. He did it really quickly.
Crystal Carter:
This again comes back to the tools providers or whoever's supplying the data, what they have at their disposal. So I asked Dr. Pete for some algo information on this, and they've had MozCast for years. He's able to pull on 10 years of data that they've been tracking consistently, and so he's able to pull on the data set. And since he's been doing it consistently, he's able to build on that very quickly. So I asked him for some data and he was like, "Yeah." And just pulled the full data set for me very kindly because he already has that stuff.
And I think that people who've been studying something, just like you would have a professor at a university who has been studying, I don't know, the Romans or something for 15, 20, 30 years, if you have somebody who's been studying the algorithm and using a similar data set or similar tools for 10 years, this ain't their first rodeo. They can see what's new, they can see what's different, they can see when the algo's spicier than it normally is. And Dr. Pete's been somebody who's been doing that for all this time, and he's got the chops to write something shorter because he has all of the data behind it.
Mordy Oberstein:
Exactly. He's got that whole, especially around this topic, that intuition that you need in order to be like, okay, I see the data. I know what this means.
Crystal Carter:
Right. Right, right, right.
Mordy Oberstein:
Here we go. Let's roll.
Crystal Carter:
Yeah, I think it's totally good. I also want to shout out, I know this isn't on the sheet or whatever, but speaking of people who've been studying things for a while or particular conversations, I want to shout out Rand Fishkin's zero-click studies.
Mordy Oberstein:
Yeah, that's got a lot of play through the years.
Crystal Carter:
Right. So he's been studying these for ages. 2019, he published a study about zero-click searches. 2021, he published another study about zero-click searches. 2024, he recently published another study about zero-click searches. And this is something that the SEO space has been fascinated by the entire time, and I think that within his studies, the most recent one, he has a whole section, again, this is super valuable, but he has a whole section that says caveats and data limitations.
He also cites his previous two studies, so if you've been following this conversation, or if you're new to the conversation, you can go back and read the other articles and see how this has evolved over time. But yeah, he has a whole section about caveats and limitations. He's got five different caveats to explain all of the different information, all of the other things that he talks about, like the clickstream panelists might have changed, or that there's minimal coverage of mobile iOS devices currently available within the panel. He explains all of those things, and I think that while some people might think, when they're presenting data to clients or whatever, that talking about the caveats might make their study seem less robust, I think that declaring your caveats makes it seem more robust.
Mordy Oberstein:
A thousand percent. A gazillion percent. Yeah. You know who's a study unto himself?
Crystal Carter:
Who's that?
Mordy Oberstein:
Barry. Barry's a study unto himself, and to the psychology of the human mind, really.
Crystal Carter:
I think so. How can he be more efficient? That's what he studies. Efficiency.
Mordy Oberstein:
I think he should really put his cell phone in his shirt pocket up top. That way, when he has to answer it, it's right there. I don't know why he puts it in his pants pocket like every other person does. I wouldn't.
Crystal Carter:
I wouldn't.
Mordy Oberstein:
Yeah. So here's this week's Snappy News. Snappy news, snappy news, snappy news. Three for you this week. First up, from Barry Schwartz over at SERoundtable.com: Google Search Console finally drops the page experience report. So the page experience report in Search Console is no longer. You can still track the HTTPS report and the Core Web Vitals report separately, fine, but the actual page experience report is no more. Per Google, they wrote, "We're removing the page experience report in Search Console. That page summarized data from the Core Web Vitals and HTTPS reports, which will continue to be available as they are. We decided to remove this page to reduce unnecessary clutter in Search Console." Yada, yada, yada, yada, yada.
Barry wrote, and I could not have said it better myself, "I wish they would just say the reason was because SEOs obsessed about it as a ranking factor when in reality it makes almost zero impact on ranking. But no, Google said it was to reduce unnecessary clutter." Barry nails it on the head. There's nothing else I need to add to that. He just completely nailed it. Wow. Barry going full "scorched-earth" there. Okay. Also from Barry Schwartz, this time on Search Engine Land: Google site reputation abuse policy now includes first-party involvement or oversight of content. So we've seen this before, where Google has gone after third-party hosted content that really has nothing to do with the website, like the coupon sites, yada, yada, yada.
To quote It's New: yada, yada, yada. Now Google's saying, yeah, even if you are involved as the website itself, let's say you hire a third party to manage the content on the website, and everything that they're doing is completely under your umbrella, you understand it, you approve it, you sign off on everything, it's still a problem. Google also said this is not algorithmic. They did say that there are algorithmic things in place to make sure that you stay in your lane and focus on what you focus on. Chris Long has a really good article, linked to it also on Search Engine Land, where he's basically saying the fact that this is manual and Google can't get rid of this stuff algorithmically points out a really big flaw in how Google's able to handle this content as a whole. We've talked about that on It's New; a few times we've griped about that. I completely agree with Chris.
On to Semrush.com. This one from yours truly. I did a study exploring URL volatility in Google's AI overviews. It's a little data study. We looked at 1,500 keywords that produced an AI overview for at least 20 out of the 31 days across October, and then pulled out the data on how consistent the URLs inside the AI overviews were. The point being, hey, you might want to target an AI overview, but if your URLs can't be consistently shown there, is it really worth it or not?
Now, obviously, quote, unquote, "it all depends on the keyword." And it does. There's a lot of stats in here; I'll give you some of the top stats that I saw. One, out of all 1,500 keywords that I looked at, 0% of them showed the same URLs every single day across the data period. Zero. Not one. 91% of the URLs in the study were at some point removed from the AI overview. That's a lot. We only saw 43% of them return to the AI overview within the data period. They may have returned later, I don't know, but in the data period, only 43% of them ever returned, so more than half left the AI overview and never came back.
Also, the average number of consecutive days that a URL showed inside the AI overview was 3.3 on mobile and 3.9, so like four days, on desktop. There is a lot of fluctuation inside the AI overviews in terms of URLs. Does that mean your URLs are not showing consistently day in, day out? No. I don't know about your specific URLs; you need to track it. I'm just pointing out the average across a relatively limited dataset. But it does point out the need, in my opinion, like I wrote at the end of the article, to really track the heck out of your AI overviews and the URL placement. Because it could be, yeah, you're getting in there, but then tomorrow you're not, and the day after you're not, and then you're back again, and how worthwhile is the focus on that?
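[Editor's note: if you want to run this kind of volatility math on your own AI overview tracking data, the per-URL tallies Mordy describes (share of URLs dropped, share that returned, longest consecutive-day streak) could be sketched roughly like this. The data and function names below are purely illustrative, not the actual study code.]

```python
# Illustrative sketch of AI overview URL-volatility metrics.
# `appearances` maps each URL to a per-day presence list
# (True = the URL appeared in the AI overview that day).

def longest_streak(days):
    """Longest run of consecutive days a URL appeared."""
    best = run = 0
    for present in days:
        run = run + 1 if present else 0
        best = max(best, run)
    return best

def url_volatility(appearances):
    removed = returned = 0
    for days in appearances.values():
        # "Removed": present at some point, then absent on a later day.
        first = days.index(True) if True in days else None
        if first is not None and False in days[first:]:
            removed += 1
            gap = days.index(False, first)
            # "Returned": shows up again after being dropped.
            if True in days[gap:]:
                returned += 1
    streaks = [longest_streak(d) for d in appearances.values()]
    return {
        "pct_removed": 100 * removed / len(appearances),
        # Guard against dividing by zero when nothing was removed.
        "pct_returned": 100 * returned / max(removed, 1),
        "avg_longest_streak": sum(streaks) / len(streaks),
    }

# Tiny demo with made-up tracking data over four days.
demo = {
    "example.com/a": [True, True, False, True],   # dropped, came back
    "example.com/b": [True, True, True, True],    # stable all period
    "example.com/c": [True, False, False, False], # dropped, never returned
}
print(url_volatility(demo))
```

The real study would of course run this over daily SERP snapshots per keyword, and split the results by mobile versus desktop, but the counting logic is the same.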
By the way, one more thing we really do need to look at is the CTR. Yes, AI overviews might show up; we have data on when, how often, blah, blah, blah. URL consistency? Now we have data on that. But what's the CTR? Are people actually engaging with the AI overviews and clicking? Don't know yet. So there's a lot of unknowns in this AI overview picture, but now we have a little bit less unknown with my latest data study. Pat on the back to me, yada, yada, yada. That's this week's Snappy News. I can't stop doing the yada, yada, yada thing from Barry. Sorry, Barry. I was stealing your thunder. By the way, for the record, Barry says he does keep his cell phone in his shirt pocket so that it's easier to pull out.
Crystal Carter:
Yeah, but your hands are at your hips.
Mordy Oberstein:
The whole thing is awkward. Let's say, I don't know, you're leaning over to flush the toilet. Doesn't it fall out into the toilet? I'm going to ask him that.
Crystal Carter:
You ask that.
Mordy Oberstein:
Yeah. I'm going to write it on my list of things to ask Barry.
Crystal Carter:
Okay.
Mordy Oberstein:
Yep. Does your phone fall out in the toilet? If yes, I'm never borrowing it. Not that I ever borrow his phone.
Crystal Carter:
There's some studies on that.
Mordy Oberstein:
There probably are studies on that.
Crystal Carter:
There are. There's studies on how clean people's phones are. I'm just saying. Don't look.
Mordy Oberstein:
Now, if you're looking for a great collection of information and SEO studies and all sorts of things, Backlinko has some really great information for you, and Leigh McKenzie, who's driving the organic content over at Backlinko, would be a great person for you to follow. He's always sharing roundups of all sorts of information around SEO on LinkedIn, so give Leigh over at Backlinko, a Semrush company, a follow.
Crystal Carter:
Yeah, do that.
Mordy Oberstein:
Yeah. He also listed this podcast as one of the best podcasts ever.
Crystal Carter:
He's got great taste.
Mordy Oberstein:
Yeah, great taste. This wasn't quid pro quo, by the way. We didn't say, oh, we'll feature him. It was organic. We didn't even know.
Crystal Carter:
Hey, and I tell you what, if you also think that we're one of the best podcasts, leave us a review. Just saying. That's some great data. That's some great data for us and for everybody who's using Spotify or Apple or any of your other favorite podcast channels.
Mordy Oberstein:
That is quite logical.
Crystal Carter:
Thank you.
Mordy Oberstein:
Fascinating.
Crystal Carter:
I appreciate it.
Mordy Oberstein:
Yes. I'm raising that one eyebrow to be the other Star Trek nerd data person guy.
Crystal Carter:
Yes.
Mordy Oberstein:
I've killed the moment.
Crystal Carter:
I have a lot of Star Trek references. I'm sorry.
Mordy Oberstein:
All right. In that case, thanks for joining us on the SERP's Up podcast. Oh, you're going to miss us? Don't worry, we're back next week with a new episode where we dive into who and when to hire for your digital marketing team. Look for it wherever you consume your podcasts, or on the Wix Studio SEO Learning Hub over at wix.com/SEO/learn. Looking to learn more about SEO? Check out all the great content and webinars on the Wix Studio Learning Hub at, you guessed it, wix.com/SEO/learn. Don't forget to give us a review on iTunes or a rating on Spotify. Until next time, peace, love, and SEO.