
Putting SEO to the test

Who is SEO testing for? When is it worthwhile to run an SEO test? And if you don’t have a huge amount of data, is SEO testing still a viable option?

Subjecting your website's SEO to testing can be extremely insightful. It can also be extremely complex.

Will Critchlow, CEO and founder of SearchPilot, joins Wix’s own Mordy Oberstein and Crystal Carter to explore how you should conduct SEO tests on your website.

Get insights into the guidelines for when it’s appropriate to run an SEO test and when the tactic may not be worth your while. Go beyond the SEO testing process by understanding how to analyze your findings and apply them to your SEO strategy.

Keep your eyes on your own paper as this week we discuss the ins and outs of SEO testing here on the SERP’s Up SEO Podcast!

Episode 65 | December 6, 2023 | 48 MIN

This week’s guests

Will Critchlow

Will Critchlow is CEO of SearchPilot, a company that spun out of his previous business Distilled, which was acquired by Brainlabs in early 2020. SearchPilot is an enterprise SEO A/B testing platform that proves the value of SEO for the world’s biggest websites by empowering them to make agile changes and test their impact.

Transcript

Mordy Oberstein:

It's the new wave of SEO podcasting. Welcome to SERP's Up. Aloha, mahalo for joining the SERP's Up Podcast. We're pushing out some groovy new insights around what's happening in SEO. I'm the Head of SEO Branding here at Wix. I'm joined by our Head of SEO Communications, the truly tried and tested Crystal Carter.

Crystal Carter:

That was a shorter intro than normal. Are you testing out something different?

Mordy Oberstein:

Because I didn't know how to... The tested part is a play on today's topic, but I didn't know how to tie that into a longer...

Crystal Carter:

So you're experimenting, are you?

Mordy Oberstein:

Yeah, maybe testing the... Yeah. If you're listening, which one do you like better, when I do a pun tied into Crystal's title or I just go full on, "The incredible, the fabulous, the amazing, the unparalleled Crystal Carter"?

Crystal Carter:

I don't know where you get all of the adjectives from. I don't know where they all come from. Literally, I ran out of adjectives years ago. I just do a fire emoji, whatever it is. Fire emoji.

Mordy Oberstein:

I officially have diarrhea of the mouth. So it helps.

Crystal Carter:

That's a clinical condition. I hope it's not. I hope you're okay.

Mordy Oberstein:

You can talk to my wife, ask her. The SERP's Up Podcast is brought to you by Wix, where you can not only subscribe to our monthly SEO newsletter, Searchlight, over at wix.com/seo/learn/newsletter, but where you can also create custom widgets that you can use across all the sites that you manage for your clients, or just some of the sites or many of the sites that you manage for your clients. Really, what I'm saying is it's up to you where you want to replicate those custom widgets that you create. And I'm saying you can do it with Wix Studio. Perhaps, by the way, you can take two versions of that widget and test each one, as today's topic is SEO testing.

That's right. We are the Doc Browns of the digital world. And unlike fourth graders everywhere, in SEO, we love tests and testing. We're diving into what you should or shouldn't run with an SEO test, when you should or shouldn't run an SEO test, and how you run an SEO test if you need a gargantuan amount of data to draw any conclusions. To help us prep for this SEO test in the sky, CEO and founder of SearchPilot, Will Critchlow is set to join us in just a few minutes.

We'll also give you a great and classic run-through of a tool to help you run some SEO research that kind of touches upon SEO testing in a unique way. And of course, we have your snappiest of SEO news and who you should be following on social media for more SEO awesomeness. So close your books and keep your eyes on your own damn papers as episode number 65 of the SERP's Up Podcast puts your conceptions of SEO experimentation to the test. By the way, when I say, "Eyes on your own damn paper," I say that with full conviction as a former teacher, "Eyes on your own damn paper."

Crystal Carter:

I think that's a really important thing to think about. But although it's strange though, because at school, they're always like, "No, no, no, your own thing, your own thing." But then when you get into the world of work, they're like, "You have to work together." And I'm like, "No, it's mine. It's mine. I know this and they don't. And I should get a gold star." But that's not how life is. You have to share.

Mordy Oberstein:

I used to always like making jokes that went a little bit, just a little bit over the kids' heads. That's my way of being aggressive. It was a fifth grade, sixth grade class in Baltimore City, and every year, the whole state makes you do this whole ridiculous set of testing. It's a whole day of inspectors coming around to see who's doing what and make sure all the rules are being followed. Some kid had his notebook on his desk. I'm like, "Dude, you can't have a notebook on your desk." He's like, "What notebook? Where did that come from?" I'm like, "Oh, it must be the Immaculate Notebook." Anyway, anyway, as the idea of testing children brings up a lot of anxiety for me and the children, we would like to welcome to the show, Will Critchlow. Hey Will, how are you?

Will Critchlow:

Hey, good to join you. Thanks for having me on the show.

Crystal Carter:

So absolutely honored. I think when we think of SEO testing, you're the first person I think of-

Will Critchlow:

Good. Good.

Crystal Carter:

... if I'm completely honest. And so yeah, it's a real honor to have you and to have you go through your methods, and talk about how you can get some good results and set some good hypotheses, and set out to get something that's actually actionable out of a test.

Will Critchlow:

Looking forward to it. Yeah.

Crystal Carter:

Yeah. Thank you so much.

Mordy Oberstein:

So great. Before we get going, give us a quick plug for what you do.

Will Critchlow:

Yeah. So as you said, I'm founder and CEO at SearchPilot. You can find us at searchpilot.com or at SearchPilot on the social channels. So we focus really at the enterprise end of the market. We make software that helps really big websites test for SEO. And so our customers tend to be in travel, real estate, jobs, E-commerce, those kinds of verticals where you have really huge websites. And our software helps them run SEO tests just like the ones that we're going to talk about today. And out of the back of that, we obviously get a load of learnings, things that we discover, and we try and give some of that back to the community and publish a bunch of tests and test results on our website at searchpilot.com.

Crystal Carter:

I would highly recommend anybody who's interested in this to visit the SearchPilot case studies section. They've got it broken down into a lot of different things. So they've got tests that were inconclusive, negative, winning, things like that, and they've always got a control and a variant, and they really break down a lot of good examples of how you can test and things you can test. One of the ones that's currently at the top is: how valuable is unique content for SEO? And testing that is interesting. Will, I can see your eyes lighting up on that. I don't know if you want to jump in on that one.

Will Critchlow:

Well, the whole area of the site, you're absolutely right. So we basically try to produce the stuff that I wish had existed as content when I was a consultant. And so it all feeds into, we have an email newsletter you can sign up for on the case study section and we just send these out every couple of weeks. And they're all coming from these tests that we've run with our customers. And I find it fascinating. I find it fascinating to test myself and figure out if I can work out what's going to have gone on. And yes, specifically things like that one where you're saying we all have the best practices in mind, we all know what Google says, we all know what the soundbites are. And many of us can give conference presentations about this stuff. But when you try and quantify it, it gets really tough to actually say, "Is it worth investing the money that it's going to take to upgrade this particular piece of content or this particular section of the page?"

Mordy Oberstein:

So maybe that's a good place to start, sort of loop in everybody who's listening to this podcast, what is an SEO test exactly and what are you testing?

Will Critchlow:

Yeah. Great question. So most people will be familiar with conversion rate testing or user experience testing, the kind of thing that you'd run typically with a JavaScript plugin on your website, where you show different versions of a page to different website visitors. And then you see whether the group who saw version A converts better or the group who saw version B converts better. And that's been around for well over a decade now and folks are generally familiar with that kind of technology. We're trying to bring that same level of rigor and statistical approach to the SEO world, which as we all know, we've all been in it for a while, has a history of being less than scientific in some areas. So the kinds of tests that we are talking about when I say SEO tests, we're trying to run these controlled scientific experiments.

And unlike a CRO test where you show two versions of the same page to half your users at a time, the way an SEO test works, because you kind of have to think of Googlebot as one visitor, is we take a large site section, so a whole bunch of pages that are in what we call a site section, a group of pages with a similar page template. So think for example, product listing pages on an E-commerce website, and we make changes to some of those pages and not to others. So we have a control group of pages and a variant group of pages. And all users, including Googlebot, see the same thing, so there's no cloaking, no duplication, all that kind of stuff.

And then by looking at the relative performance of the different groups of pages, you can run a bunch of statistics and figure out a confidence interval on the impact of the specific change that you made. And the crucial thing here is that that then allows us to account for all of the confounding variables. So whether it's seasonality, other changes to the website, Google algorithm updates, things your competitors are doing, all these things that mean your analytics moves around all over the place, we're controlling for all of that. And we're saying, "Given all of the stuff that's going on in the outside world, here's the impact of this specific, very small, very specific change to those pages."
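
To make the mechanics concrete, here is a minimal sketch in Python of the control-as-counterfactual idea Will describes. This is not SearchPilot's actual methodology; the page groups, session numbers, and the simple bootstrap confidence interval are all illustrative assumptions.

```python
import random

def estimate_uplift(control, variant, n_boot=2000, seed=42):
    """Estimate test uplift with a bootstrap confidence interval.

    `control` and `variant` are lists of (before, after) organic-session
    totals per page. The control group's before->after growth serves as
    the counterfactual: what the variant pages would likely have done if
    nothing had changed.
    """
    rng = random.Random(seed)

    def uplift(ctrl, var):
        ctrl_growth = sum(a for _, a in ctrl) / sum(b for b, _ in ctrl)
        var_growth = sum(a for _, a in var) / sum(b for b, _ in var)
        return var_growth / ctrl_growth - 1  # change relative to counterfactual

    point = uplift(control, variant)
    boots = sorted(
        uplift(rng.choices(control, k=len(control)),
               rng.choices(variant, k=len(variant)))
        for _ in range(n_boot)
    )
    low, high = boots[int(0.025 * n_boot)], boots[int(0.975 * n_boot)]
    return point, (low, high)

# Hypothetical data: (sessions before launch, sessions after launch) per page.
control_pages = [(120, 122), (90, 93), (200, 204), (150, 151)]
variant_pages = [(110, 118), (95, 104), (180, 192), (140, 150)]

est, (lo, hi) = estimate_uplift(control_pages, variant_pages)
print(f"Estimated uplift: {est:+.1%} (95% CI {lo:+.1%} to {hi:+.1%})")
```

Because seasonality and algorithm updates hit control and variant pages alike, they largely drop out of the ratio; only the change you shipped should separate the two groups.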

Crystal Carter:

And I think just to break down a couple of those things, so you mentioned a few things. So just to catch up anybody who's new to these ideas, you said showing one set of information to one group and another set to another group; that's referred to as an A/B test, if anybody wants to look up more on that. You also talked about controls and variants, and I think that that's a really important thing to dive into a little bit, about how you identify the controls. So if you were to think of, I don't know, I'm trying to think of something like a medicine, for instance. Normally, if they're testing a new medicine, they'll have a control group which gets a placebo, which is essentially a sugar pill. So it changes nothing. So essentially on a website, you're doing something similar, where you have one set of pages where you haven't changed anything.

Will Critchlow:

That's right. So yeah, the control pages are exactly like that. They're the ones where nothing has changed. And so that doesn't of course, mean that traffic stays completely flat on those pages. Right? It can increase or decrease for a million other reasons. But the fact that we know that we didn't do anything to those pages means that we can take that data, apply it to our analysis of what happened to the pages that did change, and we can tease out what happened because we did something versus what happened just because the world changed.

Crystal Carter:

And you're doing those things at the same time. So let's say over the month of July, you had the blog, right? You had one set of posts and you're like, "We're not going to change these blogs at all. We're going to change these blogs, so that one is the control group, one is the other, and it happens at the same time." That's just to catch a few people up. And presumably, you also have to tell everyone involved in the team not to touch anything on those pages, including devs.

Will Critchlow:

So actually, there is a lot of coordination and communication needed, which we can kind of get into. But actually, it is okay for there to be site-wide changes to the site in the meantime, as this stuff is happening. Because that all gets taken care of by the control and variant split: those changes either happen to every page across the control group and the variant group, and therefore they cancel out like seasonality or algorithm updates or whatever else, or they happen randomly. It's like this page gets it, that page doesn't. But those pages, some of them will be in the control group and some of them will be in the variant group.

So where we are coming from is, as I say, we're working on these really big websites and that's where you can get into this really detail-level statistical analysis. Some of our tests, we're calling a statistically confident result of plus 2% of organic traffic, which just disappears into the noise on a small business website, or a typical blog or whatever else. And really, what I'm hoping, and we'll get deeper into this, what I'm hoping is that the work that we are doing on these big websites, by publishing some of that analysis and by deconstructing it and doing the thinking about it, we get to help even those small business owners or those folks who are working on smaller websites. And that's kind of, as I say, I wish that had existed when I was a consultant, kind of paying it forward into that part of the industry as well.

Crystal Carter:

Well, I think it's so valuable because even 2%, and I know that if you're starting with a small blog, let's say you've got... I don't know, you've got a thousand views a month or something like that and you think 2%, the thing about SEO is it compounds. And I think the other thing is once you learn how to test a hypothesis in SEO, then you can grow really quickly. So it might be that you learn how to test that, and then the next time, you can test another thing, and then you're adding and adding and adding.

Because I think so often, people will just do something and have no idea what impact they expect it to have, have no idea whether it had any impact. And I think even with small teams, one of the things I've seen, and this is why I wanted to pull up the control thing, is that sometimes they'll be like, "Oh, but we did a PPC change," or "Oh, but we did this, that, and the other change." And I think sometimes with SEO, particularly if you're helping someone with their SEO, you need to be able to evidence that you did the work. Do your tests help with that process?

Will Critchlow:

So definitely, they do on the kind of sites that we're working with where we get to... Because we're controlling out all that other stuff, we get to say, notwithstanding all these other effects, "Here's the impact of this change." And without going deep into our specialism too much, you get some really quite cool dashboards where you have a list of SEO initiatives and each one has an associated uplift or decrease and a degree of statistical confidence attached to it. And that's the kind of thing that really helps with broader communication with the business. I mentioned earlier, the lack of scientific rigor is one problem in SEO. I think another one is too many of us historically have not been great at talking business. I'm guilty of this. I love geeking out about robots.txt or whatever it might be, hreflang, but the business doesn't speak that language. That's not what the CMO, CFO, CEO, whoever else cares about. And we're trying to bridge that gap and help connect those metrics that the business cares about to the effort and the work that folks are putting in as well.

Mordy Oberstein:

What's a case, off the top of your head, where you really saw that come through? You run an SEO test, and then you run through the whole thing and the outcome was, "Hey look, if we did this, your bottom line's going to increase by whatever percent."

Will Critchlow:

So actually, I don't think it's specifically the case study that you were mentioning, Crystal, but we had a very similar one testing content quality. We've had a few really fascinating ones that have led to businesses making specific investments in different areas or not investing in different areas. Because the big thing about this is it's very contextual. So it depends on your website, your industry, your place in the market, what your competitors are doing. But I remember one case where the team we were working with built a business case to invest in a whole load of high quality freelance writing... Basically, bring in a bunch of editors and writers and overhaul a whole site section's worth of content.

And they did that with an initial test on, I think it was like a hundred pieces of content, proved the substantial uplift that came from it, quantified it, turned that into business metrics and said, "Look, the cost of doing this is X thousand dollars. The benefit is a bigger number. Should we go for it?" And that definitely helped with that kind of sign off. But the great thing about testing is you see the inverse where we've definitely had cases where folks are like, "Should we do this, run the numbers?" No, probably not. Actually, time and effort and money is better spent elsewhere.

Crystal Carter:

Right. Right. And running the test can save you so much of that hassle of going all the way down that road, and then people are like, "We did all that stuff and how come nothing happened?" And I think the other thing that's really important is that if that happens too many times, you lose some momentum with regards to SEO buy-in and SEO investment and things. So it's really important to-

Will Critchlow:

Yeah, for sure. So you definitely see it with the executives. Executives get bored of just hearing these pie-in-the-sky numbers that maybe are going to materialize year after year. We also see it in trust between SEO teams and product and engineering teams as well, because a lot of SEO requests are not very exciting engineering requests. It's like, "Hey guys, we want you to change the titles again." And they're like, "I mean, I was hoping to do some machine learning."

And there's a disconnect sometimes between the things that the SEO team wants the engineering product team to work on and what's exciting and interesting for an engineering product team to work on. And one of the things that we found is by running these tests, we're able to reduce the number of requests that go to engineering because we're now only asking them to do the good stuff, which might only be 20% of the work that we previously would've asked for to get all of the benefit. And crucially, each ticket that's raised has an uplift associated with it. And so even if it's not the most exciting day at work, it's a rewarding day at work because they know that they've added to the bottom line and-

Crystal Carter:

Right. And you can get a gold star.

Will Critchlow:

A gold star, a raise, a bonus, whatever it might be. Yeah.

Crystal Carter:

Right. Right. I think also, when you're working with devs, engineering teams, et cetera, it's always great to tell them about those uplifts like, "Hey, remember those title tags we asked you for?"

Will Critchlow:

Your work did this. Yeah, absolutely. And everybody wants to know their work is meaningful and that it makes a difference to the organization that they're working for. And part of our purpose at SearchPilot is the SEO-centric piece. We want to prove the value of SEO for the world's biggest websites, is how we phrase it. But we have another piece, which is about creating great jobs. And that is almost like a layered approach where we're saying, "First of all, we want to have good jobs at SearchPilot," in like a direct sphere of influence.

But the next layer out is we want to make the jobs better for the folks that we're interacting with, whether it's our customers, whether it's their colleagues and their stakeholders, whether it's all these different things, and we see that coming together. I love it. We occasionally share in our Slack when one of our customers posts a job ad that says, "We are hiring for an SEO manager," and there's a line item that says, "You're going to get to run SEO tests." And we're like, "Yes." That means that folks are going to want to work at this company because they're going to get to work with SearchPilot.

Crystal Carter:

That's awesome.

Will Critchlow:

If those dots join up, that's what still gets me out of bed, whatever, 18 years into my SEO career now.

Mordy Oberstein:

Shifting gears just a little bit, if you're, let's say, the average site, not a huge enterprise site, and you're looking at, let's say, the content on the SearchPilot blog, you're trying to take away some lessons and you're saying, "You know what? Maybe I should run some tests." Can they take away lessons from what you're putting out in terms of the content, and can they run some sort of equivalent test even though, statistically speaking, they do not have the amount of data that you're working with?

Will Critchlow:

For sure. So I've been in this boat. Yeah, I've consulted with small organizations. I mean, SearchPilot, our own website in fact, is the kind of site you're talking about. We are not in our own target market. We unfortunately can't really run these statistically significant SEO tests on our own site. So I feel you. For the folks who are listening who are in that situation, I have a couple of recommendations. I think the biggest lesson that I've applied in those situations from the work that we are doing at scale is, first of all, do read that content. We are putting out these test results that are tested on very large websites. And the fact that they won't apply exactly perfectly in your niche, in your industry, on your site, doesn't mean that they're not directionally useful. And in particular, consuming that content tells you: look, if we're consistently seeing that these kinds of tests produce very small uplifts or typically no big impact, that's probably not going to be a massive impact on your site either.

So it's a source of inspiration, a source of ideas, source of direction. The second piece I would say is there are things you can do that fall short of a full scientific double-blind controlled, whatever gold standard test that are still valuable. The first of those goes back to what you were saying Crystal, about communication, that a big part of it is communication and documentation. So one of the great things that we have is that every time we roll out a test, we know exactly when it started, we know exactly what we changed, we have that record of... And that could just be an Excel file, right? It does not have to be some kind of sophisticated platform, but knowing what you changed when, why you changed it, what your hypothesis was, why you thought this was a good idea, even just having that list of stuff you did and when and why, that's a huge part of building that business case when the boss comes back or the client comes back and says, "What have you been up to and what was the benefit of that?"

It also lets you then do basically the next best thing to the kind of testing that we're talking about, which is before and after testing, which is to say you can at least look at how things have gone since you made the change, compared to how they were prior to making that change. I was going to say it's the thing I'm most sad about in the GA3 to GA4 transition, but let's probably not go there, but the loss of the annotations is killing me.
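
For listeners who want to try the before-and-after approach Will describes, here is a minimal sketch, assuming you keep a change log with the date of each change and can export daily organic sessions keyed by date. The function name, window size, and data shape are illustrative assumptions.

```python
from datetime import date, timedelta

def before_after_change(daily_sessions, change_date, window_days=28):
    """Naive before/after comparison around a logged change date.

    `daily_sessions` maps date -> organic sessions. This deliberately
    ignores confounders (seasonality, algorithm updates, holidays),
    which is exactly the weakness flagged above: treat the result as
    a hint, not proof.
    """
    def window(direction):
        days = (change_date + direction * timedelta(days=i)
                for i in range(1, window_days + 1))
        return [daily_sessions[d] for d in days if d in daily_sessions]

    before, after = window(-1), window(+1)
    return (sum(after) / len(after)) / (sum(before) / len(before)) - 1

# Hypothetical usage, with a change log entry "title rewrite, 2023-10-01":
# sessions = {date(2023, 9, 1): 34, ...}  # exported from your analytics tool
# print(f"{before_after_change(sessions, date(2023, 10, 1)):+.1%} vs. prior 28 days")
```

As Will notes below, a swing of a couple of percent in this output is indistinguishable from noise on a small site; it's the 20% moves you can trust.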

Crystal Carter:

It's so simple.

Will Critchlow:

It's so sad. I mean, there's so much that's so sad, but that's so sad. But literally, you can just do the before and after. And you just have to be aware that that can easily be confounded by outside effects. So like you were saying, Crystal, if somebody releases a major update to your website or you change other things in your advertising or your strategy or whatever, that's going to screw with your data.

And I have a presentation, a conference deck that I run through sometimes, which highlights this where I show the before and after data, but we managed to launch right over the time that a Google algorithm update happened. And then two weeks later, it was Thanksgiving or whatever, there was some kind of holiday, and so the traffic goes all over the place. And you're like, you just can't expect to detect a couple of percentage points. You can tell if it's up 20%. You can tell if it's down 20%, but you're not going to be able to tell that kind of fine-grained data.

But my argument is that's okay, because if you are in that spot you were talking about earlier, you're getting a thousand organic visits a month to your blog, an extra two visits a month, that's not what you're getting out of bed for. You're probably better off putting that effort into writing a new blog post or connecting with some folks and doing some outreach, or whatever it might be, doing the thing, rather than doing the micro-level optimizations that can make all the difference in the kind of hyper-competitive big website space.

Mordy Oberstein:

If you are in that position and you do want to start doing some testing, what might be a good way to say, "Okay, this is not worth my time. I shouldn't bother testing it." Or, you know what, this scenario might be worthwhile, to try something and see whether or not it worked. What are some, I guess, guidelines around when it might be worth it?

Will Critchlow:

So I've got some guidelines. I've also got a really specific example of a time when I did this. So as I mentioned, on searchpilot.com, our website isn't big enough to run our own kinds of tests. So what I did for ours was, on those case studies that we were talking about earlier, I literally downloaded a CSV file of the titles and meta descriptions on however many it was at the time, 100, 200, that kind of order of magnitude, and did a manual rewrite of titles and meta descriptions. And I think that's worked out well for us. Basically, we'd got in the habit of being quite introspective in how we described those things and I wanted to make them a bit more problem-centric, figuring that folks would be going on Google searching things like, should I put the brand at the beginning or end of my title tags?

And so I was like, our title should answer that question or should at least tell you that the answer to that question is in this post. So I just did that kind of quite opinionated, not very scientific rewrite, kept track of when I did it and could see the... Can't quantify exactly, but could see that it was performing better afterwards versus before.

So my general rule of thumb is, on small sites in particular, the place I would start is things that overlap between ranking signals and display in the search results. So the title is the place that I would start. Because if you remember before machine learning, our paid search colleagues, almost their whole job was optimizing the advert to get more people to click on it. It was like tons of people's jobs in the industry. But on the SEO side, I feel like folks have massively under-invested in click-through rate. Even if you don't rank better, even if you stay ranking in the position you are already in, you can double, triple your click-through rate by writing a better, more compelling whatever. And the great thing about the title is you might even rank better as well. And so that's where I did start on SearchPilot and that's where I'd recommend folks start if they've got a smaller website.
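
As a starting point for the title and click-through-rate angle Will recommends, here is a minimal sketch that aggregates CTR per page from a Search Console performance export, run once on a window before a title rewrite and once after. The file names and column headers ("page", "clicks", "impressions") are assumptions; match them to your actual export.

```python
import csv
from collections import defaultdict

def ctr_by_page(gsc_csv_path):
    """Aggregate clicks and impressions per page from a Search Console
    performance export, returning click-through rate per page."""
    totals = defaultdict(lambda: [0, 0])  # page -> [clicks, impressions]
    with open(gsc_csv_path, newline="") as f:
        for row in csv.DictReader(f):
            totals[row["page"]][0] += int(row["clicks"])
            totals[row["page"]][1] += int(row["impressions"])
    return {page: clicks / impressions
            for page, (clicks, impressions) in totals.items() if impressions}

# Compare the same pages across two export windows (before/after the rewrite):
# before = ctr_by_page("gsc_before_rewrite.csv")
# after = ctr_by_page("gsc_after_rewrite.csv")
# for page in sorted(set(before) & set(after)):
#     print(page, f"{before[page]:.1%} -> {after[page]:.1%}")
```

One caveat: if average position moved between the two windows, the CTR change is confounded with ranking, so check position stayed roughly steady before crediting the new titles.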

Crystal Carter:

Yeah. There's some great case studies. Adding a question mark into a title tag is a case study on SearchPilot and it's like a 5% uplift on that.

Will Critchlow:

We've got all kinds of little data points. I also would say, if you're in the small business space, trust your intuition as well. Just take kind of a customer-centric approach. What's your target market? What are they actually looking for? Are you actually solving their problem or answering their question? When we get deep into this in our enterprise space, we end up running what we call full funnel tests, which is where we get to test the SEO performance and conversion rate of a specific change.

So you can say this was positive for one, but negative for the other. How are we going to mitigate that problem or whatever? We can get deeper into that. Point being, there can be a tension. And in particular, there can be a tension... If you're a B2B business, SearchPilot is a great example, we actually want to appear for things that in-house SEOs at very large websites are searching for. If we write something that gets a whole load of, I don't know, analysts or agencies searching for it, then our traffic stats might increase, but our business metrics aren't going to. And so again, we are running scientific tests on this stuff. If you are at the smaller size, that's where you have to trust the fact that you are the business, you know the business, you know your market, you know the audience, and you have to hone in on those things that are going to appeal to them and are going to work for them.

Crystal Carter:

And I think also certainly, what I've found, I wonder if you'd also recommend this for smaller businesses, one of the benefits of working at an agency or working with an agency is that agencies are able to test techniques across multiple websites at the same time. And so if you have a small business, and I know a lot of people who recommend having your main site and also having a test site, that's a similar thing or maybe something else where you can sort of test something in a sort of low, low-risk area. Is that something that you would also recommend for smaller businesses, that they also maybe have, particularly if they're SEO nerds, although I guess not everybody who's a business owner is an SEO nerd. Shame, but...

Will Critchlow:

It's a shame. One day. So where I would come from on that is test sites are useful to the SEO nerds. They're not particularly useful to the business. That's my kind of controversial, maybe controversial position. If I'm in the business owner's shoes, I'm saying, "Let's not waste our time messing around over there. Let's write some more blog posts or produce some more content," or whatever it might be. Where the test bed site is useful, in my opinion, is answering questions for nerds like us. And you can get some great conference presentations out of it. We've probably all seen Olly Mason's stuff, for example, where he'll do the kind of destructive testing. What happens if I make my sitemap take 30 seconds to load, or whatever? And you're like, you don't want to do that on your money maker.

Crystal Carter:

No.

Will Critchlow:

So he's doing that kind of testing where he's saying, "I want to really figure out what happens with Google at these edge cases. If I do something that seems odd or that seems weird, can I poke at the algorithm in specific ways?" And I get answers out of that. I'm glad he's doing that work. I'm also glad none of my customers are doing it on their stuff. And so that is valuable work. And so that's the third kind of testing. So we have the kind of stuff that we are doing, the big kind of controlled experiment. We have the before and after type analysis that we talked about, which doesn't quite reach the threshold for scientific testing, but it's the best you can do in certain situations. And the third kind, which is kind of more like what Olly is often doing, is what I call kind of laboratory testing.

So you're taking a completely artificial situation and you're doing something weird and you're saying, "When I poke Google like this, what happens?" And that's like, I noindexed every page on my site, or I tried to use temporary redirects across the whole thing just to see what would happen and... Yeah, but we learn.

So like I said, I'm glad other people do those experiments and we can learn from them. In particular, that's where folks run these experiments with made up words. Right? So you make up some words, put them out on domains that have never been indexed before and see, does the link order matter, right? So on one website, you put the links in one order on the menu, and then on the other website, you put them in a different order in the menu, and you see which one ranks for this made up nonsense word that nobody's ever searched for before. That's closer to doing science, right? You're kind of saying, Google is this black box and we really want to understand the minutiae of what's going on. It's not always actionable. It doesn't always turn into, "I should do this on my business website and make more money." So it's interesting.

Mordy Oberstein:

I mean, I think if you're a business owner, that's kind of the mentality you should be walking away with after listening to this podcast. You should always be looking at things, testing things, seeing what's working, what's not working, changing things around when it does make sense. Don't just change things around for the sake of changing things around. Don't change big things around for the sake of changing big things around because that will not end up well for you. But you do want to be in a learning mode and seeing what works, what doesn't work. Sometimes correlation doesn't equal causation, sometimes it does. I'll quote Barry Adams. "Having some data is better than having no data." So work with what you got.

Will Critchlow:

I think that's particularly true if, in the small business world, you do your best to understand what's going on and you use your instinct and your understanding of the industry and the market, along with that data. And I think where it's dangerous, if we're quoting random quotes at each other, there's that saying about how some people use statistics like a drunk uses a lamppost: for support, not illumination. And if you go looking for the answer that you think you want to find-

Mordy Oberstein:

You will find it.

Will Critchlow:

Especially in a small amount of data, you can find anything you want. You just look on different days, look on different... Keep digging into the data until you find that one thing that backs up the thing you hoped or you wished was true. And my experience at the small business end is that when stuff is good, it moves the needle enough that you can spot it. Right? You start getting more inquiries, you start getting better inquiries. Folks will call you up and say, "I was reading this thing on your website." And it's that qualitative, subjective feedback that layers on top of the stats in the small business world, for me.

Mordy Oberstein:

Yeah, for sure. If people wanted to learn from you, where could they find you?

Will Critchlow:

So I'm @willcritchlow on all the social places. I have historically been most active on Twitter, but it's kind of a weird place right now, isn't it?

Crystal Carter:

It's somewhere with GA3.

Will Critchlow:

So sad. So sad. I am on Threads. I'm probably most active on a combination of Twitter and LinkedIn still. So yeah, I haven't quite figured out my long-term home, but I'm definitely more about the writing and reading than the taking photos and looking at photos. So I'm more likely to be found on Twitter and LinkedIn.

Crystal Carter:

You're not on Instagram?

Will Critchlow:

I have an Instagram account. I think I have 10 photos on there. The last one was like two years ago. They're mainly pictures of my dog and that kind of stuff these days. Yeah, nobody's getting any SEO insight out of my Instagram.

Mordy Oberstein:

Okay. But they did see what you ate for dinner though.

Will Critchlow:

Three years ago. Sure.

Mordy Oberstein:

That works. Well, thank you so much. Folks, check out the show notes for all the links to Will's various profiles, probably not his Instagram profile. I don't think I'm going to throw that in there. If you really want me to, Will, I will put in the Instagram. No?

Will Critchlow:

I think it's probably for the best that we leave folks to find that themselves if they really want to.

Mordy Oberstein:

There you go. Thank you so much.

Will Critchlow:

Thank you. It's been great. Thanks for having me on.

Mordy Oberstein:

Now, speaking of testing things, one tool that can help you do more than testing, but I also think it can help you do testing, is the Wayback Machine. What is the Wayback Machine? Let's go back and find out as we go tool time on the SERP's Up Podcast. The Wayback Machine is the coolest name for anything on the planet. But you will not find it at waybackmachine.com. It's part of the Internet Archive. I want you to be aware of this. If you're looking for the Wayback Machine, I recommend you just Google it, as opposed to typing in waybackmachine.com. I don't know what that will give you.

Crystal Carter:

No. So the actual address is archive.org/web and that leads to-

Mordy Oberstein:

Yeah, Google Wayback Machine. You'll be good to go. That's how I find it.

Crystal Carter:

Basically, what the Wayback Machine is, and the first time I found out about this, I was like, "Oh my gosh."

Mordy Oberstein:

Right?

Crystal Carter:

Wow.

Mordy Oberstein:

It's amazing. I just showed this to somebody last week and they were, "What? What is this voodoo? What is this magic that you have?"

Crystal Carter:

Right. Right. So for those of y'all who are civilians in the SEO community, and for those of y'all who are deep, deep in the weeds of SEO, the Wayback Machine is this fantastic, incredible archive of over 840 billion web pages saved over time. Some of these for years, and years, and years, and years. If you have a new client who's had a domain for a while, you can go into the Wayback Machine and you can see old versions of their website. You can see old versions of certain pages on the website. And I've had this before where I had a client who came to me and we were doing a migration, but they had recently had another migration a couple of years earlier. And two years isn't that long in terms of a website migration. So I was able to see what their website was before, not the one that I was looking at now, but the one before that, and that can give you some insights.

What's really great and the reason why it's a great tool for testing and for understanding testing stuff is that if you're looking at the analytics and you're seeing that they used to have this great conversion rate on a particular page or they used to have tons of traffic to a particular page-

Mordy Oberstein:

What happened?

Crystal Carter:

What happened, why did it change? Well, you can cross-reference the dates that you see that activity happening, or the ranking or whatever, with the Wayback Machine, and you can see... So let's say in 2020, they were ranking really, really well for potato mashers. I don't know why we're mashing potatoes today, but that's what we're doing.

Mordy Oberstein:

I'm making potato salad for lunch.

Crystal Carter:

Okay. You're mashing potatoes for potato salad? You're chopping them. You're cubing them.

Mordy Oberstein:

Okay. So that's what I know about potato salad.

Crystal Carter:

We're cubing the potatoes. We're not using the potato masher for the potato salad. We're mashing potatoes for mashed potatoes. Okay. Anyways, maybe we're doing stuffed mashed potato skins. You mash them, and then you put the skins back in with other tasty treats.

Mordy Oberstein:

I don't know what that is.

Crystal Carter:

Okay. I'm distracted. Anyway, okay, so what you do, so you have a potato masher webpage, and let's say it used to convert really, really well in 2020. And now, it's not converting really well or maybe it dropped off a cliff recently. And if someone's saying, "We used to do really well with this page. I don't understand," you can go back to 2020 and look at the potato masher page and see what was on that page, and maybe it used to have the picture right at the top. And now, it has the picture further down the bottom and you can go, "Okay, well, it seems like this is an issue and we should fix that." It's a fantastic tool for illustrating those kinds of changes. And when you compare it with traffic and when you compare it with activity, it can be really, really useful for helping you to understand things that you can do with the website, things that you used to do with the website, things that maybe you should bring back, things maybe you should burn with fire, whatever you need to do.

Mordy Oberstein:

It's really great for contextualizing. Essentially, that's what it is. It's a way to contextualize. If you have a new client and you want to go back and see, okay, they have a whole ranking history. I can see the ranking history. What happened? What went on here? For a lot of the things that you're going to look at, particularly the content, you can go back and see what happened. If all of a sudden they changed their content right around the same time as, let's say, the helpful content update comes out, there you go. That's the first place to start looking.

If it's a competitor, what happened? All of a sudden, their rankings are up and what did they do? What happened? What's going on? You can take a look. I've used it to analyze content trends. I recently did this... There's a whole focus on experience in content and rankings, using actual experience. And I went back looking at a whole bunch of product review sites, seeing how many times they used the words "our" and "we" two years ago, three years ago, versus how often they're using "our" and "we" now. A lot more now, by the way. So for those kinds of things, it's amazing.

Crystal Carter:

It's really useful. And I think it's also really great when you're looking at your competitors because sometimes if you've got a client, for instance, and they're in your vertical and they've said, "We've done this, this, and this, and this website started... We fell behind at this time," you can go and you can look on their competitor's site and say, "Okay, did they have a migration at that time?" And you can say, "Does this website from two years ago still look the same as it does now? Does this website from six months ago still look the same as it does now?" Because you can't really call up the competitors and go, "Hey, can you just tell me your full web history?" They're not going to tell you that.

So you can go onto the Wayback Machine and you can look that up and it's super, super useful. There's an API that's attached to it as well that can help you to understand things. And also, you can see information about... Sometimes you can see things like the sitemap even on the Wayback Machine, which can be useful, and you can see different elements of it. It's a super useful tool. Every year, they say, "Would you like to donate money to the Wayback Machine?" And every year, I'm like, "Yeah, you know what? You guys can have a fiver." Because there have been so many times when the Wayback Machine has saved my bacon.

There's a situation though... I think the one that really stood out for me was I had a client and we did a migration. And suddenly, their pages weren't getting indexed at the same rate. And I was like, "We need these pages indexed. They need to be indexed now. Why aren't they being indexed?" And I compared the new homepage to the old homepage, and the old homepage had a feed of the new content. And the new homepage did not have a feed of the new content. And I was like, "Y'all, do you see this?"

Mordy Oberstein:

Feed the content.

Crystal Carter:

Yeah. And they were like, "Oh, okay." So they put the feed on and everything was exactly as I expected it to be. So absolutely, use Wayback Machine to solve issues. It's particularly useful for migrations and it's also useful for trying to fill gaps maybe between what a client knows. Because sometimes, you get a client and they know the full history of the website. Sometimes it's a brand new marketing manager and they have no idea what did or didn't happen before. So yeah, it's really, really useful.

Mordy Oberstein:

Cool. Go back. See previous versions of web pages. Not all of them show up, because if it's a random, random, random page, they didn't archive it, but a lot of the ones that you would think wouldn't be there, are there. And the best part is, you don't need a flux capacitor and a DeLorean going 88 miles an hour to go back in time. You can just go to the Wayback Machine. Now, speaking of the Wayback Machine, I was just looking at somebody recently who was using the Wayback Machine. It was Barry Schwartz using the way-

Crystal Carter:

Really?

Mordy Oberstein:

Yeah. Maybe he wasn't. Maybe he just had it saved. I'm pretty sure he used the Wayback Machine to see changes in Google's guidelines.

Crystal Carter:

So Barry Schwartz is himself a-

Mordy Oberstein:

A Wayback Machine?

Crystal Carter:

Yes.

Mordy Oberstein:

Are you saying that because his dress and design style, that's what I'm looking for, is not up-to-date?

Crystal Carter:

Barry, I didn't say that. Okay. What I'm saying is that I've done research on featured snippets. And if you want to, for instance, see research on featured snippets, he's been covering featured snippets the entire time. So for decks and stuff, I very often reference photos that he shared of different things from 2016. This is what a product carousel looked like in 2016, and now I can compare it to what it looks like now. They'll share things from like, "This is what this looked like, this is what an instant answer looked like in 2014. And I can compare it to what it looks like now." Barry Schwartz is an incredible-

Mordy Oberstein:

He's the Wayback Machine of the SERP.

Crystal Carter:

Of the SERP, precisely. And it's really useful for understanding, again, if you're thinking about traffic changes, why did my traffic change? If you're looking at some of the SERP photos of what it looked like in 2018 versus what it looks like now, then you can say, "Okay, well, that's how the feature changed," and I have evidence of this, thanks to Barry Schwartz.

Mordy Oberstein:

What we're basically trying to say is, it's now time for the Snappy news as brought to you by Barry Schwartz. So here's the snappy news.

Snappy news, snappy news, snappy news! This week, a trifecta from none other than Barry Schwartz. This one coming from Search Engine Land: Google Search now supports discussion forum and profile page structured data. So basically, Google is on a first-person, firsthand-knowledge, firsthand-experience kick, as well they should be, the E for experience in E-E-A-T being a prime example of this. Google has made a couple of announcements trying to show more firsthand knowledge on the SERP, and this structured data would allow such content from forums to show up as a rich result. Google said that the functionality allows them to show "first-person perspectives on social media platforms, forums, and other communities."

By the way, I think Google needs to go beyond social media for this content, which is why I think they're focusing on forums here with the profile page structured data. They know they need to go beyond social media and actually have content specifically generated for this kind of purpose. Per Search Engine Land, the new profile page structured data and markup is for any site where creators, either people or organizations, share firsthand perspectives. The markup will help Google highlight the creator's name or social handle, profile photo, follower count, or the popularity of their content in the Google Search results. So if someone is prominent on your forum, you want to make sure their profile is marked up, because Google's going to pull in their profile information. I think forum SEO is going to be a thing. Google is very serious about showing firsthand knowledge on the SERP. It's a wider content trend that Google is going to have to follow if it wants to deal with, I don't know, TikTok. Google's made a variety of announcements kind of showing that they are going to do that, this being one of them.
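
For anyone curious what that profile markup roughly looks like, here is a sketch of ProfilePage JSON-LD built as a Python dict. The property names follow the schema.org types Google's announcement points to (ProfilePage, Person, InteractionCounter), but all of the values here are hypothetical, and you should confirm the required and recommended fields against Google's structured data documentation.

```python
import json

# Sketch of ProfilePage structured data for a forum user's profile page.
# All values below are hypothetical; verify properties against Google's docs.
profile_page = {
    "@context": "https://schema.org",
    "@type": "ProfilePage",
    "dateCreated": "2019-04-02T08:00:00-05:00",
    "mainEntity": {
        "@type": "Person",
        "name": "Example Forum User",     # creator's display name
        "alternateName": "example_user",  # their handle on the forum
        "identifier": "1234567",          # internal user ID
        "image": "https://example.com/avatars/example_user.png",
        "interactionStatistic": [{
            "@type": "InteractionCounter",
            "interactionType": "https://schema.org/FollowAction",
            "userInteractionCount": 982,  # the follower count Google can show
        }],
    },
}

# Embed the output in a <script type="application/ld+json"> tag on the page.
print(json.dumps(profile_page, indent=2))
```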

Firsthand knowledge, first-person knowledge and experience is going to be a thing, just like forum content SEO, a term I have just coined, is going to be a thing. Next up from Barry Schwartz, this one from Search Engine Roundtable: Google Discover showing older content since the follow feature. This was spotted by Glenn Gabe. If you don't know who Glenn Gabe is, please follow Glenn Gabe on Twitter, X, whatever it is. And Glenn basically said that he's noticing more content showing on the Discover feed that is not from the last day or so. So generally speaking, Google Discover shows current-event-ish kind of content. So the content is generally very fresh, a couple of hours old, maybe a day or two old. Once in a while, it'll sprinkle in a four-day-old post, but Glenn's noticing that he's getting articles that are two months old.

I personally noticed that over the last month or so, my feed has changed. The topics have sort of opened up a little bit. The type of content that Google shows in my personal Discover feed is opening up a little bit. I wonder if Google is making some more foundational changes to the Discover feed. Because again, the Discover feed is a great place where Google can kind of be a sort of social media-ish kind of content provider. So I wonder if they're going to make some significant changes as time goes along, and as wider preferences and content consumption trends change.

Okay. Last up from Barry Schwartz, this one from seroundtable.com: Google November 2023 core update rollout is now complete. Now, as of the time of this recording, the November 2023 reviews update is still rolling out, but by the time you hear this, it might be done. So check your local news provider, AKA Barry, to see if that is finished. But check your rankings, because the November 2023 core update is done rolling out. I have some fresh Semrush data in my hands. It looks like, generally speaking, the November 2023 core update was slightly more comprehensive, powerful, potent, volatile, however you want to phrase it, than the October 2023 core update. It's a little bit tricky to pull the data out, because you can only look at the very beginning of the November 2023 core update. Otherwise, you start getting that volatility/ranking data mixing in with the November 2023 reviews update, and that makes it hard to decipher.

So the beginning of the November 2023 core update looked to be a little bit more impactful overall than the October 2023 core update, but that needs to be qualified. Check your rankings is what I'm trying to say, in a nutshell. And with that, that is this week's snappy news.

The best part is, I'm going to drop the fourth wall here. We recorded the main section of the podcast before we actually recorded the news. We were waiting until the last second to see what actually broke in the news, but we were so confident that Barry was going to be the one covering it that we teased Barry beforehand. So thank you Barry, for that.

Crystal Carter:

Thank you.

Mordy Oberstein:

Now, speaking of SEO testing, going back full circle to where we all started, there's somebody you should be following on social media because she's done some pretty nifty things around SEO testing. And she is Maria Amelie White, over on Twitter. It's @Maria_Amelie on X/Twitter. We'll link to her profile in the show notes, but Crystal's got a great thing about what she tried doing with Spanglish.

Crystal Carter:

Yeah. So Maria is an in-house SEO at Kurt Geiger. I'm not jealous at all. But yeah, she's an in-house SEO. She's super smart, super clever. And I was on a discussion with her and she was talking about a fantastic test that she did around SEO and PPC that was around language. So I think her team was advertising in a Spanish language market and also in an English language market. And they found that sometimes some of their Spanish ads showed in the US, for instance, and also in Mexico, for instance, where there's a border. But she also said to the team, "Hey, this is something that I've seen. What about Spanglish?" And they were like, "Oh, I don't know." And she said, "No, we should test Spanglish." Because a lot of people are searching this way; in areas where people speak multiple languages, and there are plenty of countries like that, people search in mixed languages too. And she found that they got some great results from incorporating Spanglish into their ads.

And I think that that's something that Will was talking about as well, is that sometimes it's worth following your instinct as well. So that was something that she knew from lived experience and she was able to apply that, and to test it and to get great results. And I think that with regards to testing, if you've got a hunch, absolutely test your hunch. And she got some great results from that, and that was one of the reasons why I thought of her. She's also super knowledgeable and is really active in the community.

Mordy Oberstein:

Very active in the community. Writes for Search Engine Land. You can check all of her articles there. Look for her Twitter profile in our show notes, and then read her articles across the web. Thanks, Maria. By the way, Spanglish is a terrible Adam Sandler movie. I'm literally here like, why does Spanglish sound so familiar? This is the third time you've mentioned that case to me. I'm like, "Something's not right." I'm like, "Oh snap, it's an Adam Sandler movie from years ago."

Crystal Carter:

When you say it's terrible, do you mean that it's terrible within the pantheon of Adam Sandler movies or that it's terrible in full stop?

Mordy Oberstein:

No. He's got some great movies. Uncut Gems, that's a gem right there.

Crystal Carter:

I don't know that one.

Mordy Oberstein:

It's great.

Crystal Carter:

I mean, Happy Gilmore. I'm there for that one. I'm there for... Oh gosh, there's a few other ones.

Mordy Oberstein:

The Waterboy?

Crystal Carter:

Waterboy. It's terrible because I don't think it's one of his best, but I even like Little Nicky.

Mordy Oberstein:

Oh, that's the devil one. I barely remember that. Then there was the one where-

Crystal Carter:

It's like a basketball skit.

Mordy Oberstein:

... he adopts a kid.

Crystal Carter:

Yeah, yeah.

Mordy Oberstein:

Big Daddy.

Crystal Carter:

Yeah, yeah. He's also done some good animated stuff recently.

Mordy Oberstein:

Right. Eight Crazy Nights. That's a classic in my household.

Crystal Carter:

There we go. There we go. He's got some good stuff. He's got some good stuff.

Mordy Oberstein:

He's got great stuff. He did a bunch of stuff like an NBA movie kind of thing, where he's a scout. That was a great movie. All right. All right. I think we're off the rails. No, no. You got one more. Go for it. You've got one more. Just do it.

Crystal Carter:

I can't think of the words. I can't think. I was just thinking of as far as terrible Adam Sandler movies, there's one where they go camping or something. It's awful. It's got a great cast, but the movie is fundamentally awful.

Mordy Oberstein:

There was a point in his career where it was kind of hit or miss. And I think lately, he's been consistently awesome.

Crystal Carter:

Yeah.

Mordy Oberstein:

Thanks, Adam. Well, thanks for joining us on the SERP's Up Podcast. Are you going to miss us? Not to worry, we're back next week with a new episode as we dive into... I'm not telling you, it's a surprise. It's a very special episode. It'll be a live recording from one of SEO's biggest conferences on the planet. That's right. Look for it wherever you consume your podcasts or on our SEO Learning Hub over at wix.com/seo/learn. Looking to learn more about SEO? Check out all the great content and webinars on the Wix SEO Learning Hub at, you guessed it, wix.com/seo/learn. Don't forget to give us a review on iTunes or a rating on Spotify. Until next time, peace, love and SEO.
