
Making bot log SEO data easy (easier?)

Ever wondered why Googlebot loves your blog's cat photos but totally dismisses your high-value product pages?

Wix’s Mordy Oberstein and Crystal Carter are joined by Roxana Stingu, Head of Search and SEO at Alamy, to dig into crawl patterns, bot behaviors, and crawl budgets. Learn how to make your pages better for bots, update old content to be relevant again, and minimize server costs by managing bot activity.

Discover how specialized bot logs can offer insights into everything from quality issues to market trends and detect patterns of malicious bots you may want to block.

(Best Shatner Impression) “The robots are interacting with our site, but how?”
Don’t miss out as we navigate the world of search engine bots on the SERP's Up Podcast!

Episode 102 | September 18, 2024 | 39 MIN


This week’s guests

Roxana Stingu

Roxana has a strong background in technical SEO, including enterprise SEO, image search, and ecommerce site search. As the Head of Search & SEO at Alamy, her diverse skill set is dedicated to making it easier for users to discover products across various platforms, ensuring they can find exactly what they need with ease.

Transcript

Mordy Oberstein:

It's the new wave of SEO podcasting. Welcome to SERP's Up. Aloha. Mahalo for joining us on the SERP's Up Podcast. We're pushing on some groovy new insights around what's happening in SEO. I'm Mordy Oberstein, the Head of SEO Brand here at Wix. And I'm joined by she who has all of the information you need in a highly visual and pointed way, the Head of SEO Communications here at Wix, Crystal Carter.

Crystal Carter:

Hello. I am the Crystal Carter who does things.

Mordy Oberstein:

With visuals also.

Crystal Carter:

Sometimes.

Mordy Oberstein:

At the same time.

Crystal Carter:

And then also sometimes a-

Mordy Oberstein:

What are we even talking about?

Crystal Carter:

... because today we're talking about robots.

Mordy Oberstein:

We always talk about robots. It's an SEO podcast.

Crystal Carter:

I'm going to stop them from crawling over here.

Mordy Oberstein:

No, don't do that.

Crystal Carter:

I'm going to make them crawl over there. I'm going to watch where they go.

Mordy Oberstein:

Okay. Foreshadowing. Foreshadowing. The SERP's Up podcast is brought to you by Wix Studio, where you can always subscribe to our SEO newsletter, which comes out each and every month. It's called Searchlight, over at wix.com/seo/learn/newsletter. But where you can get your Botlogs in a highly visual, seamless way right inside of Wix and Wix Studio, more on that later. Well, not more on that topic, more on those reports later, as this week we're diving into Botlogs. Check every call and log every hit. No, we're not talking baseball, but Botlogs. How and why Botlog reporting factors into SEO, how to use Botlog analysis to drive SEO success, and how to make all that, much easier than I'm making it sound. To help us navigate our way through this perhaps uncharted part of your SEO universe, Roxana Stingu, the Head of Search and SEO at Alamy will be here in just a bit.

We'll also explore a highly visual Botlog tool that simplifies the entire Botlog SEO process for you. What tool could that be? I kind of mentioned already before. Oh, no. And of course, we have your snappiest of SEO News and who you should be following on social media for more SEO awesomeness. So Captain's Log, Supplemental, the robots are interacting with our site, but how? Unknown. Our only chance of returning from this quadrant of SEO is an unknown entity who refers to themselves as, "The Botlog." Though they hardly resemble a log at all, we're hoping it can help us branch out from our usual SEO analysis on this, the 102nd episode of the SERP's Up Podcast.

Crystal Carter:

I wasn't expecting the Shatner. I'm not going to lie. That was unexpected.

Mordy Oberstein:

In a good way or a bad way?

Crystal Carter:

I don't know.

Mordy Oberstein:

Oh, no.

Crystal Carter:

Well, because you started and I was like, "Oh, this is amusing." And then it just kept going. I was like, "Wow, he's really-"

Mordy Oberstein:

I actually researched. I read through various captain's logs to see like, hey, what does he talk about in there? Yeah, I really did my homework on that. I went full nerd.

Crystal Carter:

I mean, that's what we're here for. That's what this is about.

Mordy Oberstein:

That's what makes this podcast special and better than all the other podcasts.

Crystal Carter:

You're at home. It's fine. It's cool.

Mordy Oberstein:

We do our thing here. We do our thing. Anyway. Anyway. Botlogs. The mere term sounds like it's frightening and/or confusing, but it doesn't have to be.

Crystal Carter:

It doesn't have to. That's one of the reasons why we've got Roxana talking about it. I've heard Roxana speak a few times at a few different events, and I've known Roxana for a few years now through Women in Tech SEO, and she's wonderful and fantastic, which we'll find out very shortly. Roxana is really great at taking really complex stuff and making it not sound impossible, which I think is super important because I think that everything is accessible. So I am so happy to be talking to Roxana about this today.

Mordy Oberstein:

With that, welcome to SERP's Up Roxana Stingu. How are you?

Roxana Stingu:

I'm good, thank you. No, sorry, I can't do that-

Mordy Oberstein:

We should do the whole episode like that, right? Everyone will love it.

Roxana Stingu:

We could try, but I don't think you're going to get a lot of people listening to this if I do that.

Mordy Oberstein:

You have two options, either we talk like Botlogs or bots or like William Shatner for the entire episode. Your choice.

Roxana Stingu:

I can't do either, but I can talk about Botlogs-

Mordy Oberstein:

And William Shatner?

Roxana Stingu:

A little bit.

Mordy Oberstein:

Okay, fine. Well, just the Botlog then.

Roxana Stingu:

Hello, everybody. Official hello.

Crystal Carter:

So Roxana, just for folks who are new to meeting you, can you just give them a little bit of your background, from where you're coming at this topic from?

Roxana Stingu:

Yeah, sure. Quick intro. I'm Head of Search and SEO for Alamy, where search is not paid search, but actually website search. So I get to work with my own search engine, which is great. And Alamy is a massive website. We have about 400 million products, and I'm mentioning this because the bigger the website, the more you care about log files and want to have a look at what bots are doing on that website. And we're going to talk about why that's important, but this is pretty much why I am so interested in this topic and why I like it so much. And even if you're not working on a 400 million product website, don't worry, there's still a lot of information in those files for you to get, and you can use that insight to further improve your website's presence in bigger search engines like Google.

Crystal Carter:

And that's super important. And for folks who don't know, bots are little computer programs that come to your website and look around and stuff. So we have Googlebot, which crawls your website and sends information back to Google. There's Bingbot that also sends things. There's an AdSense one. Pinterest has one called PinBot. Then there's other stuff, like tools; Ahrefs, for instance, also has its own bot that crawls around and does all this sort of thing. For a website of your size, are there particular bots where you're like, "No, get away from me. Back up"?

Roxana Stingu:

Yes.

Crystal Carter:

Are there?

Roxana Stingu:

Absolutely. So when a website is this big, every request that comes from a bot is costly. And that's mostly because pages are dynamic, so every time a request comes in from a bot or a person, it doesn't matter. We need to recreate the pages from the server and then the server uses a database and that database incurs costs because we're getting that information from there. So pretty much every time anybody's requesting a page, we pay for it and it adds up. And the more bots you have crawling, the more it adds up. So then what I do is I tend to look at who's crawling me the most and what value am I getting out of that. So for instance, Google is crawling like crazy and I get value from that. I get organic traffic and that traffic then converts for me, so I'm getting revenue out of it.

So I'm happy with Google crawling. Same for Bing and Yandex and Baidu and other search engines in countries where I want to have visibility with this website, but we live in an AI era and everybody is now crawling for information to put in their large language models and train whatever they want to train. So that is a problem because I'm paying for their training in this case. They have to request my content, so I'm incurring a cost, but I'm not seeing a benefit out of it because it's their model that they're monetizing or whatever they're doing.

So for me at this point, looking at crawls from AI-related bots, that's kind of the biggest area. And the problem is some of them will have descriptive names and you'll recognize them as being various companies that I'm not going to name and shame, but some of them use third-party bots that don't resemble the name of the company using them. They're like bots for hire. And I think those are the ones you want to keep a lookout for and block, because you're really getting nothing out of it. You don't even know who's hired them.
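A minimal Python sketch of the "who is crawling me the most" check Roxana describes, not her actual tooling: it tallies hits per user agent from a standard combined-format access log. The log path is a placeholder.

import re
from collections import Counter

LOG_PATH = "access.log"  # placeholder; point this at your exported server log
# Combined log format ends with: "referrer" "user-agent"
UA_PATTERN = re.compile(r'"[^"]*" "(?P<ua>[^"]*)"\s*$')

counts = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        match = UA_PATTERN.search(line)
        if match:
            counts[match.group("ua")] += 1

# Top 20 user agents by request volume: a starting point for "who am I paying to serve?"
for user_agent, hits in counts.most_common(20):
    print(f"{hits:>8}  {user_agent}")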

Crystal Carter:

Right. That's really interesting. So for people who are new to the concept of robots and crawling and all of that sort of stuff: you were talking about how it costs your server, it costs server time. Essentially they're calling the page when they're crawling the page, and that's triggering a server response. You're talking about much, much bigger sites, but even on smaller sites, I've seen it where somebody sent a bot there, a junk bot, a spam bot or whatever, that's coming through and causing tons and tons of traffic to the site, messing up your analytics and causing server issues and things like that. So yeah, it is really important to pay attention to who they are and where they are, even on a smaller site, but also when thinking about these AI considerations and all of that sort of stuff. I don't want to get into spilling all of the company details or whatever, but have you ever had to take immediate action to block people who are behaving in ways they shouldn't?

Roxana Stingu:

Yeah. It's part of security practices. You always do it. You look at malicious kind of requests, that's what we call them, so there's patterns to them. And you notice there's a big wave of requests and then it goes down and it's periodical and you can kind of see that pattern, and you know it's unnatural. To quote from my favorite movie, "It's unnatural, mate." Sorry, that accent, see, I can't do that. But you kind of notice these and you think, why are they crawling me? Is this a reputable bot? Because you have IPs of Googlebot and other bots and you can verify it's them and it's not somebody else.
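A rough Python sketch of the bot verification Roxana mentions, assuming the usual reverse-then-forward DNS check: reverse-DNS the IP that claims to be Googlebot, confirm the hostname is on googlebot.com or google.com, then forward-resolve it back to the same IP. The sample IP is only an example.

import socket

def is_verified_googlebot(ip: str) -> bool:
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)           # reverse DNS lookup
    except OSError:
        return False
    if not hostname.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        forward_ips = socket.gethostbyname_ex(hostname)[2]  # forward-confirm the hostname
    except OSError:
        return False
    return ip in forward_ips

print(is_verified_googlebot("66.249.66.1"))  # example IP from a published Googlebot range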

Or other times you get what's called a DDoS, a distributed denial of service attack. So that's when somebody starts sending so many hits to your website that your hosting can't deal with it. Then everything kind of freezes and your server is just refusing to connect to anything, so your website's down even for your users. And this is not about large websites, it's about the bandwidth that your hosting will allow in terms of connections. In the past I've worked on small blogs that had low-bandwidth hosting, and I would create a fake DDoS with just a crawler because I was crawling too fast, sending so many requests per second that the hosting just couldn't handle it. And I think small business websites, personal blogs, things like that will not go for a very expensive hosting package because there's no reason to. But they could be the victims of these DDoS attacks because it's really easy to create them if your hosting doesn't allow a lot of hits to come through.
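A toy sketch of the fix for that accidental "fake DDoS": throttle your own crawler so a small host isn't flooded. The URL list and the one-request-every-two-seconds budget are made up; real crawling tools expose a speed setting for the same purpose.

import time
import urllib.error
import urllib.request

urls = ["https://example.com/", "https://example.com/about", "https://example.com/contact"]  # placeholder list
DELAY_SECONDS = 2.0  # crude politeness budget: at most one request every two seconds

for url in urls:
    try:
        with urllib.request.urlopen(url, timeout=10) as response:
            print(url, response.status)
    except urllib.error.HTTPError as err:  # a 4xx/5xx answer still tells you something
        print(url, err.code)
    time.sleep(DELAY_SECONDS)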

Mordy Oberstein:

Another reason why using Wix is great, because we'll take care of all that for you and your server won't get overloaded because we optimize the server network for you.

Roxana Stingu:

Exactly. One less thing to worry about.

Mordy Oberstein:

Two less things to worry about, among other things. But thinking about small businesses, one of the other ways that I think you can think about using your bot logs is understanding Google's behavior. Where are they crawling on your website? Which pages are they crawling on your website? And is there a problem? For example, I had a situation one time where there was a massive redirect done on the site and there was a glitch somewhere, and you could see Google ignoring the redirect and going to the old page and not the new page. So you can take a look at your Botlogs and say, "Wait a second. I thought that was all good. Everything looks like it's fine, but there might be an underlying problem here because Google's ignoring it and they're going to the old page."

Roxana Stingu:

Yeah, that's absolutely one of the reasons why you should even go through these files. You can export them as text files, and it's just lines upon lines upon lines of who requested what: the referrer, the page on your site that was requested, and then information like the IP of the person or the service or the crawler requesting it. And other information that might or might not be useful to you, like the browser, for instance, and the HTTP status code that comes back. Did that service get the page, a 200? Did it get a server error, a 5xx? What's going on? But the reason to go through all that information, because it's going to be a lot, is exactly as you said, to identify points where things are not working.
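For anyone who has never opened one of these files, a small Python sketch of what a single combined-format line contains, the fields Roxana lists: who asked (IP), what they asked for, what came back (status), plus referrer and user agent. The sample line is invented.

import re

LINE_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" '
    r'(?P<status>\d{3}) (?P<bytes>\S+) '
    r'"(?P<referrer>[^"]*)" "(?P<user_agent>[^"]*)"'
)

sample = ('66.249.66.1 - - [18/Sep/2024:10:15:32 +0000] '
          '"GET /guides/bot-logs HTTP/1.1" 200 51234 '
          '"-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"')

entry = LINE_PATTERN.match(sample)
if entry:
    print(entry.group("ip"), entry.group("path"), entry.group("status"), entry.group("user_agent"))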

I mentioned small businesses before and they tend to have a calendar for reservations on their website. And those tend to create infinite spaces because for every single combination of day, month, time, whatever, and especially in the future because the calendar goes forever, you create a page or you create a parameter that creates a new URL and Google and other bots can just get stuck in that and they literally just go in an infinite space and can't come out. So you are getting all these hits as if Google is identifying all these millions of new pages on your website when your site might just have 10 pages.

Crystal Carter:

Right. And I think that people often don't understand the connection between that and your marketing activity or your other activity. So if Google's finding that, then that might be stopping Google from indexing the rest of your content, that might be stopping Google from completing their crawl, and that might mean that pages you're expecting to be indexed aren't being indexed. And let's say those pages are the things that you're trying to sell. Maybe the pages that aren't being indexed are the core of your business.

Roxana Stingu:

Exactly. You are making an update with a new offer, but Google's too busy in that infinite space, instead of going and indexing the information about your new offer and showing your new title or description that can maybe convert that click.

Mordy Oberstein:

And also, if quality is a domain-level metric, right? So let's say with the helpful content system, which is now part of the core algorithm, they're looking at helpfulness across the entire domain. If they're not seeing your entire website and they're only seeing X, Y, and Z pages, that entire score is built up on X, Y, and Z pages and not the entire corpus of content on your website, which is not what you want.

Roxana Stingu:

Yeah, exactly. And I'm sure people have heard about crawl budget before, and I think looking into log files, you can kind of see where that budget is being allocated. So for people who haven't worked with this term before, imagine you have a finite sum of money, like all of us have when we get paid at the beginning or the end of the month, and then you can allocate that money towards different things. You can put more money in food or more money in fun, but then you can't pay the rent. Google does something similar where it can put more crawl towards certain types of pages or other types of pages, and it kind of has to find a balance on your website. And if we put too much money towards fun, that's great for us, but we're not really getting the value because then we starve.

It's the same with Google. If it puts too much crawl towards pages where there's just errors, the pages don't load, they're really slow, their quality overall is low, it will stop putting money there because there's no value. So it will either shift the budget elsewhere or understand that maybe it needs to spend less on your website because it's not that good. So again, log files can help you with this, because you can segment your page types. If it's a bigger website, you might do it by template, let's say. So you have category pages, you have product pages, you might have some, I don't know, blog pages, whatever you have, you segment by that. And then you look at how many hits am I getting from whatever search engine you're analyzing on these pages.

And has that behavior kind of changed over time? Am I seeing a reduction that kind of matches maybe a core update? All of a sudden I'm thinking, well, Google doesn't find it as high quality as it did before. Maybe I need to up my game on these pages. Do something, right? And that's the thing. Quality will get stricter and stricter with every update because the internet just gets bigger and bigger, so Google needs to keep it clean. So if you notice this reduced crawling behavior on your pages, even though your number of pages is the same or higher, maybe kind of focus on this. Maybe that template needs a boost somehow, so try to understand what a quality boost would be in that case.
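A rough sketch, under the same combined-log-format assumption as the earlier snippet, of the segmentation Roxana describes: bucket Googlebot hits by date and by page "template", here approximated by the first URL path segment, so you can watch a template's share of crawl change around an update.

import re
from collections import defaultdict

LOG_PATH = "access.log"  # placeholder
LINE = re.compile(
    r'\S+ \S+ \S+ \[(?P<day>[^:]+)[^\]]*\] '
    r'"(?:GET|HEAD) (?P<path>\S+) [^"]*" \d{3} \S+ "[^"]*" "(?P<ua>[^"]*)"'
)

hits = defaultdict(int)  # (day, template) -> Googlebot hits
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for raw in log:
        m = LINE.match(raw)
        if not m or "Googlebot" not in m.group("ua"):
            continue
        template = "/" + m.group("path").lstrip("/").split("/", 1)[0]  # e.g. /blog, /product
        hits[(m.group("day"), template)] += 1

for (day, template), count in sorted(hits.items()):
    print(day, template, count)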

Mordy Oberstein:

Great point.

Crystal Carter:

I think also the other thing, you talk about how often people are crawling. Is that something that you look for in a Botlog, like how often Google's coming to the site? How often Bing is coming to the site? How often Yandex is coming to the site, for instance?

Roxana Stingu:

So that matters as well because if your content is interesting or of interest, which is different from being interesting to you, then you will notice that bots will come and crawl more frequently because they want to make sure they have the latest version of it because it's of interest. If you run a news website, you'll get so many crawls, it'll be insane because news is all about freshness. So then this is why Google recommends you have a news sitemap if you have a news website because then crawl patterns will change because it's more time-sensitive. But regarding this crawl frequency, have a look at pages that get a lot of crawls and very frequent ones.

And then think, are these pages actually driving a lot of traffic, or are they being crawled a lot but driving no traffic? Because then why is there an interest in getting updated information from pages that drive no traffic, right? So that might be a reason for you to look at those pages and try to understand: is this something I want to show to my users? And if yes, why isn't Google showing it to users, so there's no traffic? Or is this something like one of those spaces we talked about, where it's just parameters that are duplicates of other pages or subsets, and maybe I should just block it and stop it being crawled, because if users don't need to see this, why am I allowing bots to see it? And then you use robots.txt and you block that.

Crystal Carter:

If you're seeing that Google is crawling it and is kind of interested in it, and it's a page that's maybe an older blog post or something like that, maybe that's a candidate for updating, for instance. Maybe we can update this and make it so that it's indexable, because they're already interested in it. They know where it is.

Roxana Stingu:

Yeah, exactly. If your 2020 guide is still getting a lot of crawls, then it might mean that the topic of the guide is of interest, but the information is outdated. So exactly as you said, go update that. Make it a 2024 guide and you might attract even more traffic then.

Mordy Oberstein:

Yeah, even look at which part of the website Google tends to be crawling more often. You have products and you have a blog, and you're fundamentally trying to use the blog to get people to the product pages because that's what your website is actually earning money on. But if Google's crawling your blog way more frequently, or not crawling your product-oriented content much at all, maybe you have a problem there. Maybe you need to interlink better. Whatever it may be, you need to understand that Google's seeing you as a blog website, not as a commerce website.

Roxana Stingu:

And here's where we get it wrong. We use crawlers and we always start the crawl from the homepage. And even though we use a Googlebot user agent or whatever, we think that's how search engines will crawl us, but that's just the one crawler that we're using. By using log files, you can actually see how search engines crawl, because they don't always start from the homepage. They can start from a random page, and then the priority they give the URLs they found and how they crawl might be different from the priority a crawler gives. So then it's really not the same thing, and you should be comparing the two.

And if you're kind of seeing the same stuff, great, then you don't need to do that comparison all the time. But if you compare a crawl coming from a tool with where the main hits go from your log files and you see major discrepancies, then you have to kind of stop and consider: wait, why is it so different for bots than it is for my crawler? Why are bots not crawling these other links or URLs? Maybe they're too hidden in the page, maybe I already have so many links that they give up. It gives you ideas of how to analyze a page and figure out what's not working.
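A toy sketch of the tool-versus-logs comparison Roxana suggests: take the URLs your crawling tool found and the URLs bots actually requested, and look at the differences. Both file names are placeholders for whatever exports you have, one URL per line.

def load_urls(path: str) -> set:
    with open(path, encoding="utf-8") as handle:
        return {line.strip() for line in handle if line.strip()}

tool_crawl = load_urls("tool_crawl_urls.txt")   # placeholder: export from your crawling tool
bot_hits = load_urls("googlebot_log_urls.txt")  # placeholder: unique URLs Googlebot hit, from your logs

print("Found by your tool but never hit by Googlebot:", len(tool_crawl - bot_hits))
print("Hit by Googlebot but missed by your tool:", len(bot_hits - tool_crawl))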

Crystal Carter:

And I think in terms of comparison, one thing that I've looked at and Google's like, "Yeah, we're mobile first. Mobile, mobile, mobile." But then I see properties and it says that it's the desktop crawler, like in Google Search Console, it says the desktop crawler, whatever and things. And I'm like, "Y'all, really?" And then when I go into my Botlogs, I can see that the mobile crawler is not crawling me very often. The desktop crawler is crawling me more. Are you comparing the different bots to optimize accordingly in your day to day?

Roxana Stingu:

I have good news for you. Google is killing off the desktop crawler this month. It's going away. It's out.

Crystal Carter:

Okay, that's it.

Roxana Stingu:

So you're not going to see it anymore. Well, at least not-

Mordy Oberstein:

I'm going to miss it. It's sad. We used to hang out. We used to have a beer once in a while. All right, well, I guess that-

Roxana Stingu:

I know. It used to be fun, but the good news is you're getting a crawl reduction because it's going away, because it's like duplicate crawl. You were getting crawled by both the desktop one and the smartphone one, and now the desktop one is going away. But I'm assuming it's going to be small percentages for people, because Google has been mobile-first, so crawling more like that. But there are other Google bots, and not just Google, but other user agents from search engines that you need to keep an eye on. So for instance, you can see major spikes from AdsBot even though you don't serve ads, and that can take up a lot of bandwidth. It's a good idea to keep an eye on that and just kind of monitor it, especially if it's not useful to you. And you have AdsBot-specific robots.txt rules where you can say, "Right, I'm allowing you to do this, but not that." Or you just use your robots.txt to say, "Right, AdsBot, I don't want this. Go away." So there are options there depending on whether you have ads or not.

Another thing that's interesting is that Google will crawl images with a different bot, and that's a bit slower than your regular HTML bot. So if you have an image-heavy website and you're not seeing those crawls come in, give it a few weeks. But after that you should definitely be seeing them. And again, analyze the patterns there. If Google's not really crawling your images or doesn't really care about your images, maybe you should assess what your images are, because they might not be that useful.

Crystal Carter:

I think also one thing that's really interesting is, Mordy has the SEO Brand Podcast web page, and I have a couple of others. I've got a little space site, and I have my personal site or whatever. Neither of those has podcasts on them. I've looked at the Botlogs for Mordy's podcast site. He has a completely different set of bots that come to his website. He's got a completely different crew of robots that orbit his site-

Roxana Stingu:

Exactly.

Mordy Oberstein:

Those are my homies.

Roxana Stingu:

Your homies, yeah. But it's the same as with the ad bots. Once in a while Google will send all these different bots to discover: have you added a podcast in the meantime? Have you added more images? Have you added advertising? So you will see these hits once in a while, and you should probably let that happen, unless they go wild when you don't have podcasts but you're getting half of your crawls from a podcast bot. You don't want that. So it's good to understand all the different bots and what they do, and let them be if it's low volume, because that's how search engines discover the web and changes to the web. But if they start being problematic and you don't have that type of content, just block them.

Crystal Carter:

And that's something you can do in your robots.txt?

Roxana Stingu:

Absolutely.
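A minimal, hypothetical robots.txt along the lines discussed here: named search bots keep crawling, a bot you get no value from is turned away, and the calendar-style parameter space mentioned earlier is closed off. "AnnoyingCrawler" and the paths are made up; Python's standard robotparser is used only to sanity-check the rules.

from urllib import robotparser

robots_txt = """\
User-agent: AnnoyingCrawler
Disallow: /

User-agent: *
Disallow: /calendar/
"""

parser = robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("Googlebot", "https://example.com/products/red-shoes"))        # True
print(parser.can_fetch("AnnoyingCrawler", "https://example.com/products/red-shoes"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/calendar/2030-06-01"))       # False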

Crystal Carter:

And we have content on that, on the Wix SEO Learning Hub, which can help you learn all of that stuff. Roxana, this has been such a fantastic discussion. I've absolutely loved geeking out with you on this. Thank you so much for joining us.

Mordy Oberstein:

I, as well.

Roxana Stingu:

Always. Always for geeking out.

Mordy Oberstein:

Sorry for going all nerd on you earlier with the William Shatner thing. My bad.

Roxana Stingu:

It's acceptable. It's fine.

Mordy Oberstein:

Thank you.

Roxana Stingu:

I'll take it.

Mordy Oberstein:

Okay, so you're more of a Picard person, I get it. I understand.

Roxana Stingu:

Yeah, I am.

Mordy Oberstein:

Okay, all good. All good there.

Crystal Carter:

Well, thank you so much for making it so today, and I think-

Mordy Oberstein:

Oh, where do people follow you?

Roxana Stingu:

I'm on X. I almost called it Twitter. I'm on X. It's just roxanastingu, one word. And same thing on LinkedIn.

Mordy Oberstein:

Awesome.

Crystal Carter:

Thank you so much for joining us.

Roxana Stingu:

Thank you very much for having me. This has been fun.

Mordy Oberstein:

Bye.

Roxana Stingu:

Bye, everybody.

Mordy Oberstein:

So you might be thinking, I love Botlogs at this point. They're great, they're fantastic. You might also be thinking, Botlogs? That sounds complicated. How do I set those up? How do I do those? So good news both for you who love Botlogs and for you who think, Botlogs? That sounds complicated. How do I do that? Because we have our own Botlog reporting for you as we go tool time. So for those lucky folks who are using Wix, you have built-in Botlog reports as visuals, and they're awesome.

Crystal Carter:

And you don't have to ask a dev for them.

Mordy Oberstein:

No. Or connect this or connect that. You don't have to do anything. You just have to click on Analytics, go to SEO and click on Botlog Reports.

Crystal Carter:

Right. Go to the search bar, type in bot traffic over time, and you'll be able to find whatever you need. It really is genuinely fantastic.

Mordy Oberstein:

Can I say a salty point?

Crystal Carter:

Sure.

Mordy Oberstein:

Okay. For those who are like, "Oh, I like it because I get to control the server," outside of locking yourselves out, leaving that aside for a minute, to me it's always opportunity costs. It's not either good or bad, it's whatever you need. This is the opportunity cost of not having control over the server versus us having control over the server: because we have control over the server, we automatically create Botlog reports for you, because it's our server.

Crystal Carter:

And we have eyes on lots of different bots and can identify them. So I'm looking at the one for my private website, which, to be honest, doesn't get tons and tons of traffic and just sort of does what it does. And on it, I can see the bot for Baidu in the Botlog Reports. Basically, if you go to the Wix Botlog Report (and if you want to find out more about this, we have an article on the Wix SEO Learning Hub by one Mr. George Nguyen, link in show notes, who gets into a lot of the details there), the kind of bots that I'll see on my private website are going to be different from the bots that I'll see on, say, the Wix SEO Learning Hub.

I think we mentioned this in one of the other parts of this podcast as well: they're very different from the bots that Mordy gets on his podcast website, for instance. My personal website doesn't have a podcast, so I don't get podcast bots on my website, but Mordy's gets tons. Tons of podcast bots. Bots I didn't even know existed. And I think that one of the things that's really interesting about this is that it can help you figure things out. Yandex is a bot that shows up on my site; HubSpot is one that shows up on my site; I've got Google Web Snippet, I've got Facebook, I've got Common Crawl, I've got Baidu. So for instance, if I'm seeing that the Baidu bot, a spider, is showing up on my website a lot, guess what? That means that Baidu wants to know who I am. Guess what? That might tell me that maybe I should be investing more in markets where Baidu is a bigger player, because that's telling me that users there are interested in it, because Baidu's interested in it. Same with Yandex.

Mordy Oberstein:

AKA China.

Crystal Carter:

Right? Same with Yandex and same with some other things as well. DuckDuckGo is another one as well. Someone was asking me about DuckDuckGo a while back and I'm like, "It can be really useful for people who don't want to leave a paper trail when they're online, and this can be really important-"

Mordy Oberstein:

If your market is criminals, DuckDuckGo might be for you.

Crystal Carter:

The CBD market, for instance, I think can be a bit more complex. I think that Google has different rules around how CBD products are ranked on Google than they are on, say, DuckDuckGo. I don't think you're really able to do ads if you're a CBD product, even if they're fully legal. So folks like that might see more traffic from DuckDuckGo. And again, that might give you an idea of, oh, actually, maybe we should invest some more time in that. And it's incredibly useful, and making it so accessible, as we do in our Botlog Reports, is fantastic.

Mordy Oberstein:

It's all that. I mean, all the SEO tools are in there. So you can see, like, hey, I'm paying for SEMrush and you're supposed to be auditing my website. How come I'm not seeing any SEMrush on my website? Maybe they're not really auditing. They are. SEMrush will audit you. I'm not saying anything bad. Just an example. The visuals are built in, so you don't have to do any fancy footwork in order to take what's in a chart and turn it into a visual that you can share with a client or use yourself. And by the way, it's an easy way to check status codes on your website. Like, oh snap, what are people looking for that's pulling up a 404? What are the bots requesting that's pulling up a 404? Because you can filter by status code and see which pages they're seeing it on.

Or, and I mentioned this earlier in the show, you could see if the search engines or whatever bot you were looking at are crawling the wrong pages. It's as simple as going to one of the reports and looking at the bar graph that shows which pages whatever bot you selected is crawling.
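A small sketch of the status-code filtering described here, done by hand in Python for anyone not on a built-in report: pull out the URLs where a chosen bot keeps hitting 404s. The log path and the combined-format assumption are placeholders.

import re
from collections import Counter

LOG_PATH = "access.log"  # placeholder
LINE = re.compile(r'"(?:GET|HEAD) (?P<path>\S+) [^"]*" (?P<status>\d{3}) .*"(?P<ua>[^"]*)"\s*$')

not_found = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for raw in log:
        m = LINE.search(raw)
        if m and m.group("status") == "404" and "Googlebot" in m.group("ua"):
            not_found[m.group("path")] += 1

# URLs most often returning 404 to Googlebot
for path, count in not_found.most_common(25):
    print(f"{count:>6}  {path}")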

Crystal Carter:

Right, and you can also see that by the day. So let's say there was an issue on Tuesday or something, and you want to see which pages were affected by that issue you had on the Tuesday. You can go and filter by the one day that it happened, and you can see whether you saw a reduction in that particular crawl rate before or after, or whether bots have been crawling you less since then, that sort of thing. So you can see it by the date and filter it by the different response codes and all of that sort of stuff. And the response codes get into detail. It's not just 200, it's 200, 304, 503, 504. It gets into the detail.

Mordy Oberstein:

So what we're trying to say is check out the Botlog Reports in your Wix analytics. It's really great information for you. It's really easy. It's really streamlined. So if you are someone who listened to this podcast like, "There's this whole bot thing, but it's also terrifying," don't worry, just go right in.

Crystal Carter:

Can I also say, it's also downloadable as a CSV, for Excel, or as an image. So if you, for instance, wanted to demonstrate that, guess what, we did mobile optimization on your site and now we're seeing a lot more crawls from Google's mobile bot, you can take a little snapshot from the Botlogs, the bot traffic over time report. You can put that in your report, you can get your gold star, maybe get yourself a raise. We-

Mordy Oberstein:

It is great for reporting.

Crystal Carter:

We out here helping y'all in these economic times.

Mordy Oberstein:

That's right. Hey, you got to report in order to get the buy-in. You know what else is great at reporting?

Crystal Carter:

Who's that?

Mordy Oberstein:

Barry Schwartz is great at reporting.

Crystal Carter:

How did I not know that that was coming?

Mordy Oberstein:

It's only been 102 episodes, Crystal. I mean, come on. But who's counting? We are. We're counting. We're counting on Barry to cover this week's snappy SEO news. Snappy News, Snappy News, Snappy News. I will try to keep it snappier than usual because last week I got a complaint from Barry Schwartz that I droned on and on and on and on and on trying to cover his story, which went on and on and on and on and on, about Danny Sullivan's take on algorithm updates, the interview that Barry did. So I will try to keep it snappier this week. Barry, I'm so sorry. Anyway, from Barry Schwartz, both articles from Barry, they're both from seoroundtable.com. First up, Google search ranking volatility still heated a week after core update. Barry wrote that on September 11th. I'm looking at the SEMrush Sensor on September 15th, and it's still high. It was high before the update, it was high during the update, and it's been crazy high after the update. Is it all one update? No, but it's bonkers.

Barry actually asked me, like an Ask SEMrush segment: is this the longest period of high or very high volatility they've recorded? I should have known the answer because I actually researched that a while ago, and the answer's no, we're not there yet. We need 15 more days of high volatility to break the record. I think that was back in 2022, 2021. I don't remember. I could look at my email. I forgot exactly when it was, but I think it had to do with the product review update, something like that, and there was this crazy volatility forever. So it's not the longest period of high rank volatility we've seen, but it's up there. I would say more, but I'll keep it snappy, Barry.

Okay, also from Barry: report, half Google AI overviews... you're missing an "of." Report: half of Google AI overviews links overlap with top search results. This study came from Rich Sanger, a great guy, great SEO, follow him on social media. He partnered up with Authoritas, and they looked at, hey, how often are the URLs in the organic results matching the URLs Google is showing in the AI overviews? There have been a bunch of studies on this. They all have different data. What does that mean? I think it means the tools have a hard time tracking this stuff. Take that and, I don't know, do with it what you will. What Rich and Authoritas showed was 46% of the URLs in the AI overviews match up with the top organic results on page one.

They actually did something interesting that I thought was cool. They went and clicked through to the related search features, like People Also Ask, people also search for, and related searches at the bottom of the page, and then recalculated to see if any of the URLs on that second SERP also match the AI overview URLs. And the number jumps up to around 64%. I could say more, but I won't. I do want to, but I can't, because I don't want to set Barry off by going on and on and on covering his stories. Barry, I'm so sorry. To the audience, also sorry, but really I'm just messing. And that's really all I have to say. We'll link to the articles in the show notes. Have a look at them, click through to look at the actual study that Rich did. It's pretty interesting. And I hope I kept it snappier. Snappy News, over and out. I just call Barry ol' reliable, ol' dependable. What's it called? That's like a geyser, isn't it? Like ol' reliable?

Crystal Carter:

Old Faithful.

Mordy Oberstein:

Old Faithful. There we go, Barry, aka Ol' Faithful.

Crystal Carter:

There we go. Is he the same age as you?

Mordy Oberstein:

Is he? Barry? No, Barry's older than me.

Crystal Carter:

Is he?

Mordy Oberstein:

Yeah. I'll check his Wikipedia page out, see what it says. Does it have his birthday on it?

Crystal Carter:

Yeah, it does.

Mordy Oberstein:

Okay.

Crystal Carter:

Don't ask me how I know that.

Mordy Oberstein:

Oh, good. Barry is old, not new. I was going to say like a geyser, he is blowing out a lot of hot air, but that wouldn't be nice.

Crystal Carter:

No, and to be fair, it's steam, really-

Mordy Oberstein:

Steam, right. Barry is not an angry person, so he doesn't blow off a lot of steam like a geyser would. There we go. That's good. That's better. There we go. All right, thanks Barry. Moving on from people to people, our follow of the week this week is the one, the only from Lumar, Anne Berlin.

Crystal Carter:

Anne Berlin is fantastic. She's such a wealth of technical knowledge. She's really active in the Women in Tech SEO community as well. I did a webinar for Lumar with her a little while back, and it was really, really engaging. We were talking about technical SEO audits and how you can get into those and why they're really valuable. And she's somebody who understands that really, really well. And I think that in terms of Botlogs, bot traffic, etc., when you're doing your technical SEO audit, it should absolutely be a part of it.

And when you're using a tool like the Wix SEO bot traffic over time report, or even Lumar's tool, which gets into more detail, you'll learn different things. And one of the things we talked about during that session was how you need to adjust your settings. Lumar has some great detail that you can go into on how you adjust the settings for your crawl when you're doing your audit, to find out which things the bots are looking at, which things people are looking at, and which things you should be prioritizing. So shout out to Anne, shout out to the whole Lumar team for some great insights there.

Mordy Oberstein:

We got their app in the Wix App Market as well.

Crystal Carter:

Indeed.

Mordy Oberstein:

Indeed.

Crystal Carter:

Did the podcast compute?

Mordy Oberstein:

Yeah. Yeah, it computed.

Crystal Carter:

That's good. So we don't need to control, alt, delete the podcast.

Mordy Oberstein:

No, I'm kind of hoping we're moving to a world where bots become more like cyborgs because I don't know, cyborgs are more interesting. You never know what they're going to do. They're kind of unpredictable.

Crystal Carter:

I saw a TikTok of two ChatGPT-4s or something, like the app or something, and they were chatting to each other. They were like, "Hi, how can I help you?" And they were like, "Oh no, this is interesting. Oh, that's an interesting thing. I would like to know more about the latest topic or what you're interested in." And then one was like, "Oh yes, I'm interested in quantum computing." And then they had a long conversation about quantum computing. Their opening gambit wasn't like, "Oh, the weather..." It wasn't like, "Oh, let's talk about..." Because obviously robots aren't affected by the weather. But yeah, they jumped straight into quantum computing. It was like, "Oh my gosh, yes, quantum computing-"

Mordy Oberstein:

Course.

Crystal Carter:

"... Amazing. My favorite."

Mordy Oberstein:

And then they got into drinking urine and eating glue right afterwards. But did you see, by the way, I know this is off topic a little bit, and old news by the time this episode comes out, there's a social media platform where you create an avatar, like an AI avatar of yourself, and it talks to other AI avatars of other people?

Crystal Carter:

Twitter?

Mordy Oberstein:

Yeah. No, no. That's where you talk to real people who you wish were AI avatars. This is-

Crystal Carter:

I mean, if we're talking about bots, we got to talk about Twitter.

Mordy Oberstein:

No, this is like AI talking to AI, but it's social media, which I understand the point of social media is I interact with other people, but now I'm having my avatar interact on my behalf with other avatars. I think it's called Butterfly or something.

Crystal Carter:

Right?

Mordy Oberstein:

I'm not sure I'm just an old person and I don't get it, but I don't get it.

Crystal Carter:

It sounds to me a little bit like a Tamagotchi.

Mordy Oberstein:

That's exactly what it sounds like.

Crystal Carter:

You put your Tamagotchi in the Tamagotchi land and then they put their Tamagotchi in the Tamagotchi land, and then you just come back and see what happened-

Mordy Oberstein:

Mine would just die every time.

Crystal Carter:

Do you know what, actually? And maybe, if anyone makes this, I should get the rights for the IP because they definitely got it from me, but that would be a really interesting way to do a dating app, to upgrade a dating app. Basically, you give your avatar loads of personality points that are your personality points, they give their avatar loads of personality points that are their personality points, you put them in a metaverse, and then whoever your little bot happens to find or gravitate towards or whatever, that's your match.

Mordy Oberstein:

Yeah. AI should create dating bots because nothing will go wrong there. The divorce rate will not jump up. It'll be just fine. On that happy marital note, thanks for joining us on the SERP's Up Podcast. Are you going to miss us? Not to worry, we're back next week with a new episode as we dive into the gaps between those who optimize and those who search. Look for it wherever you consume your podcasts or on the Wix SEO Learning Hub over at wix.com/seo/learn. Looking to learn more about SEO? Check out all the great content that we have on the Wix SEO Learning Hub at, you guessed it, wix.com/seo/learn. Don't forget to give us a review on iTunes or a rating on Spotify. Until next time, peace, love, and SEO.
