Monday, December 23, 2024

Google Confirms 3 Ways To Make Googlebot Crawl More


Google’s Gary Illyes and Lizzi Sassman discussed three factors that trigger increased Googlebot crawling. While they downplayed the need for constant crawling, they acknowledged there are ways to encourage Googlebot to revisit a website.

1. Impact of High-Quality Content on Crawling Frequency

One of the things they talked about was the quality of a website. A lot of people suffer from the “discovered, not indexed” issue, and that’s sometimes caused by certain SEO practices that people have learned and believe are good practice. I’ve been doing SEO for 25 years and one thing that’s always stayed the same is that industry-defined best practices are generally years behind what Google is doing. Yet it’s hard to see what’s wrong if a person is convinced that they’re doing everything right.

Gary Illyes shared a reason for an elevated crawl frequency, explaining that one of the triggers for a high level of crawling is signals of high quality that Google’s algorithms detect.

Gary said it at the 4:42 minute mark:

“…generally if the content of a site is of high quality and it’s helpful and people like it in general, then Googlebot–well, Google–tends to crawl more from that site…”

There’s a lot of nuance missing from that statement, like what exactly are the signals of high quality and helpfulness that will trigger Google to crawl more frequently?

Well, Google never says. But we can speculate and the following are some of my educated guesses.

We know that there are patents about branded search that count branded searches made by users as implied links. Some people think that “implied links” are brand mentions, but “brand mentions” are absolutely not what the patent talks about.

Then there’s the Navboost patent that’s been around since 2004. Some people equate the Navboost patent with clicks, but if you read the actual patent from 2004 you’ll see that it never mentions click-through rates (CTR). It talks about user interaction signals. Clicks were a topic of intense research in the early 2000s, but if you read the research papers and the patents it’s easy to understand that it’s not as simple as “monkey clicks the website in the SERPs, Google ranks it higher, monkey gets banana.”

In general, I think that signals indicating people perceive a site as helpful can help a website rank better. And sometimes that means giving people what they expect to see.

Site owners will tell me that Google is ranking garbage, and when I take a look I can see what they mean: the sites are kind of garbagey. But on the other hand, the content is giving people what they want, because they don’t really know how to tell the difference between what they expect to see and actual good-quality content (I call that the Froot Loops algorithm).

What’s the Froot Loops algorithm? It’s an effect from Google’s reliance on user satisfaction signals to judge whether their search results are making users happy. Here’s what I previously published about Google’s Froot Loops algorithm:

“Ever walk down a supermarket cereal aisle and note how many sugar-laden kinds of cereal line the shelves? That’s user satisfaction in action. People expect to see sugar bomb cereals in their cereal aisle and supermarkets satisfy that user intent.

I often look at the Froot Loops on the cereal aisle and think, “Who eats that stuff?” Apparently, a lot of people do, that’s why the box is on the supermarket shelf – because people expect to see it there.

Google is doing the same thing as the supermarket. Google is showing the results that are most likely to satisfy users, just like that cereal aisle.”

An example of a garbagey site that satisfies users is a popular recipe site (that I won’t name) that publishes easy-to-cook recipes that are inauthentic and uses shortcuts like canned cream of mushroom soup as an ingredient. I’m fairly experienced in the kitchen, and those recipes make me cringe. But people I know love that site because they really don’t know better; they just want an easy recipe.

What the helpfulness conversation is really about is understanding the online audience and giving them what they want, which is different from giving them what they should want. Understanding what people want and giving it to them is, in my opinion, what searchers will find helpful and what rings Google’s helpfulness signal bells.

2. Increased Publishing Activity

Another thing Illyes and Sassman said could trigger Googlebot to crawl more is an increased frequency of publishing, like if a site suddenly increased the number of pages it publishes. But Illyes said that in the context of a hacked site that all of a sudden started publishing more web pages. A hacked site that’s publishing a lot of pages would cause Googlebot to crawl more.

If we zoom out and look at the forest instead of the trees, it’s pretty evident that he’s implying that an increase in publication activity may trigger an increase in crawl activity. It’s not the hack itself that’s causing Googlebot to crawl more; it’s the increase in publishing that’s causing it.

Here is where Gary cites a burst of publishing activity as a Googlebot trigger:

“…but it can also mean that, I don’t know, the site was hacked. And then there’s a bunch of new URLs that Googlebot gets excited about, and then it goes out and then it’s crawling like crazy.”

The takeaway there is that a lot of new pages makes Googlebot excited and it crawls the site “like crazy.”
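A practical aside before moving on: if a burst of new pages is legitimate rather than a hack, the simplest way to make those URLs easy to discover is keeping an XML sitemap up to date with lastmod dates. Here’s a minimal Python sketch of generating one; the URLs, dates, and file name are hypothetical examples, and in practice a CMS or sitemap plugin usually handles this for you.

    # A minimal sketch of a sitemap that surfaces newly published URLs to
    # crawlers. The URLs and dates below are hypothetical examples.
    from datetime import date
    from xml.sax.saxutils import escape

    # Hypothetical list of (URL, publish/update date) pairs from a CMS.
    pages = [
        ("https://www.example.com/new-article", date(2024, 12, 20)),
        ("https://www.example.com/updated-guide", date(2024, 12, 22)),
    ]

    entries = []
    for url, last_modified in pages:
        entries.append(
            "  <url>\n"
            f"    <loc>{escape(url)}</loc>\n"
            f"    <lastmod>{last_modified.isoformat()}</lastmod>\n"
            "  </url>"
        )

    sitemap = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "\n".join(entries)
        + "\n</urlset>\n"
    )

    # Write the sitemap where the web server can serve it.
    with open("sitemap.xml", "w", encoding="utf-8") as f:
        f.write(sitemap)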

3. Consistency Of Content Quality

Gary Illyes goes on to mention that Google may reconsider the overall site quality and that may cause a drop in crawl frequency.

Here’s what Gary said:

“…if we are not crawling much or we are gradually slowing down with crawling, that might be a sign of low-quality content or that we rethought the quality of the site.”

What does Gary mean when he says that Google “rethought the quality of the site”? My take is that the overall quality of a site can sometimes go down if there are parts of the site that aren’t up to the same standard as the original content. In my opinion, based on things I’ve seen over the years, at some point the low-quality content may begin to outweigh the good content and drag the rest of the site down with it.
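If you want to check whether Googlebot really is slowing down on your own site, the Crawl Stats report in Search Console is the obvious place to look, but raw server logs work too. Below is a rough Python sketch, not anything Google provides, that counts Googlebot requests per day in a standard combined-format access log. The log path is an assumption, and a stricter version would verify Googlebot by reverse DNS rather than trusting the user-agent string.

    # Rough sketch: tally daily Googlebot hits from a combined-format access
    # log to spot a gradual crawl slowdown. Log path and format are assumptions.
    import re
    from collections import Counter
    from datetime import datetime

    LOG_PATH = "access.log"  # hypothetical path to the server's access log

    # Capture the dd/Mon/yyyy part of timestamps like [23/Dec/2024:10:15:32 +0000]
    date_pattern = re.compile(r"\[(\d{2}/\w{3}/\d{4}):")

    hits_per_day = Counter()
    with open(LOG_PATH, encoding="utf-8", errors="ignore") as log:
        for line in log:
            if "Googlebot" in line:
                match = date_pattern.search(line)
                if match:
                    hits_per_day[match.group(1)] += 1

    # Print the daily counts in chronological order.
    for day in sorted(hits_per_day, key=lambda d: datetime.strptime(d, "%d/%b/%Y")):
        print(f"{day}: {hits_per_day[day]} Googlebot requests")

What you’d be looking for is the kind of gradual, sustained slowdown Gary describes, not day-to-day noise.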

When people come to me saying that they have a “content cannibalism” issue and I take a look at it, what they’re really suffering from is a low-quality content issue in another part of the site.

Lizzi Sassman goes on to ask, at around the 6-minute mark, if there’s an impact when website content is static, neither improving nor getting worse, but simply not changing. Gary resisted giving a firm answer, saying that Googlebot returns to check on the site to see if it has changed, and that Googlebot might “probably” slow down the crawling if there are no changes, but he qualified that statement by saying that he didn’t know.

Something that went unsaid but is related to consistency of content quality is that sometimes the topic itself changes, and if the content is static it may lose relevance and begin to lose rankings. So it’s a good idea to do a regular content audit to see if the topic has shifted and, if so, to update the content so that it continues to be relevant to users, readers, and consumers when they have conversations about the topic.
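A content audit doesn’t need special tooling to get started. The short Python sketch below assumes a hypothetical CSV export of URLs and last-reviewed dates (in ISO format) from a CMS or audit spreadsheet, and simply flags pages that haven’t been looked at in a year as candidates for a topical refresh.

    # Simple freshness check for a content audit. Assumes a hypothetical CSV
    # export with "url" and "last_reviewed" (YYYY-MM-DD) columns.
    import csv
    from datetime import date, timedelta

    AUDIT_FILE = "content_inventory.csv"  # hypothetical export from a CMS or spreadsheet
    STALE_AFTER = timedelta(days=365)

    today = date.today()
    with open(AUDIT_FILE, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            last_reviewed = date.fromisoformat(row["last_reviewed"])
            if today - last_reviewed > STALE_AFTER:
                print(f"Review for topical drift: {row['url']} (last reviewed {last_reviewed})")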

Three Ways To Improve Relations With Googlebot

As Gary and Lizzi made clear, it’s not really about poking Googlebot to get it to come around just for the sake of getting it to crawl. The point is to think about your content and its relationship to the users.

1. Is the content high quality?
Does the content address a topic or does it address a keyword? Sites that use a keyword-based content strategy are the ones I saw suffering in the 2024 core algorithm updates. Strategies that are based on topics tend to produce better content, and those sites sailed through the algorithm updates.

2. Increased Publishing Activity
An increase in publishing activity can cause Googlebot to come around more often. Regardless of whether it’s because a site was hacked or because a site is putting more vigor into its content publishing strategy, a regular content publishing schedule is a good thing and has always been a good thing. There is no “set it and forget it” when it comes to content publishing.

3. Consistency Of Content Quality
Content quality, topicality, and relevance to users over time are important considerations and will help ensure that Googlebot continues to come around to say hello. A drop in any of those factors (quality, topicality, and relevance) could affect Googlebot crawling, which is itself a symptom of the more important factor: how Google’s algorithm regards the content.

Listen to the Google Search Off The Record podcast beginning at about the 4-minute mark.

Featured Image by Shutterstock/Cast Of Thousands
