Some websites, hosted on some CDNs (content delivery networks), are seeing a huge spike in server response times for crawling, while seeing a drop in total crawl requests. So technically, crawling has dropped but Google is taking much longer to crawl a lot less. Apparently, this started earlier this month and is still an issue for some.
This was spotted by Gianna Brachetti-Truskawa, who posted more about this on both LinkedIn and Bluesky. She wrote:
Have you seen a recent drop in new users, and/or found that Google’s crawl rate has dropped on your site while server response times seem to be higher than usual?
Google have quietly updated their list of IP ranges used for crawling (as of 04.02.2025). If your website is delivered via a CDN, their WAF protecting your site from DDoS attacks might have Googlebot run into rate limiting or be blocked now – unless they updated their allowed IP ranges accordingly.
This did not affect every CDN; in fact, Cloudflare handled it fine, she said. But not all CDNs handled it. “Luckily, Cloudflare seems to be on top of it! But we found reports of some websites delivered via other CDNs, including larger ones like Akamai Technologies, who ran into the issue, suggesting that their CDN providers might not have updated their IP ranges for Googlebot yet,” she wrote.
Here is a chart from a Google Webmaster Help forum thread showing the issue. You can check your crawl stats in Search Console over here:
Back in 2021, Google started publishing its Googlebot IP list and I covered some of the times Google updated that IP list (then I stopped, it wasn’t exciting – until now).
John Mueller from Google replied to the concerns on Bluesky, basically explaining that there is a JSON file to track these changes and that the crawling will settle down over time. He wrote:
We push the IP json files automatically — changes happen now and then. If you want to alert internally on these files, feel free to poll them. I checked the last three updates, they were each 2x IP blocks added (ipv6/v4). It’s also not a complete revamp.
It’s hard to know how the web will react to subtle infrastructure shifts, which is part of why we’ve been publishing these IP ranges automatically. Hopefully it was just a temporary blip!
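If you want to act on John’s suggestion and alert internally on these files, here is a minimal sketch of what that polling could look like. It assumes Python, the googlebot.json URL Google documents for this list, and a local cache file whose name is my own; the field names match the file’s current structure, where each entry carries either an ipv4Prefix or an ipv6Prefix key:

```python
# A minimal sketch: poll Google's published Googlebot ranges and report
# which CIDR prefixes were added or removed since the last check.
import json
import urllib.request
from pathlib import Path

# The URL Google documents for this list (assumption: unchanged since publication).
GOOGLEBOT_URL = "https://developers.google.com/static/search/apis/ipranges/googlebot.json"
CACHE = Path("googlebot-ipranges.json")  # local cache file, name is my own

def fetch_prefixes(url: str) -> set[str]:
    """Download the JSON file and flatten it into a set of CIDR strings."""
    with urllib.request.urlopen(url, timeout=30) as resp:
        data = json.load(resp)
    # Each entry in "prefixes" holds either an "ipv4Prefix" or an "ipv6Prefix" key.
    return {p.get("ipv4Prefix") or p.get("ipv6Prefix") for p in data["prefixes"]}

def main() -> None:
    current = fetch_prefixes(GOOGLEBOT_URL)
    previous = set(json.loads(CACHE.read_text())) if CACHE.exists() else set()
    added, removed = current - previous, previous - current
    if added or removed:
        # Hook your internal alerting (email, Slack, whatever you use) in here.
        print(f"Googlebot ranges changed: +{sorted(added)} / -{sorted(removed)}")
    CACHE.write_text(json.dumps(sorted(current)))

if __name__ == "__main__":
    main()
```

Run something like this on a schedule (a cron job, say) and you get the kind of internal alert John describes, without watching the file by hand.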
I still track these changes and generally, the changes aren’t that frequent and are often pretty minor relative to the overall size of the document. But changes are changes – here are some of the more recent changes that I tracked:
You can see the JSON file here.
Gianna Brachetti-Truskawa shared some tips on what you can do if you are impacted – she wrote:
- Check with your CDN provider if they’ve updated their IP ranges for Googlebot. You can ask them to verify using Google’s JSON file [see the sketch after this list]. If not, consider switching to a provider that keeps up with these changes.
- Consider monitoring changes yourself, or find snapshots of the file in the Wayback Machine. You can also save snapshots there on demand yourself (I would not suggest relying on infrastructure you don’t own, but it is one easy way!) and then compare the two files with your favorite method (e.g. using Testomato or Little Warden – or a Compare plugin in Notepad++ if you’re feeling old-school).
- Find more advice about CDNs in the comments.
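Her first tip, asking your CDN to verify against Google’s JSON file, is something you can also sanity-check yourself from your server logs. Here is a minimal sketch along those lines, repeating the same fetch as the polling sketch above so it stands alone, that tests whether an IP your WAF has been rate limiting actually falls inside the published Googlebot ranges:

```python
# A minimal sketch: check whether an IP seen in your server logs falls
# inside Google's published Googlebot ranges, standard library only.
import ipaddress
import json
import urllib.request

GOOGLEBOT_URL = "https://developers.google.com/static/search/apis/ipranges/googlebot.json"

def is_googlebot_ip(ip: str, prefixes: set[str]) -> bool:
    """Return True if the address sits inside any published Googlebot CIDR."""
    addr = ipaddress.ip_address(ip)
    # Membership checks across IP versions simply return False, so a mixed
    # set of ipv4 and ipv6 prefixes is fine here.
    return any(addr in ipaddress.ip_network(p) for p in prefixes)

# Flatten the published JSON into a set of CIDR strings.
with urllib.request.urlopen(GOOGLEBOT_URL, timeout=30) as resp:
    prefixes = {p.get("ipv4Prefix") or p.get("ipv6Prefix")
                for p in json.load(resp)["prefixes"]}

# Example: an address in a long-standing Googlebot range should print True.
print(is_googlebot_ip("66.249.66.1", prefixes))
```

This only checks the published ranges – a quick sanity check for a WAF allowlist, not a full bot-verification setup.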
Do you want me to cover the changes to this JSON file going forward? Would it be helpful to you?
Forum discussion at LinkedIn and Bluesky.
Update: There is now also a WebmasterWorld thread complaining about the same thing – here is a similar chart from there: