Google's John Mueller said that because the robots.txt file is cached by Google for about 24 hours, it doesn't make much sense to dynamically update your robots.txt file throughout the day to manage crawling.
Google will not necessarily see that you don't want it to crawl a page at 7am but then do want it to crawl that page at 9am.
John Mueller wrote on Bluesky in response to this post:
QUESTION:
One of our technicians asked if they could upload a robots.txt file in the morning to block Googlebot and another one in the afternoon to allow it to crawl, since the website is large and they thought it might overload the server. Do you think this would be a good practice?
(Obviously, the crawl rate of Googlebot adapts to how well the server responds, but I found it an interesting question to ask you.) Thanks!
ANSWER:
It's a bad idea because robots.txt can be cached up to 24 hours ( developers.google.com/search/docs/… ). We don't recommend dynamically changing your robots.txt file like this over the course of a day. Use 503/429 when crawling is too much instead.
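If you are wondering what that looks like in practice, here is a minimal sketch of the 503 approach Mueller describes: instead of swapping robots.txt files, the server answers with a temporary error when it is under heavy load. This assumes a Flask app, and the `is_overloaded()` check is a hypothetical placeholder for whatever load metric you actually use.

```python
# Minimal sketch: return 503 with a Retry-After header when the server
# is overloaded, rather than changing robots.txt during the day.
# Assumes Flask; is_overloaded() is a hypothetical stand-in for your
# own load check (CPU, queue depth, etc.).
from flask import Flask, Response

app = Flask(__name__)

def is_overloaded() -> bool:
    """Hypothetical load check; replace with a real metric."""
    return False

@app.before_request
def shed_load():
    if is_overloaded():
        # 503 tells Googlebot to back off and retry later, without
        # signaling that the page is blocked or permanently gone.
        return Response(
            "Service temporarily unavailable",
            status=503,
            headers={"Retry-After": "3600"},
        )
```

The advantage over a rotating robots.txt is that a 503 (or 429) is evaluated per request, so crawling slows down exactly when the server is struggling and resumes automatically, with no dependency on when Google last cached your robots.txt.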
This isn't new information; we covered this a decade ago in Google: Don't Make A Dynamically Generated robots.txt. We also knew about the 24-hour caching back in 2010.
Forum discussion at Bluesky.