Crawl rate limit refers to the maximum speed at which Googlebot fetches pages from a site, measured in requests per second or the number of parallel connections. Google's crawl scheduler calibrates this rate automatically from the server's response times and error rates: if a site responds slowly or returns errors, Googlebot backs off to avoid overloading the host. Site owners could previously request a lower crawl rate through Google Search Console's crawl rate settings, but Google removed that manual control in January 2024, stating that its automatic systems had become accurate enough to make manual overrides unnecessary. Crawl rate is distinct from crawl budget: the rate caps how fast Google may fetch, while crawl budget, which combines this capacity limit with crawl demand, governs how many pages from a site Google crawls over time. Consistently serving fast, reliable responses remains the most effective way to encourage a higher crawl rate without any configuration.
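
Google's documentation indicates that sustained 500, 503, or 429 responses cause Googlebot to slow its crawling, so a server can signal overload explicitly rather than degrading into slow responses. The sketch below, using only the Python standard library, shows the idea: answer with `503` plus a `Retry-After` header when load is high. The `load_is_high()` check and its `MAX_LOAD` threshold are hypothetical placeholders, not anything Google prescribes; a real deployment would substitute whatever load metric it tracks.

```python
# Minimal WSGI sketch: signal crawler back-off with 503 + Retry-After
# when the host is overloaded, instead of serving slow 200 responses.
import os
from wsgiref.simple_server import make_server

MAX_LOAD = 4.0  # hypothetical 1-minute load-average threshold


def load_is_high() -> bool:
    # os.getloadavg() is POSIX-only; swap in any overload metric
    # (queue depth, CPU, response latency) that fits the deployment.
    try:
        return os.getloadavg()[0] > MAX_LOAD
    except OSError:
        return False


def app(environ, start_response):
    if load_is_high():
        # 503 with Retry-After tells well-behaved crawlers, Googlebot
        # included, to reduce their request rate and return later.
        start_response(
            "503 Service Unavailable",
            [("Content-Type", "text/plain"), ("Retry-After", "120")],
        )
        return [b"Temporarily overloaded; please retry later.\n"]
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"Page content\n"]


if __name__ == "__main__":
    with make_server("", 8000, app) as server:
        server.serve_forever()
```

Returning a fast `503` is preferable to letting response times climb: both cause Googlebot to back off, but the explicit status code recovers crawl rate sooner once load subsides, whereas prolonged `503`s (on the order of days) can eventually lead to URLs being dropped, so it should remain a temporary signal.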