Host load is a factor in Googlebot’s crawl scheduling that describes the strain placed on a web server by incoming crawl requests. Googlebot monitors response latency, connection timeouts, and HTTP error rates during a crawl session and automatically reduces its crawl rate if the server appears to be struggling — a behaviour sometimes referred to as polite crawling. High host load can result from underpowered hosting, database bottlenecks on dynamically generated pages, or insufficient caching, and the knock-on effect is that Googlebot fetches fewer pages per day, effectively reducing how much of the site’s crawl budget is actually used. Conversely, a well-optimised server that responds quickly and consistently can sustain a higher crawl rate. Server-side logging of Googlebot requests is the most direct way to observe how host load is affecting crawl throughput in practice.
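As a rough illustration of that kind of log analysis, the sketch below parses access-log lines and summarises Googlebot activity per day: request count, 5xx error count, and mean response time. It assumes an nginx-style combined log format with a trailing `$request_time` field appended; the exact field layout, and the reliance on the user-agent string alone (real audits should also verify Googlebot by reverse DNS), are simplifying assumptions.

```python
import re
from collections import defaultdict

# Assumed log format: combined format plus a trailing $request_time field.
# Example line layout (hypothetical server configuration):
#   <ip> - - [12/Mar/2024:10:00:00 +0000] "GET /a HTTP/1.1" 200 512 "-" "<agent>" 0.100
LINE_RE = re.compile(
    r'\S+ \S+ \S+ \[(?P<day>[^:]+):[^\]]+\] '       # remote addr, ident, user, timestamp
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) \S+ '  # request line, status, bytes sent
    r'"[^"]*" "(?P<agent>[^"]*)" (?P<rtime>[\d.]+)' # referer, user-agent, request time
)

def summarise(lines):
    """Per-day Googlebot hits, 5xx errors, and mean latency from access-log lines."""
    days = defaultdict(lambda: {"hits": 0, "errors": 0, "latency": 0.0})
    for line in lines:
        m = LINE_RE.match(line)
        # Skip unparseable lines and non-Googlebot traffic.
        if not m or "Googlebot" not in m.group("agent"):
            continue
        d = days[m.group("day")]
        d["hits"] += 1
        d["latency"] += float(m.group("rtime"))
        if m.group("status").startswith("5"):
            d["errors"] += 1
    return {
        day: {
            "hits": s["hits"],
            "errors": s["errors"],
            "avg_latency_s": s["latency"] / s["hits"],
        }
        for day, s in days.items()
    }
```

A rising `avg_latency_s` or `errors` count alongside a falling daily `hits` figure is the pattern that suggests host load is throttling the crawl.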