Google Slows Crawling With 5xx Server Errors

When a website starts serving Googlebot 5xx server errors, how does Googlebot handle the change in server status?  The question came up in a discussion about Google crawl errors, and John Mueller from Google had some interesting comments about 5xx server errors and how they impact Googlebot.

In short, Googlebot will slow crawling once it begins to see 5xx server errors being returned.

For many sites, server errors tend to last for only a short period of time, such as a host being down or a CMS update being pushed, and there would not be any long-term issues.  This happens to many websites from time to time.

For other sites, server errors can be a chronic issue and a sign something is wrong with the site.  Sometimes a site is serving 5xx server errors only to Googlebot, due to a misconfigured site firewall, for example.  Or a site blocks traffic from certain countries, including ones Googlebot can crawl from, due to errors implementing geotargeting.
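One quick way to spot that kind of Googlebot-only error is to request the same URL with a normal browser user-agent and with a Googlebot user-agent and compare the status codes. The sketch below is a minimal, assumed approach using only Python's standard library; note that anyone can spoof the Googlebot user-agent string, so a firewall that verifies crawlers by reverse DNS may still treat a real Googlebot differently, and this check is only a first approximation.

```python
# Hedged sketch: compare the HTTP status a site returns to a browser
# user-agent versus a Googlebot user-agent. A mismatch (e.g. 200 vs 503)
# suggests a firewall or bot-protection rule is blocking Googlebot.
import urllib.request
import urllib.error

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

def fetch_status(url: str, user_agent: str) -> int:
    """Return the HTTP status code for `url` using the given user-agent."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code  # 4xx/5xx responses raise HTTPError; the code is here

def googlebot_specific_5xx(browser_status: int, googlebot_status: int) -> bool:
    """True when only the Googlebot request gets a 5xx server error."""
    return 500 <= googlebot_status <= 599 and not 500 <= browser_status <= 599

# Example usage (replace with your own URL):
# url = "https://example.com/"
# b = fetch_status(url, BROWSER_UA)
# g = fetch_status(url, GOOGLEBOT_UA)
# if googlebot_specific_5xx(b, g):
#     print(f"Possible firewall misconfiguration: browser={b}, Googlebot={g}")
```

If the two requests come back with different status codes, the firewall or bot-protection layer is the first place to look.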

Regardless, if Search Console shows server errors, you should investigate them to make sure it was only a temporary problem and not a bigger issue. If Search Console continues to show errors, you should do some fetch and renders, as that will show exactly what Googlebot is seeing, and it can help you investigate the errors if there doesn't seem to be a problem when you visit the site yourself.

Unless the server errors are a long-term issue, Google should continue to visit the site to see if the errors have been resolved, then return to crawling as usual.  The crawl rate should go back to what the site usually sees once Googlebot is no longer seeing the errors.
