Brace yourself: The benefit and shock of analyzing Googlebot crawl spikes via log files [Case Study]

I recently started helping a site that was negatively impacted by the May 17 algorithm update. The site had been surfing the gray area of quality for a long time, surging with some quality updates and dropping with others. So I started digging in with a crawl analysis and audit of the site.

Once I started analyzing the site, I noticed several strange spikes in pages crawled per day in the Crawl Stats report in Google Search Console (GSC). For example, Google would typically crawl about 3,000 pages per day, but the first two spikes jumped to nearly 20,000 pages each, and two more topped 11,000.
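The Crawl Stats report only shows aggregate totals, so your server's access logs are where you confirm a spike and see exactly what Googlebot hit. As a minimal sketch, here's how you could tally Googlebot requests per day from a standard combined-format access log. The log path is hypothetical, and note that matching "Googlebot" in the user agent is naive, since the string can be spoofed; for real analysis you'd verify the crawler via reverse DNS.

```python
import re
from collections import Counter
from datetime import datetime

# Hypothetical path; point this at your server's access log.
LOG_PATH = "access.log"

# In the combined log format, the timestamp sits inside [..],
# e.g. [17/May/2017:06:25:24 +0000]. Capture just the date portion.
DATE_RE = re.compile(r"\[(\d{2}/\w{3}/\d{4}):")

daily_hits = Counter()

with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        # Naive user-agent check; verify via reverse DNS for production use.
        if "Googlebot" not in line:
            continue
        match = DATE_RE.search(line)
        if match:
            day = datetime.strptime(match.group(1), "%d/%b/%Y").date()
            daily_hits[day] += 1

# Print per-day request counts in chronological order so spikes stand out.
for day, hits in sorted(daily_hits.items()):
    print(f"{day}\t{hits}")
```

From there, filtering the spike days down to the URLs requested (rather than just counting) would reveal which sections of the site Googlebot was hammering.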

Needless to say, I was interested in finding out why those spikes occurred. Were there technical SEO problems on the site? Was there an external factor causing the spikes? Or was this a Googlebot anomaly? I quickly reached out to my client about what I was seeing.

