MyBB Community Forums

Full Version: GoogleBots crushing my site?
So, my host believes Googlebot is maxing out the memory on my server (I have a decent package too). Here is what they wrote: "I see that Google bot hit your site a few thousand times in the last couple of hours. It's likely that the crawling caused your massive load. I would recommend setting up Webmaster Tools to delay the amount of crawling that Google bots will do on your websites."

Has anyone experienced this? Any way to limit the amount of crawling per hour?

Thank you
Get another host ASAP.
If it can't even handle a Google bot, it'll never handle visitors.
This is what I have in the package:

60 GB Storage
1 TB Monthly Transfer
768 MB Guaranteed RAM
1.5 GB Burst RAM
Hardware RAID 10
http://google.com/webmasters

You can limit Google bot's crawl rate there.
You can also consider using a robots.txt file to block Google from crawling unnecessary pages (an example robots.txt is provided in the Google SEO plugin package).
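For illustration, a minimal robots.txt along those lines might look like the sketch below. This is an assumption about which MyBB pages are worth excluding (dynamic, low-value-for-search scripts); check the example file shipped with the Google SEO plugin for the recommended list, and adjust the paths to match your forum's directory.

```
User-agent: *
# Block dynamic pages that waste crawl budget (assumed MyBB defaults)
Disallow: /search.php
Disallow: /calendar.php
Disallow: /misc.php
Disallow: /printthread.php
Disallow: /online.php
Disallow: /ratethread.php
Disallow: /usercp.php
Disallow: /private.php
```

Place the file in the web root (e.g. yourdomain.com/robots.txt) so crawlers can find it; note that robots.txt only reduces what is crawled, it does not throttle how fast Googlebot requests the remaining pages.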
Thanks all, this should help.