A very common issue (affecting about 1 in 50 users) is that some customers use WordPress plugins that load remote content into their blogs. This creates roughly 20,000 to 30,000 pages. The problem is that these blogs then get crawled to death by search engine bots. Each page access is one PHP and MySQL execution, and this happens 20,000 to 30,000 times within a few minutes.
Does anyone have an idea how to handle this as a host? Just suspending the site seems unfair. After all, the customer has done nothing illegal, and he hasn't overloaded the server himself either. Rather, the crawlers are doing a ****** job by massively overloading the server. Is there some way to limit the crawlers' access rate or something?
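One angle worth trying is rate-limiting at the web server level rather than suspending the account. A `Crawl-delay` directive in robots.txt is honored by Bing and Yandex (Googlebot ignores it; Google's crawl rate is adjusted in Search Console instead). For bots that don't cooperate, a server-side throttle works regardless. Below is a rough sketch for nginx, assuming it sits in front of the PHP sites; the bot names in the regex are just common examples and would need tuning to whatever shows up in the access logs:

```nginx
# Key requests by client IP only when the User-Agent looks like a known
# crawler; everyone else gets an empty key, which limit_req skips entirely.
map $http_user_agent $bot_limit_key {
    default                                   "";
    ~*(googlebot|bingbot|yandexbot|baiduspider) $binary_remote_addr;
}

# Allow each crawler IP roughly 1 request/second (10 MB zone holds
# state for many IPs).
limit_req_zone $bot_limit_key zone=crawlers:10m rate=1r/s;

server {
    listen 80;
    server_name example.com;  # hypothetical customer domain

    location / {
        # Small burst absorbed, anything beyond gets a 503 back to the bot;
        # well-behaved crawlers slow down when they see errors.
        limit_req zone=crawlers burst=10;
        # ... normal PHP handling here ...
    }
}
```

This doesn't touch normal visitors at all, and crawlers that overrun the limit just get error responses instead of hammering PHP and MySQL. On Apache-based shared hosting, mod_qos or similar modules can achieve the same effect.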