I read on a Dutch forum that search spiders like Googlebot use a lot of bandwidth because they crawl the whole forum word by word. Is this true, and what can I do about it?
They do crawl your website/forum pages, but if you have a new or less popular site (in terms of search engine ranking), the bandwidth usage won't be very high. Check your analytics logs to see which host/user agent is using the most bandwidth.
Or just block Googlebot entirely:
http://www.google.com/support/webmasters...swer=40364
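If you do want to go that route, the usual way is a rule in your robots.txt (this is just a sketch; it blocks Googlebot from the entire site, which also removes you from Google's index):

```
# Block only Googlebot from crawling anything on the site
User-agent: Googlebot
Disallow: /
```

Note that robots.txt is advisory: well-behaved crawlers like Googlebot honor it, but it doesn't technically prevent access.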
The best way to reduce bandwidth usage is to block your /images directory in your robots.txt.
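For example, a minimal robots.txt doing just that might look like this (assuming your images really live under /images — adjust the path to your forum's setup):

```
# Let all crawlers index pages, but skip the image directory
User-agent: *
Disallow: /images/
```

Images are often the heaviest files a bot downloads, so this alone can cut crawler bandwidth noticeably without hurting your ranking for text content.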
Also have a look at MyBB's robots.txt. They've blocked pages that bots can't use anyway and that would just waste bandwidth if crawled.
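As a rough illustration (these paths are hypothetical examples of the kinds of pages a forum might exclude, not a copy of MyBB's actual file), such a robots.txt could look like:

```
# Keep crawlers out of pages that have no indexable content
User-agent: *
Disallow: /search.php
Disallow: /memberlist.php
Disallow: /calendar.php
Disallow: /printthread.php
```

Search forms, member lists, and printer-friendly duplicates generate lots of requests but add nothing useful to a search index.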
Where do I find this robots.txt file? It isn't in my directory.
I only want the robots to read the topic titles, NOT what is written in the topics.
Is that possible?
It is, but Google might consider that cloaking and ban your site from the index.
Sorry to be slightly off-topic, but what does this "cloaking" mean? Thanks.