MyBB Community Forums

Full Version: Bot User Group
Pages: 1 2
I want to keep bots like GoogleBot out of a forum because it contains trash that I don't want to delete (for statistics reasons) but also don't want visible (for AdSense reasons). Which usergroup do the bots belong to, so I can set this up? I've already denied view permissions for Guests and Registered; will that be enough?
If guests are denied, bots will be denied too.
Hi

Create a new group but use this as the Username Style:
Quote:<br><{username}><br>
Or just
Quote:<{username}>
To hide it completely

This won't display the bots, as there will only be an empty line in place of the bot's name. I will search for another way tonight.

Regards
Malietjie
(2010-10-21, 04:25 PM)AJS Wrote: [ -> ]If guests are denied, bots will be denied too.

Thanks, this is what I needed to know.
By default Bots belong to the Guests / Unregistered group.
IMHO it's best to create a bots group for the greatest degree of control. You can then set permissions for the group without worrying about guests or registered members.
In this particular case it doesn't matter to me as only staff should be allowed to see this forum, but I'll keep it in mind.
(2010-10-22, 03:17 AM)labrocca Wrote: [ -> ]IMHO it's best to create a bots group for the greatest degree of control. You can then set permissions for the group without worrying about guests or registered members.

How do you make a bots group that actually gets the bots assigned to it? I haven't looked into the spider/bot code, so I don't know whether a plugin can assign a usergroup at the point a visitor is identified as a spider/bot.
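The core of what such a plugin would do is just user-agent matching. Here's a minimal sketch of that logic in Python, not MyBB's actual PHP API; the group IDs and bot signatures are placeholders (MyBB's default guest group is gid 1, but the "Bots" group ID would be whatever your custom group gets):

```python
# Sketch: map a visitor's User-Agent string to a usergroup ID.
# Group IDs and signature list are illustrative, not MyBB defaults.
BOT_GROUP_ID = 8      # hypothetical custom "Bots" group
GUEST_GROUP_ID = 1    # MyBB's default Guests group

BOT_SIGNATURES = ("googlebot", "bingbot", "slurp", "duckduckbot")

def usergroup_for(user_agent: str) -> int:
    """Return the usergroup ID a visitor should be treated as,
    based on a case-insensitive substring match of the User-Agent."""
    ua = user_agent.lower()
    if any(sig in ua for sig in BOT_SIGNATURES):
        return BOT_GROUP_ID
    return GUEST_GROUP_ID
```

A real MyBB plugin would hook into session startup and do this same check there, overriding the session's usergroup when a known spider signature matches.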
You could also try a robots.txt file, and put this in it:

User-agent: *
Disallow: /
(2010-10-22, 05:20 PM)Dutchcoffee Wrote: [ -> ]You could also try a robots.txt file, and put this in it:

User-agent: *
Disallow: /

That doesn't work so well if I want to let bots into some forums but not others, and on top of that, some bots ignore robots.txt.
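For what it's worth, per-forum rules are possible for crawlers that do honour robots.txt, by disallowing that forum's URL prefix. A sketch, assuming the forum to hide has fid=5 and the default MyBB URL scheme:

```
User-agent: *
Disallow: /forumdisplay.php?fid=5
```

But thread URLs (showthread.php?tid=...) don't include the forum ID, so even well-behaved crawlers can still reach the threads through direct links, which is another reason usergroup permissions are the more reliable fix.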