MyBB Community Forums

Full Version: robots.txt problem
Hi all
I have a WordPress site in the root domain. Now I've added a forum in a subfolder, mydomain/forum, and the Google SEO plugin created its sitemap at mydomain/forum/sitemap_index.xml. After submitting that sitemap to Google, it seems Google can't access the sub-sitemaps; it reports "Url blocked by robots.txt" for these values: mydomain/forum/sitemap-forums.xml?page=1 and mydomain/forum/sitemap-index.xml?page=1.

This is my robots.txt:

User-agent: *
Disallow: /cgi-bin
Disallow: /wp-admin
Disallow: /wp-includes
Disallow: /wp-content/plugins
Disallow: /wp-content/cache
Disallow: /wp-content/themes
Disallow: /trackback
Disallow: /feed
Disallow: /comments
Disallow: /category/*/*
Disallow: */trackback
Disallow: */feed
Disallow: */comments
Disallow: /*?*
Disallow: /*?
Allow: /wp-content/uploads


# Google Image
User-agent: Googlebot-Image
Disallow:
Allow: /*

Sitemap: mydomain/sitemap_index.xml
Sitemap: mydomain/forum/sitemap_index.xml

What should I add to robots.txt? Any help would be greatly appreciated. Thanks in advance.
Try adding the following to your robots.txt

User-agent: *
Allow: /forum/*
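
A note on why this should work (my reasoning from Google's documented longest-match behavior, not something stated in the thread): when several patterns match a URL, Googlebot applies the most specific, i.e. longest, one. Merged into the existing group, the rules would read:

User-agent: *
Disallow: /*?*
# /forum/* (8 characters) is a longer match than /*?* (4 characters),
# so for mydomain/forum/sitemap-forums.xml?page=1 the Allow rule wins
Allow: /forum/*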
(2013-08-12, 07:29 AM)ksr Wrote: Try adding the following to your robots.txt

User-agent: *
Allow: /forum/*
Thanks.
But it still gives me the same error when I test the sitemap in Webmaster Tools.
Are the freshly added rules showing in robots.txt in your Webmaster Tools account? Google takes some time to re-download and re-index the robots.txt file. Give it a while to fetch the new version.
Disallowing *?* is asking for trouble: your sitemap URLs end in ?page=1, so that rule is exactly what blocks them.
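
As an illustration (a sketch only; the patterns below are hypothetical stand-ins for whatever query URLs actually need hiding, not something suggested in this thread), the blanket query-string ban could be replaced with targeted rules:

User-agent: *
# instead of the blanket rules
#   Disallow: /*?*
#   Disallow: /*?
# disallow only specific WordPress query URLs, for example:
Disallow: /?s=
Disallow: /*?replytocom=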
Thanks.
It seems, as ksr suggests, I have to wait.
I commented out the line frostschutz mentioned, but when I test the sitemap I still get similar errors.
So, I wait.
There's a robots.txt tester in Google Webmaster Tools itself, and it tells you exactly which rule allows or disallows a given URL. If you have problems with robots.txt, that's where you should go, and test the URLs you want allowed or disallowed extensively.
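
Putting the thread's advice together, a minimal sketch of the relevant rules (assuming Google's longest-match precedence and an https site; verify each URL in the tester before relying on it):

User-agent: *
Disallow: /*?*
Allow: /forum/*

# Sitemap lines must be absolute URLs, scheme included; the scheme-less
# mydomain/... form quoted earlier would not be valid in a real file.
Sitemap: https://mydomain/sitemap_index.xml
Sitemap: https://mydomain/forum/sitemap_index.xml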