Well, for now, what I have done is:
installed the plugin and
activated:
404, Meta and Sitemap
and deactivated
Redirect and URL,
and I have enabled MyBB friendly URLs and uploaded robots.txt.
Because of this I did not have to change functions.php (the core file). (You have already provided the edited version, but I am still unwilling to edit a core file at the moment.)
Will this give a boost to my board, or do I also have to enable the URL setting?
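For reference, a robots.txt for a MyBB forum sitting in the web root might look roughly like this. This is a hypothetical sketch, not the plugin's official file, and the paths are assumptions; adjust them to your own setup:

```
User-agent: *
Disallow: /archive/
Disallow: /printthread.php
Disallow: /search.php
```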
At what URL can I find my Google sitemap after I install this plugin?
my forum is in the root folder ( / )
So it would be www.mydomain.com/???
thanks
I have done everything right, but Google Webmaster Tools first showed the sitemap as correct and is now showing an error. Please tell me why.
You can change the code in the Google SEO language file. Replace the Google 404 widget code with the custom search code. (Not sure if the Google 404 widget actually has a custom search option).
Hello,
I installed Google SEO and everything was fine: no 404 errors, and I updated my .htaccess and functions.php...
But now I have a problem with the sitemap:
Thanks for the help, and sorry for my English.
Most of these are dated 10 October; maybe a temporary problem with the site?
Try Labs -> Fetch as GoogleBot (Labos -> Analyser comme Googlebot) and give it a couple of these URLs and see what you get... as long as it gives a positive result it's okay.
In general, 404 errors are to be expected for:
- threads that are linked to, but guests have no permission to actually read
- action=nextnewest if it's currently the newest thread
- action=nextoldest for the oldest thread
(that's why I think nextnewest/nextoldest are rather pointless to Google)
- threads that got moderated or deleted in the meantime
- page=x if a post got deleted so the page does not exist anymore
etc.
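A rough way to triage a crawl-error export along these lines could be a small script like the one below. It is purely illustrative: the query-string patterns are assumptions about typical MyBB URLs, not an exhaustive or official list.

```python
# Query-string hints that usually explain a benign 404
# (illustrative guesses for a MyBB forum, not an exhaustive list).
EXPECTED_404_HINTS = (
    "action=nextnewest",   # the newest thread has no "next newest"
    "action=nextoldest",   # the oldest thread has no "next oldest"
)

def looks_like_expected_404(url: str) -> bool:
    """Return True if the URL matches a pattern that commonly 404s."""
    return any(hint in url for hint in EXPECTED_404_HINTS)

for url in (
    "showthread.php?tid=42&action=nextnewest",
    "showthread.php?tid=42",
):
    print(url, "->", looks_like_expected_404(url))
```

URLs the script does not flag (deleted threads, permission-restricted threads, removed pages) would still need a manual look, e.g. with Fetch as GoogleBot.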
Thank you for the answer; I will try this solution and give feedback.
(2010-10-18, 02:35 PM)frostschutz Wrote: Most of these are dated 10 October; maybe a temporary problem with the site?
Try Labs -> Fetch as GoogleBot (Labos -> Analyser comme Googlebot) and give it a couple of these URLs and see what you get... as long as it gives a positive result it's okay.
How would we prevent Google from even crawling there? Doesn't that look bad on our part? It seems to me Google is disregarding the robots.txt file =/
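One option for keeping Googlebot away from the nextnewest/nextoldest URLs specifically: Google's robots.txt parser understands `*` wildcards (many other crawlers do not), so rules along these lines could work. This is a sketch assuming the forum sits in the web root:

```
User-agent: Googlebot
Disallow: /*action=nextnewest
Disallow: /*action=nextoldest
```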
Google Webmaster Tools also has a robots.txt test somewhere, so you can check whether specific URLs are allowed or disallowed. If you're not sure about robots.txt changes you've made, testing a couple of URLs is always a good idea to confirm it accepts only the URLs you want.
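For a quick local check without Webmaster Tools, Python's standard `urllib.robotparser` can test URLs against a set of rules. A sketch with made-up rules follows; note that this parser only handles plain prefix rules and does not implement Googlebot's `*` wildcard extension:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules (plain prefix rules only --
# urllib.robotparser does not support Googlebot-style wildcards).
rules = [
    "User-agent: *",
    "Disallow: /archive/",
    "Disallow: /printthread.php",
]

rp = RobotFileParser()
rp.parse(rules)

for url in (
    "/archive/thread-123.html",
    "/printthread.php?tid=5",
    "/showthread.php?tid=5",
):
    print(url, "->", "allowed" if rp.can_fetch("*", url) else "disallowed")
```

Anything the parser reports as disallowed should also show up as blocked in the Webmaster Tools test, for rules it understands.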