MyBB Community Forums
[For 1.6] Google SEO 1.6.8 [EOL] - Printable Version

+- MyBB Community Forums (https://community.mybb.com)
+-- Forum: Resources (https://community.mybb.com/forum-8.html)
+--- Forum: Plugins (https://community.mybb.com/forum-73.html)
+---- Forum: Plugin Releases (https://community.mybb.com/forum-102.html)
+---- Thread: [For 1.6] Google SEO 1.6.8 [EOL] (/thread-101262.html)



RE: Google SEO 1.6.4 - frostschutz - 11-18-2012

I don't know about "attractive" but it does show a 404 error page in the MyBB design (like any other MyBB error message), with the Google 404 widget included, which may suggest similar URLs and offers a search field.

Of course this is an option. Pretty much all features of this plugin are options.


RE: Google SEO 1.6.4 - geekywebadmin - 11-18-2012

(11-18-2012, 02:31 PM)frostschutz Wrote: I don't know about "attractive" but it does show a 404 error page in the MyBB design (like any other MyBB error message), with the Google 404 widget included, which may suggest similar URLs and offers a search field.

Of course this is an option. Pretty much all features of this plugin are options.

Oh, I see. I was unclear how that particular option works. I do have the 404 Error Google Widget enabled. What I don't understand is the following:

If I enter the URL: FORUMROOT/Forum-Nonexistent/ - I am presented with a MyBB style 404 page.

However, if I enter the URL: FORUMROOT/blahblahblah.html - I am presented with a default 404 page from my Linux server.

Is there a way to create my own custom page that would work in both instances? I didn't know if that was part of this plugin or not because I was unclear what the Google 404 Widget really did.


RE: Google SEO 1.6.4 - frostschutz - 11-18-2012

It's the server's decision what to show as 404 error, with Apache this is the ErrorDocument directive, which you can put in your .htaccess file. If it's not showing, that's a server configuration issue, not something I can fix in my plugin.
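For reference, a minimal sketch of that directive in `.htaccess` (the path `/errors/404.html` is a hypothetical example; use a site-relative path, because a full URL with a scheme makes Apache redirect instead of serving the page with a 404 status):

```apache
# Sketch: serve a custom 404 page via Apache's ErrorDocument directive.
# "/errors/404.html" is an illustrative path -- point it at your own page.
# Keep it site-relative so the original 404 status code is preserved.
ErrorDocument 404 /errors/404.html
```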


RE: Google SEO 1.6.4 - geekywebadmin - 11-18-2012

(11-18-2012, 02:48 PM)frostschutz Wrote: It's the server's decision what to show as 404 error, with Apache this is the ErrorDocument directive, which you can put in your .htaccess file. If it's not showing, that's a server configuration issue, not something I can fix in my plugin.

That answers my question. I was unclear as to whether this was within the scope of your plugin or not.

Although it's not a plugin issue: if someone using your plugin wanted to create a custom 404 page (using the method you just referenced), would you have any suggestions or precautions so it wouldn't interfere with Google SEO?

If not, I'll move along and leave you be! Big Grin


RE: Google SEO 1.6.4 - NNT_ - 11-23-2012

Hi, please check this topic: http://community.mybb.com/thread-129666-post-941199.html#pid941199

I've downgraded to 1.6.3 but the sitemaps are still invalid Sad


RE: Google SEO 1.6.4 - The-Best - 12-02-2012

I did whatever you put here, but I still get this problem:

URL is passive. Apply changes to core files to activate.


RE: Google SEO 1.6.4 - frostschutz - 12-02-2012

click apply?


RE: Google SEO 1.6.4 - The-Best - 12-02-2012

It's OK, I fixed it now. Thanks a lot for this nice plugin.


RE: Google SEO 1.6.4 - Leefish - 12-02-2012

I am using Google SEO for the first time, and I see I am getting duplicate pages in Google because "action=lastpost" and "action=newpost" are appended to the end of the URLs. How can I prevent that from happening? In my robots.txt I have those set to disallow.
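(For reference, rules along these lines would disallow those URLs — a sketch assuming the `?action=` query strings MyBB appends; note the `*` wildcard is an extension honored by Googlebot but not guaranteed for all crawlers:)

```
# Hypothetical robots.txt rules blocking the action parameters above.
# The "*" wildcard in Disallow paths is a Google extension.
User-agent: *
Disallow: /*?action=lastpost
Disallow: /*?action=newpost
```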


RE: Google SEO 1.6.4 - frostschutz - 12-02-2012

You can check robots.txt in the Webmaster Tools. Put the URL in question there and it will tell you whether Googlebot interprets it as allowed or not (and which rule allows or disallows it). If it's disallowed, Google shouldn't be indexing it. If pages are indexed from a time before you disallowed them in robots.txt, you can issue a URL removal request specific to the subset of URLs (same rules as in robots.txt) you want removed. I'd be a bit careful with those removal requests, though.

Personally, I don't disallow action=lastpost. With the canonical tag provided by Google SEO Meta, Google merges those pages to the canonical name quickly. When doing a site search (site:www.japanisch-netzwerk.de inurl:action=lastpost) it currently finds only one thread page, which has a post written today. In a day or two this should no longer appear but instead be picked up under its canonical name.
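(The canonical tag in question is a `<link>` element in the page's `<head>`; a sketch with an illustrative URL:)

```html
<!-- Sketch of a canonical tag as emitted into a thread's <head>;
     the href below is a hypothetical example URL. With this present,
     Google folds .../Thread-Some-Topic?action=lastpost into the
     canonical thread URL. -->
<link rel="canonical" href="https://example.com/Thread-Some-Topic" />
```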

Google also lists links which are outright blocked by robots.txt - for example if you have search.php blocked and you do a site search (site:yoursite/search.php) it might list hundreds of ?action=finduser&uid=123. And the description says "A description for this result is not available because of this site's robots.txt – learn more".

That's a Google thing - they like to list URLs even if they are not allowed to crawl them, if links to those URLs appear on other indexed pages. You could get rid of them with a URL removal request, or by letting Google crawl those pages and specifically serving them with a noindex header.

But neither of those solutions is really preferable...
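(For completeness, the noindex variant mentioned above could look like this in `.htaccess` — a sketch assuming Apache with mod_headers enabled; for it to work, the file must not be blocked in robots.txt, since Google has to crawl the page to see the header:)

```apache
# Hypothetical sketch: serve search.php with a noindex header so Google
# drops it from the index once it is allowed to crawl it again.
# Requires Apache's mod_headers module.
<Files "search.php">
    Header set X-Robots-Tag "noindex"
</Files>
```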