MyBB Community Forums

Full Version: Google Sitemap shows user profiles
Why is my Google sitemap showing user profiles when users aren't supposed to be part of the sitemap? I excluded them via the settings, but I believe further adjustments may be needed to keep user profiles from appearing in my Google sitemap.

I am using the Google SEO plugin.

Sitemap Settings:
https://i.gyazo.com/6a93f1b1c4e6fe98f1bd...8b152f.png
Bump bump
Did you delete the generated sitemap after changing your settings?
(2019-09-03, 06:09 AM)Crazycat Wrote: [ -> ]Did you delete the generated sitemap after changing your settings?

Yes.

Currently, my sitemap in Google Search Console is shown as:
[Image: f936a8810796dd7aef0f29bafe853125.png]
What is the URL of your sitemap? Does the file contain the "bad" links? Don't trust the Google console; trust the file first.
(2019-09-03, 07:06 AM)Crazycat Wrote: [ -> ]What is the URL of your sitemap? Does the file contain the "bad" links? Don't trust the Google console; trust the file first.

The sitemap seems to be just fine. I have PM'd it to you.
Yes, it's fine, there is no link to user pages. The trouble is with Google and the way it indexes the pages. As you don't prefix the links to the user profiles, you can't exclude them with robots.txt.
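To illustrate the prefix point: if all profile links shared a common path prefix (say /user-, a hypothetical example; the actual scheme depends on the Google SEO plugin's URL settings), a single robots.txt rule could cover every profile at once. Without a shared prefix, there is nothing generic to match on:

```
# Only works if profile URLs share a prefix such as /user- (hypothetical):
User-agent: *
Disallow: /user-
```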
(2019-09-03, 07:48 AM)Crazycat Wrote: [ -> ]Yes, it's fine, there is no link to user pages. The trouble is with Google and the way it indexes the pages. As you don't prefix the links to the user profiles, you can't exclude them with robots.txt.

Ah I see. So what would you suggest I do without adding a prefix to the users profiles?
The only solution I can see is to exclude profile links one by one.
Perhaps having a task which generates the robots.txt and excludes all user profile links would be the simplest way.
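A minimal sketch of such a task, assuming the profile paths are already available as a list (a real MyBB task would build them by querying the users table and applying the plugin's URL scheme; the function name and example paths below are hypothetical):

```python
# Sketch: generate robots.txt content that disallows each user profile
# URL one by one. Profile paths here are illustrative placeholders; a
# real task would derive them from the forum's users table.

def build_robots_txt(profile_paths, base_rules="User-agent: *"):
    """Return robots.txt text with one Disallow line per profile path."""
    lines = [base_rules]
    for path in profile_paths:
        lines.append(f"Disallow: {path}")
    return "\n".join(lines) + "\n"

if __name__ == "__main__":
    # Hypothetical profile paths, for illustration only.
    profiles = ["/Alice", "/Bob", "/Charlie"]
    print(build_robots_txt(profiles))
```

The generated file would then be written to the forum root on a schedule (e.g. via MyBB's task system) so new accounts get picked up automatically.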
(2019-09-03, 08:18 AM)Crazycat Wrote: [ -> ]The only solution I can see is to exclude profile links one by one.
Perhaps having a task which generates the robots.txt and excludes all user profile links would be the simplest way.

Doing it with a robots.txt file alone is not enough, at least for Google, and not just for future content links but also for links already existing in Google's search results.

Read this help document to see how Google indexes pages when a robots.txt file exists:
Quote:You should not use robots.txt as a means to hide your web pages from Google Search results. This is because, if other pages point to your page with descriptive text, your page could still be indexed without visiting the page. If you want to block your page from search results, use another method such as password protection or a noindex directive.
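For reference, the noindex approach mentioned in the quote means emitting a robots meta tag in the profile page's head (in MyBB this would likely go in the member_profile template; shown here as a plain HTML sketch):

```
<!-- Keeps profile pages out of Google's index even when other sites link
     to them. Note: the page must NOT also be blocked in robots.txt,
     otherwise Google never crawls it and never sees this tag. -->
<meta name="robots" content="noindex">
```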

The OP may also want to read the Google Robots FAQ page carefully, especially the sections "If I block Google from crawling a page using a robots.txt disallow directive, will it disappear from search results?" and "How long will it take for changes in my robots.txt file to affect my search results?".

I noticed WordPress made a decision to remove the Disallow: / directive from the robots.txt file in WordPress 5.3, as announced recently.

In addition, the OP may create an extra user group for visiting spiders/bots to limit what they can see. To assign bots a user group other than Guests, or to review the current settings, go to AdminCP > Configuration > Spiders / Bots.