2019-09-03, 11:24 AM
(2019-09-03, 10:36 AM)noyle Wrote: (2019-09-03, 08:18 AM)Crazycat Wrote: The only solution I can see is to exclude profile links one by one.
Perhaps having a task which will generate the robots.txt and exclude all users' profile links could be the simplest way.
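For reference, a rough sketch of what such a robots.txt could contain, assuming a default MyBB installation at the site root with standard (non-rewritten) profile URLs like member.php?action=profile&uid=123; the paths would need adjusting for a subdirectory install or friendly-URL setups:
Code:
# Ask all compliant crawlers not to fetch member profile pages (default MyBB URL pattern)
User-agent: *
Disallow: /member.php?action=profile
Note this only asks crawlers not to fetch those pages; as discussed below, it does not guarantee the URLs disappear from search results.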
Doing it with a robots.txt file alone is not enough, at least for Google, and not just for future content links but also for links already present in Google search results.
Read this help document to see how Google may still index pages even if a robots.txt file exists:
Quote:You should not use robots.txt as a means to hide your web pages from Google Search results. This is because, if other pages point to your page with descriptive text, your page could still be indexed without visiting the page. If you want to block your page from search results, use another method such as password protection or a noindex directive.
The OP may also want to read the Google Robots FAQs page carefully, especially the "If I block Google from crawling a page using a robots.txt disallow directive, will it disappear from search results?" and "How long will it take for changes in my robots.txt file to affect my search results?" sections.
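If the goal is to actually keep profile pages out of the results, a noindex signal on the profile pages themselves is the more reliable route. A minimal sketch, assuming you can edit the profile page template or the server configuration for those URLs; it can be sent either as a meta tag in the page head or as an HTTP response header:
Code:
<!-- in the <head> of the profile page template -->
<meta name="robots" content="noindex">

# or, equivalently, as an HTTP response header for profile URLs
X-Robots-Tag: noindex
Keep in mind that Google has to be able to crawl the page to see the noindex, so those URLs must not also be blocked in robots.txt.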
I noticed WordPress made the decision to remove the Disallow: / directive from the robots.txt file as of WordPress 5.3, announced recently.
In addition, the OP may create an extra user group for visiting spiders/bots to limit what they can see. To assign bots a user group other than Guests, or to review the current settings, go to AdminCP > Configuration > Spiders / Bots.
So if I were to place them in a new group, making it able to view only the forum/threads and not profiles, would that suffice?
Thanks