robots.txt
#1
Hello everybody,

If I want to prevent robots from accessing this dynamic URL:

http://domain.com/private.php?action=send&uid=1

Should I block it this way:

User-Agent: *
Disallow: /private.php?action=send&uid=?$


Or this way?

User-Agent: *
Disallow: /private.php?action=send&uid=*?$
#2
You can just block the whole private.php script; bots have no reason to view it:
User-Agent: *
Disallow: /private.php
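
If you really do want to target just that one dynamic URL rather than the whole script, note that the original robots.txt standard only does plain prefix matching. The * and $ wildcards are a de-facto extension honored by the major crawlers (Googlebot, Bingbot), so a rule along these lines would match any uid value, assuming the crawler supports wildcards:

User-Agent: *
Disallow: /private.php?action=send&uid=*

Crawlers that don't support wildcards will treat that line as a literal prefix and it may not match, which is another reason blocking the whole script is the safer option.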

There is a list of more files/actions that should be blocked in the Google SEO package (a robots.example file or something like that).
#3
Thanks, I will block it, but could you tell me how I can block a specific dynamic URL?
#4
A crawler will never need to access any part of private.php (as you need to be logged in to use it), so you can just block the entire script:

User-agent: *
Disallow: /private.php
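
If you want to convince yourself the rule behaves as expected, here's a quick sketch using Python's standard-library robots.txt parser (domain.com is just the example host from this thread):

```python
from urllib.robotparser import RobotFileParser

# The same two lines as the rule above.
rules = [
    "User-agent: *",
    "Disallow: /private.php",
]

parser = RobotFileParser()
parser.parse(rules)

# Disallow is a prefix match, so the whole script is blocked,
# query string or not:
print(parser.can_fetch("*", "http://domain.com/private.php?action=send&uid=1"))  # False

# Other pages stay crawlable:
print(parser.can_fetch("*", "http://domain.com/index.php"))  # True
```

Only applies to well-behaved bots, of course; robots.txt is a request, not access control.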

EDIT: Bah, was beaten to the answer. Too many tabs to read...

