robots.txt
Hello everybody,

If I want to prevent robots from accessing this dynamic URL:

Should I block them in this way:

User-Agent: *
Disallow: /private.php?action=send&uid=?$

Or in this way?:

User-Agent: *
Disallow: /private.php?action=send&uid=*?$
You can just block the whole of private.php; bots have no reason to view it:
User-Agent: *
Disallow: /private.php

There is a list of further files/actions that should be blocked in the Google SEO package (a robots.example file or something like that).
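A blanket rule like that can be checked locally with Python's standard-library `urllib.robotparser`; the bot name and example URLs below are placeholders:

```python
import urllib.robotparser

# A robots.txt blocking the whole script, as suggested above
robots_txt = """\
User-agent: *
Disallow: /private.php
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Disallow is a prefix match, so every query-string variant is covered too
print(rp.can_fetch("AnyBot", "https://example.com/private.php"))                    # False
print(rp.can_fetch("AnyBot", "https://example.com/private.php?action=send&uid=1"))  # False
print(rp.can_fetch("AnyBot", "https://example.com/index.php"))                      # True
```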
Thanks, I will block it. But could you tell me how I can block a specific dynamic URL?
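For reference, the major crawlers (Googlebot, Bingbot) document an extended pattern syntax where `*` matches any character sequence and `$` anchors the end of the URL; without a trailing `$`, a rule is a plain prefix match. A rough sketch of that matching logic, with a made-up `uid` value for illustration:

```python
import re

def robots_pattern_matches(pattern: str, path: str) -> bool:
    """Sketch of the wildcard matching major crawlers document for
    robots.txt rules: '*' matches any sequence, '$' anchors the end,
    everything else (including '?') is a literal character."""
    regex = ""
    for ch in pattern:
        if ch == "*":
            regex += ".*"
        elif ch == "$":
            regex += "$"
        else:
            regex += re.escape(ch)
    # re.match anchors at the start, giving prefix-match semantics
    return re.match(regex, path) is not None

# Hypothetical URL of the kind discussed above
url = "/private.php?action=send&uid=123"

print(robots_pattern_matches("/private.php?action=send&uid=*", url))   # True
print(robots_pattern_matches("/private.php?action=send&uid=?$", url))  # False: '?' is literal
print(robots_pattern_matches("/private.php", url))                     # True: prefix match
```

Note that `urllib.robotparser` itself implements the original spec without wildcard support, so rules relying on `*` or `$` only take effect for crawlers that advertise the extended syntax.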
A crawler will never need to access any part of private.php (as you need to be logged in to use it), so you can just block the entire script:

User-agent: *
Disallow: /private.php

EDIT: Bah, beaten to the answer. Too many tabs to read...
