Many people have URLs on their public website that they want kept secret from web spiders. This can be achieved with a robots.txt file that disallows access to those URLs. However, doing so makes the once-secret URLs known to all: a simple request for the robots.txt file will reveal to anyone every URL you didn't want them to know about.

robots.txt is a PHP script, released under the GPL, that can fool the spiders.
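The general idea behind such a script can be sketched as follows: inspect the requester's User-Agent and serve the Disallow rules only to known crawlers, while everyone else receives a harmless, empty policy. This is an illustrative sketch in Python rather than the actual PHP script; the spider list and the `/secret/` path are hypothetical, not taken from the project.

```python
# Hypothetical sketch of the cloaking idea: only known crawlers are shown
# the Disallow rule, so a human fetching robots.txt learns nothing.
KNOWN_SPIDERS = ("Googlebot", "Bingbot", "Slurp")  # illustrative list


def robots_txt_for(user_agent: str) -> str:
    """Return robots.txt content tailored to the requester's User-Agent."""
    if any(bot in user_agent for bot in KNOWN_SPIDERS):
        # Crawlers see the rule that keeps them away from the secret path.
        return "User-agent: *\nDisallow: /secret/\n"
    # Browsers and curl get an empty (allow-everything) policy.
    return "User-agent: *\nDisallow:\n"
```

A web server would dispatch requests for /robots.txt to the script, so the tailored response is generated on every fetch.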
