At some of the sites where I administer the web servers, I run
robots.cgi to generate robots.txt. This allows me to create dynamic
exclusion lists. I can also log robot visitors and give different
restrictions to robots based on their origin (the MOMspider is allowed
to visit more of my site, provided it is the one that I run).
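
The idea above can be sketched as a small CGI handler: inspect the
requesting robot's User-Agent and address, log the visit, and emit a
robots.txt whose Disallow list depends on who is asking. This is a
hypothetical illustration, not the poster's actual robots.cgi; the rule
lists, the MOMspider check, the local-address prefix, and the log path
are all invented for the example.

```python
#!/usr/bin/env python3
"""Sketch of a robots.cgi that emits a per-robot robots.txt.

Assumptions (not from the original post): the trusted agent name,
the local-address prefix, the path lists, and the log location.
"""
import os
import sys
from datetime import datetime, timezone

# Robots we run ourselves get a shorter exclusion list.
TRUSTED_AGENTS = ("momspider",)
LOCAL_PREFIX = "127."          # assumed: "the one that I run" is local

STRICT_RULES = ["/cgi-bin/", "/private/", "/tmp/"]
RELAXED_RULES = ["/private/"]

def build_robots_txt(user_agent: str, remote_addr: str) -> str:
    """Return a robots.txt body tailored to the requesting robot."""
    agent = (user_agent or "").lower()
    # Trust requires both the right agent name and a local origin.
    trusted = (any(name in agent for name in TRUSTED_AGENTS)
               and remote_addr.startswith(LOCAL_PREFIX))
    rules = RELAXED_RULES if trusted else STRICT_RULES
    lines = ["User-agent: *"]
    lines += [f"Disallow: {path}" for path in rules]
    return "\n".join(lines) + "\n"

def log_visit(user_agent: str, remote_addr: str, logfile: str) -> None:
    """Append one line per robot visit so fetches can be audited."""
    stamp = datetime.now(timezone.utc).isoformat()
    with open(logfile, "a") as fh:
        fh.write(f"{stamp}\t{remote_addr}\t{user_agent}\n")

if __name__ == "__main__":
    # CGI passes the request details in environment variables.
    ua = os.environ.get("HTTP_USER_AGENT", "")
    addr = os.environ.get("REMOTE_ADDR", "")
    log_visit(ua, addr, "/tmp/robots-visits.log")
    sys.stdout.write("Content-Type: text/plain\r\n\r\n")
    sys.stdout.write(build_robots_txt(ua, addr))
```

Served in place of a static robots.txt, a script like this lets the
exclusion list change per request while leaving an audit trail of which
robots asked for it.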
--
# -- Michael Van Biesbrouck, 1996 ACM Programming Contest 3rd Place Team
:b
s/\(.*\)\(,.*\):\1\(.\)\([a-z]*\)\(.\)r\(:.*\)>\3/\4\2:\1\3\4\5r\6\5>/
s/\(.*\)\(,.*\):\1\(.\)\([a-z]*\)\(.\)l\(:.*\)\(.\)>\3/\4\2:\1\3\4\5l\6>\7\5/
s/>$/>0/
/^halt/!bb
# http://csclub.uwaterloo.ca/u/mlvanbie/