Re: alta vista and virtualvin.com

Michael Van Biesbrouck (mlvanbie@undergrad.math.uwaterloo.ca)
Wed, 12 Jun 1996 12:19:26 -0400 (EDT)


> Generating robots.txt seems like something that would be appropriate
> for automated tools. Just thinking off the top of my head, it
> seems it wouldn't be too hard to create a tool that ran on one's
> server, checked for dynamically created links (say, by
> comparing what is in the directory tree against what an html link
> produces?), and then automatically updated robots.txt accordingly.

At some of the sites where I administer the web servers, I actually
run robots.cgi to generate robots.txt. This lets me create dynamic
exclusion lists. I can also log robot visitors and give different
restrictions to robots based on their origin (for example, MOMspider
is allowed to visit more of my site, provided it is the instance
that I run).
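
The message doesn't include the script itself, but a minimal sketch of
the idea in Python might look like the following, assuming the server
maps requests for /robots.txt onto the CGI. The log path, the trusted
host suffix, and the exclusion rules are placeholders for illustration,
not the author's actual configuration.

    #!/usr/bin/env python3
    # robots.cgi -- sketch of serving robots.txt dynamically from a CGI,
    # logging robot visits and varying the rules by the robot's origin.

    import os
    import sys
    from datetime import datetime, timezone

    LOG_FILE = "/var/log/robots-visits.log"   # hypothetical log location

    # Default exclusion list handed to unknown robots.
    DEFAULT_RULES = """User-agent: *
    Disallow: /cgi-bin/
    Disallow: /private/
    """

    # More permissive rules for a MOMspider run from a trusted host
    # (the trusted domain is an assumption, not from the original post).
    TRUSTED_RULES = """User-agent: MOMspider
    Disallow: /private/
    """

    def main() -> None:
        agent = os.environ.get("HTTP_USER_AGENT", "unknown")
        host = os.environ.get("REMOTE_HOST") or os.environ.get("REMOTE_ADDR", "unknown")

        # Log every robot visit: timestamp, host, user agent.
        try:
            with open(LOG_FILE, "a") as log:
                log.write(f"{datetime.now(timezone.utc).isoformat()} {host} {agent}\n")
        except OSError:
            pass  # never let a logging failure break robots.txt

        # Pick an exclusion list based on who is asking.
        if "MOMspider" in agent and host.endswith(".uwaterloo.ca"):
            body = TRUSTED_RULES
        else:
            body = DEFAULT_RULES

        sys.stdout.write("Content-Type: text/plain\r\n\r\n")
        sys.stdout.write(body)

    if __name__ == "__main__":
        main()

The same approach extends to per-robot throttling or to rebuilding the
exclusion list from the directory tree, as suggested in the quoted text.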

-- 
# -- Michael Van Biesbrouck,       1996 ACM Programming Contest 3rd Place Team
:b^Js/\(.*\)\(,.*\):\1\(.\)\([a-z]*\)\(.\)r\(:.*\)>\3/\4\2:\1\3\4\5r\6\5>/
s/\(.*\)\(,.*\):\1\(.\)\([a-z]*\)\(.\)l\(:.*\)\(.\)>\3/\4\2:\1\3\4\5l\6>\7\5/
s/>$/>0/^J/^halt/!bb^J#            http://csclub.uwaterloo.ca/u/mlvanbie/