Thanks for the explanation of your software.
>I think this is just another example of why robots.txt isn't a good
>scalable solution; Rob --- do I recall correctly that Apache will add
>a "deny from user-agent" thing in .htaccess files? It seems that's a
>much better way of achieving the same goals than robots.txt
>modifications. Also solves the robots.txt creation AFTER the robot
>came problem.
The current Apache 1.2 development tree has this, but the syntax may well
change before the 1.2 beta/final release.
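
For what it's worth, a rough sketch of what such a per-user-agent block
looks like in an .htaccess file, using the BrowserMatchNoCase /
"deny from env=" style from later Apache releases; the 1.2 syntax may
well end up different, and the robot names below are made up:

    # Set an environment variable when the User-Agent header matches
    # one of the listed (hypothetical) robot names
    BrowserMatchNoCase ^ExampleRobot   block_robot
    BrowserMatchNoCase ^AnotherSpider  block_robot

    # Refuse requests from any client that set the variable
    Order Allow,Deny
    Allow from all
    Deny from env=block_robot

Because this lives in the server config rather than in /robots.txt, it
also takes effect for robots that never fetch robots.txt at all.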
It would be much more useful if I could identify all robots/crawlers/whatever
by one common string in the User-Agent header instead of maintaining a long
list of names to block.
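
If every robot carried some agreed-upon token (say the literal word
"robot") in its User-Agent header, the whole list above would collapse
to a single pattern. Again only a sketch, with an assumed token:

    # One hypothetical shared token instead of a long list of names
    BrowserMatchNoCase robot block_robot
    Order Allow,Deny
    Allow from all
    Deny from env=block_robot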
rob