RE: nastygram from xxx.lanl.gov

Frank Wales (frank@limitless.co.uk)
Tue, 9 Jul 1996 14:50:11 +0100


In my simple-minded approach, I'd consider anything making requests
that wasn't a person to be a robot, irrespective of the type of
activity it was performing. To me, this avoids all kinds of potential
misunderstandings; if it's not a human, it obeys the robots.txt file - no
exceptions.

If there really are clearly separate classes of activity worthy of distinct
exclusion rules, then perhaps the robots.txt file should be tarted
up to cater to them. In the meantime, I think that a robots.txt file
that says: "Robots not welcome here" should be honoured by all
non-humans, unless specific permission has been obtained.
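(The blanket "Robots not welcome here" file described above can be sketched as follows. This is a minimal illustration using modern Python's standard-library `urllib.robotparser`, which post-dates this thread; the agent name and URL are hypothetical.)

```python
# A minimal sketch of a blanket robots.txt exclusion, checked with
# Python's stdlib urllib.robotparser (an assumption: this module
# post-dates the 1996 discussion above).
from urllib.robotparser import RobotFileParser

# One rule covering every agent: the whole site is off limits to robots.
blanket_exclusion = [
    "User-agent: *",
    "Disallow: /",
]

rp = RobotFileParser()
rp.parse(blanket_exclusion)

# Any non-human agent, whatever it calls itself, is refused
# (hypothetical agent name and URL).
allowed = rp.can_fetch("AnyBot/1.0", "http://example.com/private/page")
print(allowed)
```

A well-mannered robot would make this check before every request and stop at the first refusal, rather than deciding for itself that it is an exception.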

It's always easy to assume that what your program does is an
exception, or not that big a deal, but that's still jumping to a
conclusion about someone else's motives in excluding robots,
and indeed what they consider to be a robot. If I had such a
blanket exclusion file up, and an automatic process ignored it
without getting my permission first, I'd consider its author to
have bad manners no matter what the program did.

--
Frank Wales [frank@limitless.co.uk]