nastygram from xxx.lanl.gov

Peter.Vogt@tokai.sprint.com
Tue, 9 Jul 1996 04:56:00 -0400


I just got a nastygram from the web admin at xxx.lanl.gov accusing
my "robot" of "attacking" him. This "attack" consisted of HEADs on
459 URLs, with a mean pause of about 2 minutes. The total data set
(across all sites) was 653k URLs, and yes, I probably should have
filtered the test set to limit the number of accesses to any one
site. Mea culpa.
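
For concreteness, the test amounted to roughly this (a sketch in
modern Python, not the actual code; the pause matches what I ran,
while the per-site cap is the one I should have had, with a made-up
value):

    import time
    import urllib.error
    import urllib.request
    from collections import defaultdict
    from urllib.parse import urlsplit

    PAUSE_SECONDS = 120   # mean pause of about 2 minutes
    MAX_PER_SITE = 50     # hypothetical per-site cap (value is made up)

    per_site_hits = defaultdict(int)

    def head_check(url):
        """HEAD one URL; return the status code, or None on skip/error."""
        host = urlsplit(url).netloc
        if per_site_hits[host] >= MAX_PER_SITE:
            return None                       # don't hammer any one site
        per_site_hits[host] += 1
        req = urllib.request.Request(url, method="HEAD")
        try:
            with urllib.request.urlopen(req, timeout=30) as resp:
                return resp.status
        except urllib.error.HTTPError as err:
            return err.code                   # e.g. 404 is still useful info
        except OSError:
            return None                       # network trouble

    # Fixed list of URLs, each tested once and then discarded --
    # no link traversal, so no risk of looping or getting lost.
    for url in open("urls.txt"):
        print(url.strip(), head_check(url.strip()))
        time.sleep(PAUSE_SECONDS)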

He accused my "robot" of violating the "robot guidelines", though he
didn't enumerate which guidelines I violated. I'm guessing he may have
been upset that the test was ignoring his robots.txt, but since the
test wasn't traversing the general web space and was in no danger of
looping or getting lost, there didn't seem to be much point. The test
was operating from a fixed list of URLs that were discarded once tested.
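
Checking robots.txt would have been cheap to add even for a fixed
list, though; something like this (again just a sketch, and the agent
name is made up; one robots.txt fetch per host, cached):

    import urllib.robotparser
    from urllib.parse import urlsplit

    _robots = {}   # host -> parsed robots.txt, fetched once per host

    def allowed(url, agent="LinkChecker"):
        parts = urlsplit(url)
        base = f"{parts.scheme}://{parts.netloc}"
        rp = _robots.get(base)
        if rp is None:
            rp = urllib.robotparser.RobotFileParser(base + "/robots.txt")
            try:
                rp.read()
            except OSError:
                pass   # unreadable robots.txt: can_fetch() then answers False
            _robots[base] = rp
        return rp.can_fetch(agent, url)

The overhead is one extra fetch per host, amortized over every URL
tested there.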

He also informed me that "we have no need for you to 'index' our site",
only to then rebuke me for running "a particularly stupid robots that
only does pointless HEADs". I didn't point out that the two complaints
would naturally be mutually exclusive. :)

Anyway, he naturally asked that I "cease and desist", which of course
I'm happy to do.

Comments? Should such accesses as mine also test robots.txt? Were my
accesses "burdensome" at that rate? Are the "robot guidelines" no
longer "guidelines" but "rules", and are these rules applicable to all
forms of automated access, even if they aren't robots?

--
nabil@i.net
http://www.i.net/aaron/