> I just got a nastygram from the web admin at xxx.lanl.gov accusing
> my "robot" of "attacking" him. This "attack" consisted of HEADs on
> 459 URLs, with a mean pause of about 2 minutes. The total data set
Well, that's hardly an attack in my mind. And all you were asking for
were HEADs - hardly going to overload their system, are you?
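For what it's worth, that sort of check is only a few lines of code.
Here's an untested sketch in Python of the general idea - HEAD each URL,
note the status, sleep in between (the two-minute pause and the URL list
are just stand-ins for whatever your test actually used):

    import time
    import urllib.error
    import urllib.request

    def check_urls(urls, pause=120):
        # HEAD asks for headers only - the server never sends the body,
        # which is why a run like this is so cheap for the remote site.
        for url in urls:
            req = urllib.request.Request(url, method="HEAD")
            try:
                with urllib.request.urlopen(req) as resp:
                    print(url, resp.status)
            except urllib.error.HTTPError as err:
                print(url, err.code)  # dead or forbidden link
            time.sleep(pause)         # mean pause of ~2 minutes, as you did

A HEAD that fails with 404 or the like raises HTTPError, so the except
branch is where your dead links show up.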
> He accused my "robot" of violating the "robot guidelines". He didn't
> enumerate which I violated. I'm guessing he may have been upset that the
> test was ignoring his robots.txt, but since the test wasn't traversing the
Reading robots.txt is a good idea, but by the sound of your "robot",
probably not very practical. Checking the validity of document URLs is
hardly the function of a true robot.
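Mind you, if it's robots.txt he's upset about, honouring it only costs
you one extra fetch per site. A rough sketch, again in Python, using the
standard robots.txt parser (the agent name "link-checker" and the page
path are made up for the example):

    import urllib.robotparser

    rp = urllib.robotparser.RobotFileParser()
    rp.set_url("http://xxx.lanl.gov/robots.txt")
    rp.read()  # a single GET covers the whole run

    # Only HEAD the URL if the site's robots.txt allows our agent name.
    if rp.can_fetch("link-checker", "http://xxx.lanl.gov/some/page.html"):
        print("allowed - go ahead and HEAD it")
    else:
        print("disallowed - skip it")

Whether a URL validator counts as a "robot" at all is the real question,
but doing this would at least take the wind out of his sails.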
> He also informed me that "we have no need for you to 'index' our site" only
> to then rebuke me for running "a particularly stupid robot that only does
> pointless HEADs". I didn't point out that the two would naturally be mutually
> exclusive. :)
He he - probably some Civil Servant who has an inferiority
complex :)
> Comments? Should such accesses as mine also test robots.txt? Were my accesses
> "burdensome" at that rate? Are the "robot guidelines" no longer "guidelines"
> but "rules", and are these rules applicable to all forms of automated access,
> even if they aren't robots?
Yes, no, depends on who you are, and no comment (still thinking about that one).
Chris,
chris@jm-crowther.co.uk
www.dungeon.com/~jmcrowther/chris.html
ChegHchu djajVam djajKak!