He accused my "robot" of violating the "robot guidelines". He didn't
enumerate which ones I violated. I'm guessing he may have been upset that
the test was ignoring his robots.txt, but since the test wasn't traversing
the general web space and was in no danger of looping or getting lost, there
wasn't much point. The test operated from a fixed list of URLs that were
discarded once tested.
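For what it's worth, honoring robots.txt even from a fixed list is cheap. A
minimal sketch in Python (the URLs, rules, and agent name here are
hypothetical, not my actual test):

```python
# Sketch only: check each URL of a fixed list against robots.txt
# before issuing a HEAD. Rules and URLs below are made-up examples.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())  # normally read() from the server

urls = [  # fixed list, discarded once tested
    "http://example.com/index.html",
    "http://example.com/private/report.html",
]

for url in urls:
    if rp.can_fetch("link-tester", url):
        print(f"HEAD {url}")  # the HEAD request would go here
    else:
        print(f"skip {url} (disallowed by robots.txt)")
```

One extra GET per host for robots.txt, and the rest is a table lookup.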
He also informed me that "we have no need for you to 'index' our site",
only to then rebuke me for running "a particularly stupid robot that only
does pointless HEADs". I didn't point out that the two would naturally be
mutually exclusive. :)
Anyway, he naturally asked that I "cease and desist", which of course I'm
happy to do.
Comments? Should such accesses as mine also test robots.txt? Were my accesses
"burdensome" at that rate? Are the "robot guidelines" no longer "guidelines"
but "rules", and do those rules apply to all forms of automated access, even
ones that aren't robots?
-- nabil@i.net http://www.i.net/aaron/