Re: nastygram from xxx.lanl.gov

Garth T Kidd (garth@dogbert.systems.sa.gov.au)
Thu, 11 Jul 1996 14:33:45 -1000


> This covers one class of robot.. the type that ignores robots.txt and
> has half a brain cell when it comes to "?" in URLs. Now what about the
> other dumb robots (there's a never ending supply) that don't have
> this state-of-the-art intelligent ability to look for "?"s

Fix those robots to behave, as you said. In the meantime, stop passing the
buck and fix the web site.

Robot authors need to come up with ways of limiting the damage that occurs
when their spiders run into an unmarked black hole maintained by someone
who has never heard of the REP. Lots of people on this list are doing work
in that direction.
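
For instance, a spider can refuse query URLs and cap fetches per host.
Here's a minimal sketch of that kind of damage limiting, in Python; the
cap and the checks are illustrative, not anyone's actual crawler:

    from urllib.parse import urlparse

    MAX_PAGES_PER_HOST = 500      # illustrative cap, not a real crawler's limit
    pages_fetched = {}            # host -> number of pages fetched so far

    def should_fetch(url):
        # Damage-limiting checks a polite spider might make before
        # fetching a URL. A sketch only.
        parsed = urlparse(url)
        # Treat "?" URLs as suspect: query strings often mark an
        # infinite, script-generated URL space (a black hole).
        if parsed.query:
            return False
        host = parsed.netloc
        # Cap fetches per host so one black hole can't eat the crawl.
        if pages_fetched.get(host, 0) >= MAX_PAGES_PER_HOST:
            return False
        pages_fetched[host] = pages_fetched.get(host, 0) + 1
        return True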

Site designers must do their bit, too, designing their sites so that the
damage is minimal when they get hit by a rogue spider. Mail-bombing and
other childish behaviour is not the way to do it. Smart site design is.

xxx.lanl.gov have obviously done the work required to identify a rogue
spider. Why didn't they just rig their server to drop connections from
identified rogues, or return a 403? Why on earth isn't their CGI software
making a simple check for roguedom before backgrounding these massive
tasks?
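
Such a check is a few lines at the top of a CGI script. A minimal sketch,
assuming a hypothetical blacklist of identified rogues (the addresses and
agent strings below are invented for illustration):

    #!/usr/bin/env python3
    # Sketch of the kind of up-front check a CGI script could make
    # before backgrounding an expensive task. The blacklist entries
    # and user-agent strings are hypothetical.
    import os
    import sys

    BLACKLIST = {"198.51.100.7"}        # hypothetical rogue addresses
    ROGUE_AGENTS = ("RogueSpider",)     # hypothetical rogue user-agents

    addr = os.environ.get("REMOTE_ADDR", "")
    agent = os.environ.get("HTTP_USER_AGENT", "")

    if addr in BLACKLIST or any(a in agent for a in ROGUE_AGENTS):
        # Refuse identified rogues up front with a 403.
        sys.stdout.write("Status: 403 Forbidden\r\n")
        sys.stdout.write("Content-Type: text/plain\r\n\r\n")
        sys.stdout.write("Access denied.\n")
        sys.exit(0)

    # ...otherwise proceed with the expensive work.
    sys.stdout.write("Content-Type: text/plain\r\n\r\n")
    sys.stdout.write("Processing request...\n")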

Instead of quietly implementing a simple technical solution to their
problem, xxx.lanl.gov have chosen to be obnoxious about it. This makes it
difficult to feel sympathy for their plight.

--
garth@dogbert.systems.sa.gov.au  | Garth Kidd
  +61-8-207-7740 (voice)         | Network Services
  +61-8-207-7860 (fax)           | EDS
 +61-414-300-213 (mobile)        | Adelaide, AUSTRALIA