Re: Defenses against bad robots

John D. Pritchard (jdp@cs.columbia.edu)
Mon, 20 May 1996 13:44:11 -0400


> >(E) Trap the robot into retrieving "a gigabyte-size HTML
> >document generated on-the-fly" (1). Please reply with
> >examples of this technique.
>
> Instead of sending volumes of data that use network bandwidth
> that could alternatively be used for productive work, maybe
> delaying responses could keep such a robot occupied waiting
> for them. The brute-force voluminous data approach is fine for

It's really not even OK "today".
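
For the delay idea, here is a minimal sketch of what such a handler could
look like (Python 3; the /trap path, the port, and the pacing of one byte
every 30 seconds are illustrative assumptions, not anything from this
thread):

    import time
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class TarpitHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path.startswith("/trap"):
                self.send_response(200)
                self.send_header("Content-Type", "text/html")
                self.end_headers()
                try:
                    # dribble out one byte every 30 seconds: the robot's
                    # connection stays tied up while the server does almost
                    # no work and uses almost no bandwidth
                    for _ in range(20):
                        self.wfile.write(b".")
                        self.wfile.flush()
                        time.sleep(30)
                except (BrokenPipeError, ConnectionResetError):
                    pass  # the robot gave up and closed the connection
            else:
                self.send_response(404)
                self.end_headers()

    if __name__ == "__main__":
        HTTPServer(("", 8000), TarpitHandler).serve_forever()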

> I'm curious if anyone knows a practical amount of time after
> submitting a request before such a robot gives up and decides
> that the "target" is not responding.

I think most people tend to code in a wait of between 30 seconds and a
couple of minutes to allow for network delays.
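
On the robot side, that give-up behaviour is usually just a fetch wrapped
in a fixed timeout. A rough sketch, where the 90-second figure is only one
illustrative value in that 30-seconds-to-a-couple-of-minutes range:

    import socket
    import urllib.error
    import urllib.request

    def fetch(url, timeout_seconds=90):
        """Fetch a URL, giving up if the target does not respond in time."""
        try:
            with urllib.request.urlopen(url, timeout=timeout_seconds) as resp:
                return resp.read()
        except (socket.timeout, urllib.error.URLError):
            # target is not answering within the window; treat it as dead
            # and move on rather than waiting on one host forever
            return None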

-john