Re: Defenses against bad robots

(ingram@netcom.com)
Sun, 19 May 1996 16:12:41 -0700 (PDT)


On Fri, 17 May 1996 kathy@accessone.com wrote:

> Deliberately misconfigured robots can cause denial of service
> attacks on websites.
>
<snip>
> (D) Capture the robot in an infinite loop trap that does not
> use too many resources. In the event that the infinite loop
> trap is tripped, a message is printed to a file, etc. Please
> reply with examples of simple infinite loops.
>
<snip>
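As an aside on (D): the usual way to build such a trap is not a loop in the server at all, but a dynamically generated page whose only link points at yet another trap page, so the robot wanders forever while each individual hit stays cheap. Here is one possible sketch as a Python CGI script; the /trap path and the log file name are my own invented examples, not anything from the original post:

```python
#!/usr/bin/env python
# Sketch of an "infinite loop trap": every request returns a page whose
# only link points at another trap URL, so a robot that ignores
# robots.txt follows the chain forever while each hit stays cheap.
# The /trap path and /tmp/trap.log are illustrative assumptions.
import os
import time

def trap_page(depth):
    """Return an HTML page whose single link leads to the next trap page."""
    next_url = "/trap/%d.html" % (depth + 1)
    return ("<html><head><title>trap %d</title></head>"
            "<body><a href=\"%s\">more</a></body></html>"
            % (depth, next_url))

def log_visit(logfile, remote_addr, depth):
    """Append one line per trap hit so the admin can spot the robot."""
    with open(logfile, "a") as f:
        f.write("%s %s depth=%d\n" % (time.ctime(), remote_addr, depth))

if __name__ == "__main__":
    # Under CGI, the part of the URL after the script name arrives in
    # PATH_INFO, e.g. "/7.html" -> depth 7.
    path = os.environ.get("PATH_INFO", "/0.html")
    try:
        depth = int(path.strip("/").split(".")[0])
    except ValueError:
        depth = 0
    log_visit("/tmp/trap.log", os.environ.get("REMOTE_ADDR", "?"), depth)
    print("Content-Type: text/html")
    print("")
    print(trap_page(depth))
```

A well-behaved robot never sees this, since the trap directory would also be listed as disallowed in robots.txt; only robots that ignore the exclusion file walk into it.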

Say that one sets up an ``infinite loop trap'', or anything
else on your list for that matter. Let's take the infinite
loop trap and ask a question:

How do you stop a server caught in an infinite loop clear
across the world from hitting your site repeatedly? If
you contact the site admin in another country and there is no
response, then what? Is there a way to refuse requests on
your end, or does the very act of refusing a given server
essentially put a load on your server anyway?
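On the last question, a refusal can be made very cheap compared to serving the page: check the client's address against a local blocklist before doing any real work, and answer blocked clients with a short error. A minimal sketch (the blocklist address is a made-up example, and the serve_page callback stands in for whatever your server does to build a real response):

```python
# Sketch of refusing requests on your own end: compare the client's
# address against a local blocklist before doing any real work.  The
# refusal costs one set lookup plus a short 403 response, far less
# than generating and sending the page the robot asked for.
# The blocklist entry below is a made-up illustration.

BLOCKLIST = {"192.0.2.17"}  # hypothetical runaway server's address

def handle_request(remote_addr, serve_page):
    """Return (status, body); refuse blocked clients immediately."""
    if remote_addr in BLOCKLIST:
        return (403, "Forbidden")
    return (200, serve_page())

# A blocked client never triggers the expensive page build:
status, body = handle_request("192.0.2.17", lambda: "big page" * 1000)
```

So refusing does cost you something, since you still accept the connection and send a reply, but it is a small fixed cost. To shed the load entirely you would have to drop the packets before they reach the server at all, i.e. filter the offending address at a router or firewall in front of the machine.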

It seems to me that the global Internet should have a
mechanism by which a runaway server can be cut off from the
Internet, that is to say, if its system admin doesn't respond
to repeated requests within a reasonable time to stop the
runaway server. Is there a central body that can be contacted
in such emergency situations?

--------------------------------------------------------------
ingram@netcom.com
==============================================================