Re: Defenses against bad robots

mred@neosoft.com
Fri, 17 May 1996 23:03:22 CST


** Reply to note from Benjamin Franz <snowhare@netimages.com> 05/17/96 5:21pm -0700

> It would be better to use a directory of static web pages with static
> links. A few hundred chained together with the last pointing to a CGI
> script to notify you of the trip. That way you don't pay any CGI
> load penalty unless the trip actually happens. Have the CGI record all the
> pertinent info and mail it to you. A short script could easily generate
> a few hundred chained static pages in a matter of seconds. Add that
> directory to your robots.txt file and the only thing you should see is
> rogue bots.
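
(For concreteness, here is a sketch of the sort of generator Benjamin
describes. The directory name, page-naming scheme, and CGI path are
placeholders of mine, not anything from his note.)

#!/usr/bin/env python
# Generate a chain of static trap pages, each linking only to the
# next; the last page points at a CGI script that records the hit.
import os

PAGES = 200                          # "a few hundred" chained pages
TRAP_DIR = "trap"                    # list this directory in robots.txt
CGI_URL = "/cgi-bin/trap-alert.cgi"  # hypothetical notification script

os.makedirs(TRAP_DIR, exist_ok=True)
for i in range(1, PAGES + 1):
    # Every page links only to its successor; the final page links
    # to the CGI script instead.
    target = CGI_URL if i == PAGES else "page%04d.html" % (i + 1)
    with open(os.path.join(TRAP_DIR, "page%04d.html" % i), "w") as f:
        f.write('<html><body><a href="%s">next</a></body></html>\n'
                % target)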

That assumes the robot goes a few hundred levels deep. If it's a well-written
malicious robot, it will be doing breadth-first gathering, in which case I
seriously doubt it will ever get that far. And if it does, your server will
already have taken a huge hit, since the robot will probably have retrieved
everything you have before finally reaching the trap.
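
(To make that concrete, here is a toy simulation of a breadth-first
crawler; all the numbers are invented. The site has 50 ordinary pages
plus a 200-page trap chain. Because a FIFO queue fetches every
shallower page before descending a level, the trap's last page is the
very last of the 251 URLs retrieved.)

from collections import deque

# Hypothetical site graph: root links to 50 ordinary pages plus the
# first trap page; each trap page links only to the next one down.
site = {"/": ["/page%d" % i for i in range(50)] + ["/trap/1"]}
for i in range(50):
    site["/page%d" % i] = []
for i in range(1, 200):
    site["/trap/%d" % i] = ["/trap/%d" % (i + 1)]
site["/trap/200"] = []

order, queue, seen = [], deque(["/"]), {"/"}
while queue:
    url = queue.popleft()            # FIFO queue => breadth-first
    order.append(url)
    for link in site.get(url, []):
        if link not in seen:
            seen.add(link)
            queue.append(link)

print(order.index("/trap/200"))      # prints 250: last of 251 URLs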

-Ed-
mred@neosoft.com
http://www.neosoft.com/~mred/