> 459 HEAD requests produce 459 server child-forks. Many of the requests
> will operate in parallel. If the server has an upper limit on forking,
> maybe the site will be saved; otherwise the paging volume will fill up.
With a two-minute delay between each HEAD?
I don't think so.
I am not even sure that this process can be called a robot.
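
For illustration, here is a rough sketch (in Python, with a made-up host
and URL list) of the access pattern actually being discussed: one HEAD at
a time, with a two-minute pause, so the server never handles more than one
of these requests at once:

    import http.client
    import time

    HOST = "www.example.com"                          # hypothetical host
    PATHS = ["/page%d.html" % i for i in range(459)]  # hypothetical URLs

    for path in PATHS:
        conn = http.client.HTTPConnection(HOST)
        conn.request("HEAD", path)     # headers only, no body transferred
        resp = conn.getresponse()
        print(path, resp.status, resp.getheader("Last-Modified"))
        conn.close()                   # the server child-fork ends here
        time.sleep(120)                # two minutes before the next request

At that rate 459 requests take over fifteen hours, which hardly looks
like a fork bomb.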
The next thing will be that Netscape is obliged to read robots.txt
to know whether it may automatically fetch the images referred to by
a page, or that cache servers need to read robots.txt to know whether
they are allowed to serve a given page. A sketch of that follows below.
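
If browsers and caches really were obliged to do this, every fetch would
have to start with something like the following sketch (using Python's
standard robots.txt parser; the site and user-agent string are made up):

    from urllib.robotparser import RobotFileParser

    # Hypothetical site; fetch and parse its robots.txt once.
    rp = RobotFileParser("http://www.example.com/robots.txt")
    rp.read()

    url = "http://www.example.com/images/logo.gif"
    agent = "Mozilla/4.0"              # illustrative user-agent string
    if rp.can_fetch(agent, url):       # consult robots.txt before fetching
        print("allowed to fetch", url)
    else:
        print("robots.txt disallows", url)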
--
Bjørn-Olav Strand - bolav@skiinfo.no
http://www.skiinfo.no/webinfo/ - http://www.skiinfo.no/