> Define a "page" !
A single HTML document, returned by a single request to
a server's hypertext transfer daemon (httpd) via a URL of the
form:
http://some_server/some_file
And this is what ActiveAgent can process, at a rate of
about 1-2 pages per second (on a 130 MHz Pentium with a 28800 bps
modem connection).
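For concreteness, the definition above amounts to nothing more than a
single HTTP GET: one request, one URL, one HTML document back. A minimal
sketch in Python (the URL is the same placeholder as above, not a real
server, and this is an illustration rather than ActiveAgent's actual code):

    from urllib.request import urlopen

    def fetch_page(url):
        # One request to the server's httpd; the response body is
        # the "page" in the sense defined above.
        with urlopen(url) as response:
            return response.read().decode("utf-8", errors="replace")

    html = fetch_page("http://some_server/some_file")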
> click <here> to turn the satellite 0.5 degrees, map the surface
> of the earth in a 100 mile radius and image process the results
> to determine how many trees have fallen this week...
Does anybody really believe there's a page somewhere like this?
Maybe it's naivete, but this seems a bit overstated to me.
The overwhelming majority of page requests are for archived
data; some are requests for execution of a CGI program. How
many are requests for control of multi-million-dollar equipment?
... Robert