Surely you must realize the importance of definitions.
When we can't agree on what a "robot" is, how can we agree
on what it should do, how it should behave, and so on?
It's still unclear to me what the difference is between a
person manually browsing pages and a person manually
instructing their "agent" to browse those same pages.
When the "agent" is based on a server, then (and in my
opinion, ONLY then), can one seriously believe that the
program can rapid-fire requests to another server.
Take ActiveAgent as an example: on a 28,800 bps modem,
experience shows that it processes about 1-2 pages per second.
Any server which cannot withstand that should really be
considering an upgrade.
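To make that rate concrete, here is a minimal sketch in Python of
a client that paces itself to roughly one page per second. The URL
list and the one-second budget are purely illustrative assumptions,
not anything ActiveAgent itself does:

    import time
    import urllib.request

    # Hypothetical list of pages to fetch.
    urls = [
        "http://example.com/page1.html",
        "http://example.com/page2.html",
    ]

    MIN_DELAY = 1.0  # seconds per request, i.e. at most ~1 page/second

    for url in urls:
        start = time.monotonic()
        with urllib.request.urlopen(url) as response:
            body = response.read()
        print("fetched %s: %d bytes" % (url, len(body)))
        # Sleep off whatever remains of the one-second budget
        # before issuing the next request.
        elapsed = time.monotonic() - start
        if elapsed < MIN_DELAY:
            time.sleep(MIN_DELAY - elapsed)

Even a client throttled like this stays within the 1-2 requests per
second that any reasonable server should be able to absorb.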
As for access restrictions, a standard for that is already
in place: Unix access privileges. Don't want people
(and/or their "agents") to access something? Then do NOT
give that something "world" readability. My interpretation
of the word "world" includes users AND their robots, agents,
spiders, or whatever you want to call them.
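For what it's worth, a minimal sketch of that idea in Python
(the file name is just an example): clear the "other"/world bits
on a file so that neither casual users nor their robots can read
it through the filesystem:

    import os
    import stat

    path = "private.html"  # hypothetical file we don't want "world" to read

    mode = os.stat(path).st_mode
    # Clear the read and execute bits for "other", i.e. the world.
    os.chmod(path, mode & ~(stat.S_IROTH | stat.S_IXOTH))

The same effect comes from the usual "chmod o-r" on the command
line; the point is simply that the mechanism already exists.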
... Robert