> Erik Selberg <selberg@cs.washington.edu> wrote:
>
> > True, although by adding categories, we can define "robot" as
> > anything non-human, e.g. proxies, page watchers, indexers, etc.
>
> By that definition, Netscape Navigator and any other hypertext
> browser is "non-human". How much human interaction with the
> program is required, to define its actions as "human"? Or, how
> much autonomy must a program be allowed, for it to be defined as
> "robotic"?
I must have missed something! Does Netscape Navigator check the robots.txt
file to see whether it can fetch a page? I was under the impression that
an agent is a robot if its creators *say* it's a robot. (Of course, it
might take some friendly persuasion from the entire WWW to convince them
that what they have constructed is in fact a robot.)
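For what it's worth, "checking the robots.txt file" before fetching is a small amount of code. Below is a minimal sketch of the idea using Python's standard urllib.robotparser; the agent name "ExampleBot" and the example.com URLs are placeholders for illustration, not anything from this thread:

    # Minimal sketch: consult /robots.txt before fetching a page.
    # "ExampleBot" and the URLs are made-up examples.
    from urllib.robotparser import RobotFileParser

    robots = RobotFileParser()
    robots.set_url("http://example.com/robots.txt")
    robots.read()  # fetch and parse the site's /robots.txt

    page = "http://example.com/private/index.html"
    if robots.can_fetch("ExampleBot", page):
        print("robots.txt allows fetching", page)
    else:
        print("robots.txt disallows fetching", page)

A browser driven click-by-click by a person never does this; an indexer or page watcher arguably should, which is more or less the distinction being argued about above.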
--
Arthur Matheny                  Academic Computing, University of South Florida
matheny@usf.edu                 LIB 612, Tampa, FL 33620
813-974-1795  FAX: 813-974-1799 http://www.acomp.usf.edu/