> At 9:43 AM 11/21/96, Erik Selberg wrote:
>
> >Hmm... rather than using a User-Agent, perhaps using a User-Type
> >similar to Content-type (someone else made a similar suggestion, now
> >that I think about it). For example:
>
>
> Well, there might actually be benefit to that beyond /robots.txt
>
> Maybe the categories for the robots database can be a starting point:
>
[snip]
>
> We are arguing about the definition for a robot every other day;
> now we can start arguing about the definition for 20 categories :-)

True, although by adding categories, we can define "robot" as anything
non-human, e.g., proxies, page watchers, indexers, etc. Think of it as
trading one big argument for lots of little ones. :)
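
For concreteness, it might look something like the sketch below. This is
purely hypothetical syntax -- the "User-Type" header name, the category
names, and the category-scoped /robots.txt records are all made up for
illustration; nothing like this is specified anywhere yet.

    GET /index.html HTTP/1.0
    User-Agent: ExampleCrawler/1.0
    User-Type: robot/indexer

A server's /robots.txt could then exclude by category instead of by agent
name, e.g.:

    # hypothetical category-scoped records
    User-Type: robot/proxy
    Disallow:

    User-Type: robot/indexer
    Disallow: /cgi-bin/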
-Erik
--
Erik Selberg
"I get by with a little help from my friends."
selberg@cs.washington.edu
http://www.cs.washington.edu/homes/selberg