Re: RFC, draft 1

Martijn Koster (m.koster@webcrawler.com)
Tue, 19 Nov 1996 11:29:40 -0800


At 2:15 PM 11/16/96, Darren Hardy wrote:

>The Allow/Disallow rules are still geared toward excluding
>resources from robots. If we added a 'Suggest: URL' feature,

Which I'd like to.

>I'm not sure that Visit-Time is the best way to throttle robots.
>Request-Rate also sounds too difficult for Robots to implement unless
>it was very simple like 1000 URLs / day, but 10 URLs / minute is too
>fine-grained. Then, I'd question its usefulness.

Agreed.
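For readers following along, a record using the proposed throttling directives might look something like this. This is only a sketch of the syntax under discussion in the draft; the exact field names and value formats (time ranges, rate units) have not been finalized:

```
# Hypothetical record using the draft's proposed directives
User-agent: *
Disallow: /private/
Visit-Time: 0200-0500      # only crawl between 02:00 and 05:00 (server local time assumed)
Request-Rate: 1000/1d      # coarse-grained: at most 1000 URLs per day
```

A coarse unit like URLs per day is easy for a robot to honor with a simple counter; a fine-grained rate like URLs per minute would require per-host scheduling state, which is the implementation burden Darren raises above.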

-- Martijn

Email: m.koster@webcrawler.com
WWW: http://info.webcrawler.com/mak/mak.html

_________________________________________________
This message was sent by the robots mailing list. To unsubscribe, send mail
to robots-request@webcrawler.com with the word "unsubscribe" in the body.
For more info see http://info.webcrawler.com/mak/projects/robots/robots.html