>The Allow/Disallow rules are still geared toward excluding
>resources from robots. If we added a 'Suggest: URL' feature,
Which I'd like to.
>I'm not sure that Visit-Time is the best way to throttle robots.
>Request-Rate also sounds too difficult for robots to implement unless
>it were something very simple like 1000 URLs / day; 10 URLs / minute is
>too fine-grained. And then I'd question its usefulness.
Agreed.
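For what it's worth, even a coarse Request-Rate would at least be trivial for a robot to honour. A minimal sketch in Python (the directive name and the "<count>/<period>" syntax here are just the proposal under discussion, not anything in the standard):

```python
# Hypothetical sketch: turn a proposed "Request-Rate: <count>/<period>"
# robots.txt line into a minimum delay (in seconds) between fetches.
# The directive and its syntax are assumptions based on this thread,
# not part of the robots exclusion standard.

PERIOD_SECONDS = {"second": 1, "minute": 60, "hour": 3600, "day": 86400}

def min_delay(line):
    """Return seconds to wait between requests, or None if unparsable."""
    field, _, value = line.partition(":")
    if field.strip().lower() != "request-rate":
        return None
    count, _, period = value.strip().partition("/")
    try:
        n = int(count)
        secs = PERIOD_SECONDS[period.strip().lower()]
    except (ValueError, KeyError):
        return None
    if n <= 0:
        return None
    return secs / n

# A coarse rate is easy to honour: 1000 URLs/day works out to one
# request every 86.4 seconds.
print(min_delay("Request-Rate: 1000/day"))
```

The coarse form really is simple: the robot just sleeps for the computed delay between requests to that host. It's the fine-grained per-minute accounting that gets fiddly.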
-- Martijn
Email: m.koster@webcrawler.com
WWW: http://info.webcrawler.com/mak/mak.html
_________________________________________________
This message was sent by the robots mailing list. To unsubscribe, send mail
to robots-request@webcrawler.com with the word "unsubscribe" in the body.
For more info see http://info.webcrawler.com/mak/projects/robots/robots.html