> I was thinking about this recently, perhaps a new SRE directive would help, to
> indicate at what time of the day / week / month, etc., a robot can access the
> site, and how many pages / bytes they can retrieve per access.
>
>
I like this.  Webmasters should have an idea of their load levels and
could exclude robots at certain times of day.
For now, a couple of cron jobs could swap different robots.txt files
into place at the appropriate times.
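Something like the following pair of crontab entries would do it; the
filenames, paths, and hours are only placeholders for whatever fits
your own load pattern:

    # Serve a restrictive robots.txt during peak hours and a permissive
    # one overnight.  Paths and times below are hypothetical examples.
    0 9  * * *   cp /path/to/robots.peak.txt     /path/to/htdocs/robots.txt
    0 18 * * *   cp /path/to/robots.offpeak.txt  /path/to/htdocs/robots.txt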
======================================================================
Jeremy Sigmon B.S. ChE |
Web Developer of the Robert C. Byrd Health | Use
Sciences Center of West Virginia University | FreeBSD
WWW.HSC.WVU.EDU | Now
Graduate Student in Computer Science |
Office : 293-1060 |