I'm a bit confused about how often I should update my local copy of a site's
/robots.txt file. Clearly I shouldn't check it on every access, since that
would double the number of requests my robot makes to a site.
I've seen nothing in our server's access logs to suggest that any of the
robots visiting our site ever performs a HEAD request for /robots.txt
(which would indicate it was checking the Last-Modified header).
So how about it? How often should /robots.txt be checked?
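For what it's worth, one common compromise is to cache the parsed file per
host and refetch it only after it goes stale. Here's a minimal sketch in
Python using the standard-library robots.txt parser; the 24-hour TTL and the
injectable `fetch` callable are my own assumptions, not anything mandated by
the robots.txt convention:

```python
import time
import urllib.robotparser
from urllib.parse import urlsplit


class RobotsCache:
    """Cache one parsed /robots.txt per host, refetching after a TTL.

    The 24-hour default TTL is an illustrative assumption, not a standard.
    `fetch` is a hypothetical injectable callable (url -> robots.txt text)
    so the cache can be exercised without network access.
    """

    def __init__(self, fetch, ttl=24 * 3600):
        self.fetch = fetch
        self.ttl = ttl
        self._cache = {}  # host -> (fetched_at, RobotFileParser)

    def can_fetch(self, useragent, url):
        host = urlsplit(url).netloc
        now = time.time()
        entry = self._cache.get(host)
        if entry is None or now - entry[0] > self.ttl:
            # Stale or missing: fetch and parse a fresh copy.
            rp = urllib.robotparser.RobotFileParser()
            rp.parse(self.fetch("http://%s/robots.txt" % host).splitlines())
            entry = (now, rp)
            self._cache[host] = entry
        return entry[1].can_fetch(useragent, url)
```

With this scheme the robot adds only one extra request per site per TTL
window, rather than one per page fetched.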
Thx,
Skip Montanaro skip@calendar.com (518)372-5583
Musi-Cal: http://www.calendar.com/concerts/ or mailto:concerts@calendar.com
Internet Conference Calendar: http://www.calendar.com/conferences/
>>> ZLDF: http://www.netresponse.com/zldf <<<