backdraft-bbn.infoseek.com - - [04/Nov/1996:01:27:35 -0500] "GET /robots.txt HTTP/1.0" 404 207
backdraft-bbn.infoseek.com - - [04/Nov/1996:01:27:40 -0500] "GET /list_archives/webph/0066.html HTTP/1.0" 200 2642
backdraft-bbn.infoseek.com - - [04/Nov/1996:01:28:04 -0500] "GET /list_archives/webph/0069.html HTTP/1.0" 200 3181
backdraft-bbn.infoseek.com - - [04/Nov/1996:01:30:19 -0500] "GET /robots.txt HTTP/1.0" 404 207
backdraft-bbn.infoseek.com - - [04/Nov/1996:01:30:19 -0500] "GET /~como/historia/alblit.shtml HTTP/1.0" 404 207
backdraft-bbn.infoseek.com - - [04/Nov/1996:01:31:09 -0500] "GET /list_archives/webph/0062.html HTTP/1.0" 200 4653
backdraft-bbn.infoseek.com - - [04/Nov/1996:01:31:26 -0500] "GET /robots.txt HTTP/1.0" 404 207
backdraft-bbn.infoseek.com - - [04/Nov/1996:01:31:27 -0500] "GET /list_archives/
Their robot is requesting robots.txt every single time it wants to get
something from our server! What's the deal with that? Our server (and
their robot) must do twice the work - bad.
Does anyone know if this is just a temporary problem, or if it has always
been like this?
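(For comparison, here is a minimal sketch of the behavior one would expect
from a well-behaved crawler: fetch robots.txt once per host, cache the parsed
result, and reuse it for later requests. This uses Python's standard
urllib.robotparser purely for illustration - the cache layout and function
names are assumptions, not a description of how Infoseek's robot actually
works.)

    from urllib import robotparser
    from urllib.parse import urlsplit

    # One parsed robots.txt per host, so the file is fetched only once
    # per host instead of before every single request.
    _robots_cache = {}

    def allowed(user_agent, url):
        host = urlsplit(url).netloc
        rp = _robots_cache.get(host)
        if rp is None:
            rp = robotparser.RobotFileParser()
            rp.set_url("http://%s/robots.txt" % host)
            rp.read()  # single fetch; a 404 is treated as "everything allowed"
            _robots_cache[host] = rp
        return rp.can_fetch(user_agent, url)

With a cache like this, the robots.txt request shows up once per host in the
access log rather than alongside every page fetch.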
Thanks.
Otis
_________________________________________________
This message was sent by the robots mailing list. To unsubscribe, send mail
to robots-request@webcrawler.com with the word "unsubscribe" in the body.
For more info see http://info.webcrawler.com/mak/projects/robots/robots.html