libwww-perl is a Perl library with its own robots.txt handling code,
although that doesn't mean it's what is being used in this case. I believe
the 4.x releases are for Perl 4 and the 5.x releases for Perl 5.
I've just had a look through the code which handles robot exclusions in
libwww-perl 5.04, and at what I assume is the robots.txt you refer to
(http://www.imdb.com/robots.txt), and I can't see any obvious problems;
a quick test program confirms this.
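For anyone wanting to reproduce that kind of quick check: the sketch below uses Python's standard-library urllib.robotparser as a rough analogue of libwww-perl's WWW::RobotRules (the sample robots.txt rules here are invented for illustration, not IMDb's actual file).

```python
import urllib.robotparser

# Invented robots.txt content for illustration only -- not IMDb's real file.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# A path under /private/ should be disallowed for any user agent...
print(rp.can_fetch("MyRobot/1.0", "http://www.example.com/private/page.html"))  # False
# ...while other paths remain allowed.
print(rp.can_fetch("MyRobot/1.0", "http://www.example.com/title/tt0000001/"))   # True
```

If a robot honours the parsed rules and still fetches an excluded page, the problem is likely elsewhere in that robot, not in the exclusion-parsing code.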
Olly
_________________________________________________
This message was sent by the robots mailing list. To unsubscribe, send mail
to robots-request@webcrawler.com with the word "unsubscribe" in the body.
For more info see http://info.webcrawler.com/mak/projects/robots/robots.html