Out of approximately 160,000 discrete sites visited by fido (the robot
for PlanetSearch), roughly 1.5% have a robots.txt file. Those that do
exist average about 4.5 lines in length.
My feeling is that until the creation and maintenance of
robots.txt files are automated, only people who truly understand the
implications of robots (or whose sites are pounded mercilessly by
ill-behaved robots) will use them.
If Netscape, Apache, etc. provided a simple-to-use tool for
restricting robot access and then generated a robots.txt file from
those settings, I expect we'd see more robots.txt files in use, and
more that are syntactically correct. The syntactic variations in
existing robots.txt files are truly amazing.
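
For reference, here is a minimal sketch of the kind of well-formed file
such a tool would need to emit. The paths are hypothetical, and the
user-agent token for fido is assumed; the point is only to show the
record structure defined by the robots exclusion standard (User-agent
and Disallow fields, records separated by blank lines, "#" comments):

    # Keep all robots out of two hypothetical directories
    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /tmp/

    # A separate record for PlanetSearch's robot; an empty Disallow
    # value means fido may retrieve everything
    User-agent: fido
    Disallow: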
Steve
--
Steve DeJarnett                     Internet: steved@planetsearch.com
PlanetSearch                        http://www.planetsearch.com/
    A service of the Philips Multimedia Center

_________________________________________________
This message was sent by the robots mailing list. To unsubscribe, send
mail to robots-request@webcrawler.com with the word "unsubscribe" in the
body. For more info see
http://info.webcrawler.com/mak/projects/robots/robots.html