I follow this list occasionally, so forgive me if someone is already
doing this.
Wouldn't one solution be a website with a link such as "Make
ROBOTS.TXT"? Selecting it would present an HTML form for specifying
the basic options that go into ROBOTS.TXT, and clicking a submit
button would email you the generated file. Seems like a pretty simple
Perl application to me. If anyone is interested, email me ...
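
Something along these lines might do the trick on the CGI side (a
rough sketch only; the form field names "useragent" and "disallow" are
just placeholders, and it returns the file directly instead of mailing
it):

  #!/usr/bin/perl -w
  # Sketch of the robots.txt generator CGI described above.
  use strict;
  use CGI qw(:standard);

  my $agent    = param('useragent') || '*';  # default: apply to all robots
  my @disallow = param('disallow');          # zero or more paths to exclude

  # A record is one User-agent line followed by its Disallow lines.
  my $file = "User-agent: $agent\n";
  $file .= "Disallow: $_\n" for @disallow;
  $file .= "Disallow:\n" unless @disallow;   # empty Disallow = no restrictions

  # Print it back; mailing it instead would just mean piping $file
  # through sendmail.
  print header('text/plain'), $file;

The matching HTML form would only need a text field named "useragent"
and one or more fields named "disallow".
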
Scott Johnson
Chief Technology Strategist
Dataware Technologies
>----------
>From: Steve DeJarnett[SMTP:steved@pmc.philips.com]
>Sent: Thursday, November 14, 1996 2:51 PM
>To: robh@imdb.com
>Cc: robots@webcrawler.com
>Subject: Re: changes to robots.txt
>
>Rob Hartill wrote:
>>
>> Out of curiosity, how widely used is robots.txt? Anyone have some
>> figures?
>
> Out of approximately 160,000 discrete sites visited by fido (the robot
>for PlanetSearch), roughly 1.5% of the sites have a robots.txt
>file.
>
> Of those that have one, the files average about 4.5 lines in length.
>
> My feeling is that until the creation and maintenance of
>robots.txt files are automated, only people who truly understand the
>implications of robots (or whose sites are pounded mercilessly by
>ill-behaved robots) will use them.
>
> If Netscape, Apache, etc. provided a simple-to-use tool for
>restricting access and then used it to generate a robots.txt file,
>I expect we'd see more robots.txt files in use, and more that are
>syntactically correct. The syntactic variations in existing
>robots.txt files are truly amazing.
>
> Steve
>
>--
>Steve DeJarnett Internet: steved@planetsearch.com
>PlanetSearch http://www.planetsearch.com/
> A service of the Philips Multimedia Center
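
For reference, a minimal, syntactically correct robots.txt under the
1994 exclusion standard is just a User-agent line followed by Disallow
lines (the paths here are only examples):

  User-agent: *
  Disallow: /cgi-bin/
  Disallow: /tmp/
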
_________________________________________________
This message was sent by the robots mailing list. To unsubscribe, send mail
to robots-request@webcrawler.com with the word "unsubscribe" in the body.
For more info see http://info.webcrawler.com/mak/projects/robots/robots.html