Re: Horror story

Skip Montanaro (skip@automatrix.com)
Sun, 14 Jan 1996 15:05:31 -0500



> Seriously, how about some sort of concerted effort to educate webmasters
> everywhere that with two minutes of their time they can make everyone's
> life better: fewer visits to their site, less junk in indexes (indices?).

Sounds good in principle. In practice, however, since most people want to
tout as many "hits" as possible, they may be less inclined than you might
think to squelch robots.

Here are a few suggestions:

1. Every site that uses robots (Lycos, Alta Vista, Webcrawler, ...)
should have an easily found link to Martijn's norobots page. I know
some do already. Others mention robots.txt but don't provide a link.

2. If possible, expose several "load-and-go" annotated robots.txt files
(maybe on Martijn's site), each with a clear statement of the particular
file's goals; a rough sketch of what I mean appears after this list. I
know there are a few on the norobots page, but I doubt there are very
many sites with /cyberworld directories.

3. Every robot site that supports URL inputs should mention robots.txt
in both the submission form and the submission response page.

4. How about a robots.txt creation Web form?

5. Are there some good non-Web places to get a little publicity? What
about WebWeek, Interactive Age, and other Internet rags? Could a short
article be written?

6. All the major Web servers should come with a little blurb about
robots.txt.

7. How about a little IMG like the Point Communications Top 5% graphic
that points to the norobots site? Anybody with a robots.txt file could
display it proudly (we are, after all, a pretty elite group if Louis's
message is indicative of reality). It could have a little image and a
catchy phrase like:

robots.txt - the diaphragm for your Web server
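
For what it's worth, here's a minimal annotated file along the lines of #2.
The directory names are only placeholders; each site would substitute
whatever it actually wants kept out of the indexes:

    # Annotated example robots.txt
    # Goal: let robots index the public pages, but keep them out of
    # the script and scratch areas (placeholder directories below).
    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /tmp/

A companion file with "User-agent: *" and an empty "Disallow:" line would
say the opposite: index everything.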

--
Skip Montanaro     |  Looking for a place to promote your music venue, new CD
skip@calendar.com  |  or next concert tour?  Place a focused banner ad in
(518)372-5583      |  Musi-Cal!  http://www.calendar.com/concerts/