Re: Horror story

Ted Sullivan (tsullivan@blizzard.snowymtn.com)
Sun, 14 Jan 1996 17:38:00 -0800


3. Every robot site that supports URL inputs should mention robots.txt
in both the submission form and the submission response page.

How about somebody, say maybe Louis since he would certainly have the
resources (I would do it, but I don't have a budget for that kind of thing,
unless of course somebody came up with a few man-weeks), running your little
spider again against your complete data set of URLs, and while looking for
links on ONLY the topmost page of each site, recording any mail address that
has "webmaster..." or something similar in it.
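The address-harvesting step could be sketched roughly like this (a minimal
illustration only; the regular expression and the "webmaster" heuristic are
my assumptions about what "or something similar" might mean, not part of
anybody's actual spider):

```python
import re

# Rough pattern for an e-mail address appearing in a page's HTML.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+(?:\.[\w-]+)+")

def find_webmaster_emails(html: str) -> list[str]:
    """Return addresses on a page whose local part mentions 'webmaster'."""
    found = []
    for addr in EMAIL_RE.findall(html):
        local = addr.split("@", 1)[0].lower()
        if "webmaster" in local and addr not in found:
            found.append(addr)
    return found
```

The spider would run this over just the top page of each site and save the
results alongside the site's URL for the later mailing.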

Then send a little message to each webmaster saying, in effect, that a group
of sites that run spiders (mention a few of the big ones that are on this
mailing list) has informally gotten together, and that in order to properly
index your site in the future we would like to make a little suggestion...
we have noticed that you do not have a robots.txt file... Then include a
short example that the webmaster could cut and paste into the file system,
with a short tutorial on how to do it.
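The cut-and-paste example in that message might look something like this
(illustrative only; the excluded directories here are placeholders each
webmaster would adjust for their own site):

```
# robots.txt -- place this file at the top level of your web server,
# e.g. http://www.example.com/robots.txt
# Let all robots index everything except temporary and script areas:
User-agent: *
Disallow: /tmp/
Disallow: /cgi-bin/
```

Two Disallow lines and a wildcard User-agent line is really all most sites
would need to get started.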

Two things would happen: 1) lots of people would get the message, and 2) if
you sent out >100,000 e-mail messages in one weekend, surely some of the
trade publications would write up a few articles on our behalf after their
own webmaster got a message and realized that it was sent to the world.

We would not hit everybody, but it would sure hit a lot of the sites that
could do the two-minute piece of work and set it up properly.

Ted Sullivan
tsullivan@snowymtn.com