Re: Suggestion to help robots and sites coexist a little better

Scott 'Webster' Wood (swood@thewild.com)
Wed, 17 Jul 1996 07:48:39 -0400 (EDT)


>
> Second, this doesn't solve one of the really big problems, which is that
> many people who publish information on the Web don't have any kind of
> administrative access to the server. They can't modify robots.txt, they
> can't set any server parameters... nor should they be able to.
>
Why not propose a form of robots.txt, or some other file, that can
include information for robots at the root of each directory? Such a file
could be put in user directories and include the necessary information on
each resource available there, saving the robot from having to download
all the information therein for indexing. A simpler approach might even
just include instructions allowing or preventing the robot from accessing
specific pages in that tree.
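
For illustration only, such a per-directory file might reuse the existing
robots.txt syntax. The file name, its location, and the use of relative
paths here are my assumptions, not part of any agreed format:

    # hypothetical per-directory file, e.g. /~swood/project/robots.txt
    User-agent: *
    Disallow: drafts/
    Disallow: notes-old.html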

This approach could obviously increase server load on systems with
complex directory structures, but I imagine that instructions in the
administrator's robots.txt at the root of the server itself could tell
any entering robot whether or not the server is set up to support
additional instructions per directory. That would tell the robot whether
or not it should look for special instructions in the individual directory
trees that may be controlled by each user.
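
As a rough sketch of the robot side of this, assuming a hypothetical
"Per-directory: yes" line in the root robots.txt as the opt-in signal and
"robots.txt" as the name of the per-directory file (both assumptions, not
part of any existing standard), a crawler might do something like:

    # Sketch of robot-side logic for the proposal above. The directive
    # name, the per-directory file name, and the reuse of Disallow lines
    # are all assumptions made for illustration.
    from urllib.parse import urljoin, urlparse
    from urllib.request import urlopen
    from urllib.error import URLError


    def fetch_lines(url):
        """Return the lines of a text resource, or None if unreachable."""
        try:
            with urlopen(url) as resp:
                return resp.read().decode("latin-1", "replace").splitlines()
        except URLError:
            return None


    def server_supports_per_directory(base_url):
        """Check the root robots.txt for the (hypothetical) opt-in line."""
        lines = fetch_lines(urljoin(base_url, "/robots.txt")) or []
        return any(line.strip().lower() == "per-directory: yes"
                   for line in lines)


    def directory_disallows(page_url):
        """Look for a per-directory robots.txt next to the page and honour
        any Disallow lines, treating paths as relative to that directory
        (a simplification; only the page's own name is matched here)."""
        directory = page_url.rsplit("/", 1)[0] + "/"
        lines = fetch_lines(urljoin(directory, "robots.txt")) or []
        page_name = page_url.rsplit("/", 1)[-1]
        for line in lines:
            if line.lower().startswith("disallow:"):
                path = line.split(":", 1)[1].strip()
                if path and page_name.startswith(path):
                    return True
        return False


    def may_fetch(page_url):
        """Only consult per-directory files when the server's root
        robots.txt says the feature is in use."""
        base = "{0.scheme}://{0.netloc}".format(urlparse(page_url))
        if not server_supports_per_directory(base):
            return True  # ordinary robots.txt rules apply (not shown)
        return not directory_disallows(page_url)

In this scheme a server that never advertises the opt-in line costs the
robot nothing extra, which keeps the added load limited to sites that
choose to support per-directory instructions.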

Scott