Re: in-document directive to discourage indexing ?

arutgers (arutgers@asterix.helix.net)
Fri, 21 Jun 1996 15:01:44 -0700


Hi,

	I think some thought should be put into adding more tags that are
for robot use only. HTML (and the images, sound, Java, etc. that go with
it) makes up a large portion of Internet traffic, and only a small part
of it is for robots. The more that is added for robots alone, the more
bandwidth is consumed sending that information to users who don't want
it and probably don't know it exists. One obvious solution that was
discussed is putting the meta-info in a separate file, but that is
difficult to maintain.
	It could work, though, if the server automatically stripped the
meta information when a request comes from a non-robot. Frequently used
files could be kept in two versions: the normal one and a '.dmhtml'?
(demetaed) version that is sent to users and updated from the original
automatically by the server whenever the file changes, or perhaps
checked once a week. This would take more CPU time and disk space but
less bandwidth. It would also mean that, in order to get the meta-info,
a robot would have to identify itself as a robot. One would have to talk
server makers into doing this; they could advertise their servers as
robot-friendly bandwidth conservers.
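
	To make the idea concrete, here is a minimal sketch of the
demetaing step in Python, assuming the robot announces itself in the
User-Agent header. The names here (ROBOT_AGENTS, is_robot, demeta,
serve) are made up for illustration and not part of any existing server:

import os
import re

# Hypothetical list of substrings that mark a robot's User-Agent.
ROBOT_AGENTS = ("robot", "spider", "crawler")

# For simplicity this strips every <META ...> tag from the document.
META_TAG = re.compile(r"<META[^>]*>\s*", re.IGNORECASE)

def is_robot(user_agent):
    ua = user_agent.lower()
    return any(name in ua for name in ROBOT_AGENTS)

def demeta(html):
    """Strip META tags so ordinary users don't pay for robot-only info."""
    return META_TAG.sub("", html)

def serve(path, user_agent):
    """Return the page body, using a cached '.dmhtml' copy for users."""
    if is_robot(user_agent):
        with open(path) as f:
            return f.read()          # robots get the full original file
    cache = path + ".dmhtml"         # demetaed sibling file
    # Rebuild the cache when it is missing or older than the original.
    if (not os.path.exists(cache)
            or os.path.getmtime(cache) < os.path.getmtime(path)):
        with open(path) as f:
            stripped = demeta(f.read())
        with open(cache, "w") as f:
            f.write(stripped)
    with open(cache) as f:
        return f.read()

The modification-time check is what makes the trade-off: the server
spends CPU and disk on the cached copy only when a file actually
changes, and every user request after that saves bandwidth.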

Andrew