> > I therefore propose adding two new fields to robots.txt, to indicate both
> > preferred and obsolete domain names for a given server. For example:
> >
> > use-domain: preferred.domain.name
> > obsolete-domain: obsolete.domain.name
>
> This seems like a really useful idea. It certainly isn't robot exclusion, but
> robots.txt seems as appropriate a place as any for it. (It could go in another
> file as well.) I'd like to see something like this, though:
>
> use-name: myhost.mydomain.edu
>
> which would let a robot know which hostname to use in the URLs it returns for
> that site when the host goes by many different names.
>
> Anyone?
>
> Issac
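The fields proposed above could be parsed out of robots.txt alongside the usual
exclusion records. A minimal sketch of what a robot might do, assuming the
field names use-domain, obsolete-domain, and use-name exactly as proposed (the
parsing conventions here, e.g. comment stripping and case-insensitive keys, are
my own assumptions):

```python
# Sketch of a parser for the proposed robots.txt naming fields.
# Field names (use-domain, obsolete-domain, use-name) come from the
# proposals above; everything else is an assumption for illustration.

def parse_naming_fields(robots_txt):
    """Return the proposed naming fields found in a robots.txt body."""
    fields = {"use-domain": None, "obsolete-domain": [], "use-name": None}
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments, robots.txt style
        if ":" not in line:
            continue
        key, _, value = line.partition(":")
        key, value = key.strip().lower(), value.strip()
        if key in ("use-domain", "use-name"):
            fields[key] = value
        elif key == "obsolete-domain":
            fields[key].append(value)  # a server may list several old names
    return fields

example = """\
use-domain: preferred.domain.name
obsolete-domain: obsolete.domain.name
use-name: myhost.mydomain.edu
"""
print(parse_naming_fields(example))
```

A robot could then index the site under use-domain (or use-name) and fold any
URLs seen under an obsolete-domain into the preferred one.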
I would like to suggest that redirecting robots be done actively, via a
protocol, while use-name can be handled passively, via robots.txt.
Shaping the use of such a protocol into something reasonable, and avoiding
a global flood of "my hostname changed!" announcements, could be done via
the LINK protocol mentioned in another post to robots.
The hostname case would then be the briefest possible link spec, since the
hostname alone is the root URL for a server.
-john
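The passive use-name side amounts to a one-line URL rewrite: swap whatever
alias the robot crawled under for the canonical hostname before reporting the
URL. A hypothetical sketch (the function name and the example hosts are mine;
the splitting is done with Python's standard urllib.parse):

```python
# Sketch: rewrite a crawled URL onto the canonical use-name host.
# canonicalize() and the example hostnames are illustrative assumptions.
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url, use_name):
    """Return url with its hostname replaced by the site's use-name."""
    parts = urlsplit(url)
    # Keep scheme, path, query, and fragment; only the host changes.
    return urlunsplit((parts.scheme, use_name, parts.path,
                       parts.query, parts.fragment))

print(canonicalize("http://alias.mydomain.edu/docs/", "myhost.mydomain.edu"))
# -> http://myhost.mydomain.edu/docs/
```

Since the hostname alone is the root URL for a server, this is all the link
spec the hostname case needs.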