>> Unfortunately, it is currently impossible to tell robots which
>> domain name should be used for a particular site. Consequently, a robot
>> can continue to index under an obsolete name, until the domain actually
>> disappears and all the references become invalid.
I asked about this some time ago and was advised to include
<Meta http-equiv="URI" content="http://the.proper/location.html">
in the HEAD. I also used <Base Href.....> to make sure all links pointed
to the new location. The problem is, I'm not sure whether it helped :-)
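For anyone trying the same trick, the HEAD ended up looking roughly
like this (the URL here is just a placeholder):

    <HEAD>
    <TITLE>This page has moved</TITLE>
    <!-- non-standard hint; I was told some robots honour it -->
    <META HTTP-EQUIV="URI" CONTENT="http://new.host/page.html">
    <!-- make any remaining relative links resolve at the new site -->
    <BASE HREF="http://new.host/page.html">
    </HEAD>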
The next step I took was to run a CGI script on the old machine which
returned a 302 to any known robot and a redirect page to everything else.
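The script was nothing clever; here is a sketch of the idea in Python
(the robot names and URL are made up, and the real list was longer):

    #!/usr/bin/env python
    # CGI sketch: send a 302 to known robots, and a visible
    # pointer page to everyone else.  Names/URLs are examples.
    import os

    NEW_URL = "http://new.host/page.html"
    ROBOTS = ("lycos", "webcrawler", "scooter", "harvest")

    agent = os.environ.get("HTTP_USER_AGENT", "").lower()

    if any(name in agent for name in ROBOTS):
        # the CGI 'Status' header sets the HTTP status line
        print("Status: 302 Moved Temporarily")
        print("Location: " + NEW_URL)
        print()
    else:
        print("Content-Type: text/html")
        print()
        print("<HTML><BODY>This page has moved to")
        print('<A HREF="%s">%s</A>.</BODY></HTML>' % (NEW_URL, NEW_URL))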
I still had to email the maintainers of the search engines to drop the
old pages. I did have my mail address in the header of each page so that
the maintainers could mail me and confirm that the 'delete' request was
not a spoofed email... It still took six to nine months to complete the
move properly.
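(By 'header' I mean the HEAD of each page, something like

    <LINK REV="made" HREF="mailto:mgk@webfeet.co.uk">

though a visible mailto: link on the page would do the same job.)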
Anybody care to comment on these (or other) techniques? The Web
does seem to need a way of handling moves and renames... What will
HTTP/1.1 bring?
-- Martin Kiff mgk@webfeet.co.uk