Second, this doesn't solve one of the really big problems, which is that
many people who publish information on the Web don't have any kind of
administrative access to the server. They can't modify robots.txt, they
can't set any server parameters... nor should they be able to.
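For context, robots.txt is a single file at the server root, so only someone with server access can write it. A minimal sketch of what such a file looks like:

```text
# robots.txt -- must live at the server root, e.g. http://example.com/robots.txt
# (example.com and the paths below are hypothetical)
User-agent: *
Disallow: /private/
Disallow: /drafts/
```

Everything it says applies server-wide, which is exactly the problem for publishers who only control their own pages.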
This is why Fuzzy Mauldin led an effort at the W3C workshop on distributed
search to come up with robot instructions that could be encoded in
individual pages. This way, even if a thousand people are publishing on a
single server, each one has the ability to tell robots what to do with
regard to their pages.
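A sketch of what such a per-page instruction looks like, assuming the META-tag convention that grew out of that effort (the exact content values shown are the commonly used ones, not a quote from the workshop):

```html
<!-- Placed in the HEAD of an individual page; no server access needed -->
<html>
<head>
<title>My page</title>
<!-- Tell robots not to index this page and not to follow its links -->
<meta name="robots" content="noindex, nofollow">
</head>
<body>...</body>
</html>
```

Because the instruction travels with the page itself, each publisher on a shared server can set a different policy without touching any server configuration.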
If servers want to leverage that per-page info somehow, that will come in time, but any proposal like this has to take into account the limited privileges of Web publishers.
Nick