Generating robots.txt seems like something that would be appropriate
for automated tools. Just thinking off the top of my head, it
wouldn't be too hard to create a tool that ran on one's server,
checked for dynamically created links (say, by comparing what is in
the directory tree against what the HTML links actually point to),
and then updated robots.txt accordingly.
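A minimal sketch of what such a tool might look like, in Python.
Purely hypothetical: the document root path, the "link with no file on
disk means it's dynamic" heuristic, and all the names below are my own
assumptions, not any vendor's actual product.

    #!/usr/bin/env python3
    # Hypothetical sketch only: scan served HTML for local links, treat
    # any link with no matching file in the directory tree as
    # dynamically generated, and write Disallow rules for those paths
    # into robots.txt. DOCROOT and every heuristic here are assumptions.

    import os
    import re
    from urllib.parse import urlparse

    DOCROOT = "/var/www/html"                    # assumed document root
    HREF_RE = re.compile(r'href="([^"#?]+)', re.IGNORECASE)

    def linked_paths(docroot):
        """Collect local href targets from every HTML file under docroot."""
        found = set()
        for dirpath, _, files in os.walk(docroot):
            for name in files:
                if not name.endswith((".html", ".htm")):
                    continue
                with open(os.path.join(dirpath, name), errors="ignore") as f:
                    for href in HREF_RE.findall(f.read()):
                        if not urlparse(href).scheme:  # skip external links
                            found.add(href.lstrip("/"))
        return found

    def dynamic_paths(docroot, linked):
        """A linked path with no file on disk is presumed dynamic (CGI etc.)."""
        return sorted(p for p in linked
                      if not os.path.exists(os.path.join(docroot, p)))

    def write_robots(docroot, paths):
        """Rewrite robots.txt to keep robots out of the dynamic paths."""
        with open(os.path.join(docroot, "robots.txt"), "w") as f:
            f.write("User-agent: *\n")
            for p in paths:
                f.write("Disallow: /%s\n" % p)

    if __name__ == "__main__":
        write_robots(DOCROOT, dynamic_paths(DOCROOT, linked_paths(DOCROOT)))

Run it nightly from cron and the file stays current without anyone
touching it.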
The server vendors could include this in their software and advertise
that their product is robot-safe or some such thing...
The server manager wouldn't have to do a thing...
PF