Since loop avoidance is not the only reason one might want
to restrict robots, I believe that robots.txt should always
be honored. Particularly if his robots.txt says effectively
"no robots period" (please pardon me for not knowing the
exact syntax for this), then almost certainly his reason is
not just loop detection (unless, I suppose, every URL of his
results in a loop).
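(For the record, the blanket exclusion in the robots exclusion
standard is just two lines:

```
User-agent: *
Disallow: /
```

i.e. "all robots, keep out of everything.")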
If I personally had gone to the trouble of setting up a
robots.txt file, and a robot of any flavor ignored it, I'd
also be annoyed.
PF
ps. By the way, I hear that people in America will sue
at the drop of a cup of coffee, so if I were running a
robot, I'd think it in my own best interests to honor the
robots.txt file.