I disagree with that kind of solution. I think the problem here is the
way robots handle CGI programs. There is no question, IMHO, that the web
is going to become more and more complex, and that more and more (if not
most) web sites will be dynamic or somehow generated. What is needed is
a way to tell the difference between pages that are generated but
effectively static, and pages that are generated on the fly and are
highly dynamic. Any ideas out there as to how to achieve this?
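
For what it's worth, here is one crude first cut, just to get the
discussion going: combine URL patterns with whatever stability hints the
server already sends in its HTTP headers. This is only a sketch; the
path patterns and header rules below are my own assumptions, nothing
standardized (Python used purely for illustration):

# A crude classifier: given a URL and its HTTP response headers,
# guess whether a page is generated-but-stable or highly dynamic.
# The patterns and header rules are illustrative assumptions only.
from urllib.parse import urlparse

def looks_highly_dynamic(url, headers):
    """Return True if the page is probably regenerated on every
    request, False if it is probably stable enough to index.
    `headers` is a dict of HTTP response headers (case-sensitive
    keys here, for simplicity)."""
    parsed = urlparse(url)
    # Query strings and cgi-bin paths usually mean per-request output.
    if parsed.query or "/cgi-bin/" in parsed.path:
        return True
    # Last-Modified or Expires suggests the server considers the
    # content stable (and cacheable), even if it was generated.
    if "Last-Modified" in headers or "Expires" in headers:
        return False
    # No stability hints at all: assume dynamic, to be safe.
    return True

# Example:
print(looks_highly_dynamic("http://example.com/cgi-bin/search?q=robots", {}))
# -> True
print(looks_highly_dynamic("http://example.com/news/index.html",
                           {"Last-Modified": "Mon, 04 Nov 1996 10:00:00 GMT"}))
# -> False

A robot could also fetch a page twice and compare the bodies, but
header hints are much cheaper, and they reward servers that label
their generated-but-stable pages properly.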
Martin
_________________________________________________
This message was sent by the robots mailing list. To unsubscribe, send mail
to robots-request@webcrawler.com with the word "unsubscribe" in the body.
For more info see http://info.webcrawler.com/mak/projects/robots/robots.html