>The general problem is that we rewrite URLs to contain a session key
Indeed :-)
>Any ideas of how to deal with this type of problem.
Can't you simply omit the session key for visiting robots, discriminating by
User-agent? Scanning for "Robot" will catch at least WebCrawler :-)
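As a rough sketch of that idea (the `rewrite_url` function and `sid` parameter
name are hypothetical, and substring-matching the User-agent is only a crude
heuristic, not a reliable robot test):

```python
def rewrite_url(url, session_key, user_agent):
    """Append a session key to a URL, except for visiting robots.

    If the User-agent header mentions "robot" (case-insensitive),
    skip the session key so crawlers index clean, stable URLs.
    """
    if "robot" in (user_agent or "").lower():
        return url
    # Use '&' if the URL already carries a query string, '?' otherwise.
    sep = "&" if "?" in url else "?"
    return f"{url}{sep}sid={session_key}"
```

A browser request would get `page?sid=...` while a crawler announcing itself
as a robot would receive the bare URL.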
-- Martijn
Email: m.koster@webcrawler.com
WWW: http://info.webcrawler.com/mak/mak.html
_________________________________________________
This message was sent by the robots mailing list. To unsubscribe, send mail
to robots-request@webcrawler.com with the word "unsubscribe" in the body.
For more info see http://info.webcrawler.com/mak/projects/robots/robots.html