You mean a file?
>Instead of keeping robots at bay, why not have a
>protocol which allows permissions? A site could grant access, but
>only at certain levels.
Yup, this would be handy, and has been proposed many times.
But obviously not so handy that operators of large robots
have invested the time to implement it (yeah, WebCrawler included :-/)
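For concreteness, one way such a scheme could be layered on the existing
/robots.txt format is an explicit grant directive. This is only a sketch;
the "Allow" line and the record layout here are hypothetical, not part of
the current exclusion standard:

    # /robots.txt -- hypothetical permission-granting extension
    User-agent: WebCrawler
    Allow: /reports/        # explicitly invite indexing of this subtree
    Disallow: /cgi-bin/     # keep robots out of scripts
    Disallow: /tmp/         # and out of scratch space

    User-agent: *           # default record for all other robots
    Disallow: /

One simple interpretation: a robot picks the record matching its name
and applies the first rule whose path is a prefix of the requested URL,
so ordering within a record matters.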
>Perhaps keywords could be used so that the program would only go so
>far. There are problems with this (who determines the keywords?),
>but why not a robot thesaurus of some sort? It could be done if
>there's a protocol. Just an idea; any comments?
I don't understand what you mean by "keywords" here...
-- Martijn
Email: m.koster@webcrawler.com
WWW: http://info.webcrawler.com/mak/mak.html