Re: An extended version of the Robots...

Hallvard B Furuseth (h.b.furuseth@usit.uio.no)
Fri, 15 Nov 1996 13:25:10 +0100 (MET)


On Tue, 12 Nov 1996, Art Matheny wrote:
>
> There appears to be a problem with throwing *all* unknown robots into the
> same default rule set regardless of whether they understand the 2.0 spec
> or not. Has anyone suggested a special code for default version 2
> robots? For example:
>
> User-agent: other2_0
> Robot-version: 2.0
> Allow: *index.html
> Disallow: /images*
> ...
>
> User-agent: *
> Disallow: /images
> ...

There is no problem if we don't change the meaning of `Disallow:', at
least not with that example. Then we can just say

User-agent: *
Robot-version: 2.0
Allow: *index.html
Disallow: /images

A 1.0 robot will ignore Robot-version: and Allow:.
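The key assumption above is that a 1.0 parser skips fields it does not
recognize. A minimal sketch of such a parser (illustrative only, not any
real robot's code; the field names follow the examples above):

```python
# Hypothetical sketch: a 1.0-style parser that drops unknown directives,
# so Robot-version: and Allow: lines are simply ignored.

def parse_record(lines, known_fields=("user-agent", "disallow")):
    """Collect (field, value) pairs, skipping unrecognized fields."""
    rules = []
    for line in lines:
        line = line.split("#", 1)[0].strip()  # strip comments and whitespace
        if not line or ":" not in line:
            continue
        field, value = line.split(":", 1)
        field = field.strip().lower()
        if field in known_fields:  # a 1.0 robot keeps only the fields it knows
            rules.append((field, value.strip()))
        # unknown fields (Robot-version:, Allow:) fall through harmlessly
    return rules

record = [
    "User-agent: *",
    "Robot-version: 2.0",
    "Allow: *index.html",
    "Disallow: /images",
]
print(parse_record(record))
# a 1.0 parser sees only: [('user-agent', '*'), ('disallow', '/images')]
```

A 2.0 robot would pass a larger known_fields set and so pick up the
Allow: line as well.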

> A version 2 compliant robot would recognize "other2_0" and use that
> rule set, but a version 1 robot would ignore that rule set and use the
> bottom one.

That will work, of course. However, we'd have the same problem with
specific user-agents, so either it must be possible to specify the robot
version with other user agents as well:
User-agent: 2.0:thatspider
or User-Agent-2.0: thatspider
or robots.txt authors must spend time finding out the robot version of
each User-Agent. I'd prefer the former.
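The first syntax could be handled by splitting a version prefix off the
User-agent value; a 1.0 robot would just see an unfamiliar agent name
and not match the record. A hypothetical sketch (the prefix format is my
assumption, not part of any spec):

```python
# Hypothetical sketch of the "User-agent: 2.0:thatspider" proposal:
# split the value into a version prefix and the agent name proper.

def split_versioned_agent(value):
    """Return (version, agent); version is None when no prefix is present."""
    head, sep, tail = value.partition(":")
    # treat the prefix as a version only if it looks like dotted digits
    if sep and head.replace(".", "").isdigit():
        return head, tail.strip()
    return None, value.strip()

print(split_versioned_agent("2.0:thatspider"))  # ('2.0', 'thatspider')
print(split_versioned_agent("thatspider"))      # (None, 'thatspider')
```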

Regards,

Hallvard
_________________________________________________
This message was sent by the robots mailing list. To unsubscribe, send mail
to robots-request@webcrawler.com with the word "unsubscribe" in the body.
For more info see http://info.webcrawler.com/mak/projects/robots/robots.html