Re: Broadness of Robots.txt (Re: Washington again !!!)

Hallvard B Furuseth (h.b.furuseth@usit.uio.no)
Fri, 22 Nov 1996 12:44:19 +0100 (MET)


At Thu, 21 Nov 1996 12:53, Martijn Koster wrote:
>At 6:50 PM 11/20/96, Captain Napalm wrote:
>
> You could even play tricks by defining your categories as User-agents,
> and look for a list in turn, like I mentioned before:

Yup.

> Then a robot could search for its own name, then its categories,
> then '*'.

No need. If you don't like Yet Another Field Name for categories, just
let a robot search for the first record matching its name *or* category.
If you then list names before categories in robots.txt, names will take
precedence.
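The lookup order described above can be sketched in code. This is a hypothetical illustration, not any robot's actual implementation: it assumes a simple line-based robots.txt parse, and the record contents and category names in the usage example are made up. The robot scans records top to bottom and takes the first one whose User-agent line matches its own name or one of its categories, falling back to the '*' record; since names are listed before categories in the file, name matches win automatically.

```python
def parse_records(text):
    """Split robots.txt text into (user_agents, disallows) records,
    where records are separated by blank lines."""
    records, agents, disallows = [], [], []
    for line in text.splitlines():
        line = line.split('#', 1)[0].strip()   # strip comments
        if not line:
            if agents:
                records.append((agents, disallows))
                agents, disallows = [], []
            continue
        field, _, value = line.partition(':')
        field, value = field.strip().lower(), value.strip()
        if field == 'user-agent':
            agents.append(value.lower())
        elif field == 'disallow':
            disallows.append(value)
    if agents:
        records.append((agents, disallows))
    return records

def rules_for(text, name, categories):
    """Return the Disallow list from the first record matching the
    robot's name or one of its categories; '*' is the fallback."""
    wanted = {name.lower()} | {c.lower() for c in categories}
    fallback = None
    for agents, disallows in parse_records(text):
        if wanted & set(agents):
            return disallows
        if '*' in agents and fallback is None:
            fallback = disallows
    return fallback or []
```

With a file that lists a name record before a category record, a robot named "webcrawler" in the (assumed) category "indexer" gets the name record's rules; a robot that only matches the category gets the category record's rules.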

>> Then again, how many different rule sets does a typical robots.txt file
>>have? Also, do specific rules for a robot override the "global rules"?
>>Maybe not ...
>
> Agreed :-)

Has any robot owner collected statistics on robots.txt files? For
instance: the mean, standard deviation, and maximum of Disallows per
User-agent, User-agents per robots.txt, robots.txt size, and maybe
Disallow line length.
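For what it's worth, the statistics asked about could be gathered with a short script along these lines. This is only a sketch under assumptions: it takes robots.txt contents as strings (fetching them is left out), uses a naive line-based parse, and attributes a record's Disallow lines to the record rather than to each individual User-agent line.

```python
import statistics

def summarize(samples):
    """Return (mean, population standard deviation, max) of a sample list."""
    return (statistics.mean(samples), statistics.pstdev(samples), max(samples))

def robots_stats(files):
    """files: iterable of robots.txt contents as strings."""
    disallows_per_record, agents_per_file = [], []
    sizes, disallow_lengths = [], []
    for text in files:
        sizes.append(len(text))
        total_agents, record_disallows = 0, 0
        in_disallows = False
        for line in text.splitlines():
            field = line.partition(':')[0].strip().lower()
            if field == 'user-agent':
                if in_disallows:          # a User-agent line after Disallows
                    disallows_per_record.append(record_disallows)
                    record_disallows = 0  # starts a new record
                    in_disallows = False
                total_agents += 1
            elif field == 'disallow':
                record_disallows += 1
                disallow_lengths.append(len(line))
                in_disallows = True
        if in_disallows:
            disallows_per_record.append(record_disallows)
        agents_per_file.append(total_agents)
    return {'disallows_per_record': summarize(disallows_per_record),
            'user_agents_per_file': summarize(agents_per_file),
            'file_size': summarize(sizes),
            'disallow_line_length': summarize(disallow_lengths)}
```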

Regards,

Hallvard
_________________________________________________
This message was sent by the robots mailing list. To unsubscribe, send mail
to robots-request@webcrawler.com with the word "unsubscribe" in the body.
For more info see http://info.webcrawler.com/mak/projects/robots/robots.html