Re: Standard

John D. Pritchard (jdp@cs.columbia.edu)
Wed, 27 Nov 1996 05:06:33 -0500


> I've been following the robots mailing list since October and given the
> current thread, I support HipCrime's position 100%. The robot exclusion
> standard, while a nice gesture, seems to be a waste of time. The way I
> see it, robots.txt is a convenience for the robot, not a method to
> restrict its access. Speaking as a user, I really don't care what burden
> I place on a distant server as long as I acquire the information I want.
> Even if "Obey Robot Exclusion" was an option on the agent, I wouldn't use
> it. And why should I? What is the penalty for not obeying? Perhaps
> this list needs to refocus on positive/negative reinforcement as the
> solution. (People comply with the speed limit only because they risk
> expensive tickets.)

<FLAME>

oh, it's so Genet of you to be so open with your sadism. was it good for
you?
</FLAME>


any service, e.g., a wwweb service, will have to protect itself from
repetitive tcp connections that amount to denial-of-service attacks. wwweb
servers will have to check a socket's origin before accepting it.
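
here's a minimal sketch of the kind of origin check i mean, in python; the
port, threshold and window are made-up numbers, and a real server would do
this in its accept loop or in a front end rather than like this:

import socket
import time
from collections import defaultdict

MAX_HITS = 20    # arbitrary: connections allowed per client per window
WINDOW = 60.0    # arbitrary: window length in seconds

hits = defaultdict(list)   # client address -> timestamps of recent connections

def allowed(addr):
    # check the socket's origin before doing any http work for it
    now = time.time()
    hits[addr] = [t for t in hits[addr] if now - t < WINDOW]
    if len(hits[addr]) >= MAX_HITS:
        return False
    hits[addr].append(now)
    return True

srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
srv.bind(("", 8080))       # placeholder port
srv.listen(5)

while True:
    conn, (addr, port) = srv.accept()
    if not allowed(addr):
        conn.close()       # drop hammering origins immediately
        continue
    # ... hand conn to the normal http handling here ...
    conn.close()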

they should also start using tcp wrappers.

but this isn't the topic of this list.

the topic of this list is using and developing "/robots.txt".
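
and honoring it is cheap for a client. a minimal sketch using python's
standard urllib.robotparser (the agent name and urls are placeholders):

from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("http://example.com/robots.txt")   # placeholder host
rp.read()                                     # fetch and parse /robots.txt

url = "http://example.com/some/page.html"     # placeholder target
if rp.can_fetch("ExampleBot", url):
    pass  # the site allows robots here, go fetch it
else:
    pass  # the site asked robots to stay out of this path; skip it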

<FLAME>

but i guess if you knew what you were talking about you would have known
the difference between a tcp problem and an http problem.

</FLAME>

-john

_________________________________________________
This message was sent by the robots mailing list. To unsubscribe, send mail
to robots-request@webcrawler.com with the word "unsubscribe" in the body.
For more info see http://info.webcrawler.com/mak/projects/robots/robots.html