Re: Safe Methods

Benjamin Franz (snowhare@netimages.com)
Thu, 18 Jul 1996 07:15:48 -0700 (PDT)


On Thu, 18 Jul 1996, Rob Hartill wrote:

> "safe" in this context presumably relates to file upload type operations
> and not "safe" = static HTML files.

Safe in this context means that submitting GET or HEAD requests should not
change the state of the server for future requests. IOW - they should
not have 'side effects'.
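To make that concrete (a hypothetical sketch - the URLs and CGI names here are invented, not from this thread): a plain HREF link that deletes a record makes GET unsafe, because any robot following links will trigger the deletion. Putting the same action behind a POST form keeps GET side-effect free:

```html
<!-- UNSAFE: a robot following ordinary HREF links sends this GET
     request and deletes the record as a side effect. -->
<A HREF="/cgi-bin/delete?id=42">Delete record 42</A>

<!-- SAFE: the state change happens only on a POST submission,
     which robots do not make. GET/HEAD stay side-effect free. -->
<FORM METHOD="POST" ACTION="/cgi-bin/delete">
<INPUT TYPE="hidden" NAME="id" VALUE="42">
<INPUT TYPE="submit" VALUE="Delete record 42">
</FORM>
```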

> If you need written 'justification' for acting against other people's wishes
> in the way you use *their* services, then shame on you.
>
> The only reason POST is "safe" at the moment is that the bloody-minded and
> clueless section of the robot community are too incompetent to work out
> how to correctly handle POST. I'm pretty sure those morons would be doing
> POST too if they could figure it out.

POST is "safe" because it is a red flag to robots that the request invokes
a CGI which can produce dynamic content, possibly varying with each call.
So clueless site owners who can't figure out that robots are quite likely
to follow HREF (aka GET) links are protected from themselves by very
clued-in robot owners. And you are scarcely winning friends among the
group of people *you* are asking favors from by insulting them.

> Would the people who disagree with me please list their robot's IP numbers
> on this list so that adequate protection can be made before you visit my
> site.... mine's obviously an 'unsafe' site because it's designed for those
> funny things called 'humans'.

Coming from someone who is too clueless to figure out how to design a site
correctly - or even how to escape the 'paper' model of presentation for
the device-independent content markup of HTML - this is pretty funny.
(Hint: trying to achieve the *same* appearance in every browser is
impossible. Worse, it will break in unexpected places. Hint: presenting
different content based on User-Agent is a death trap for maintainability.
There are *hundreds* of user agents and versions with more coming out
daily. Hint: many browsers *lie* about who they are. Hint: Presenting
*different* content by User-Agent can break caching proxies when using
GET. Hint: This is a *bad* thing. Some entire COUNTRIES are behind caching
proxies.) Or how to use forms to achieve 'robot safe' virtual trees with
good visual layout (Hint: look up INPUT TYPE=image). Or how to produce
valid HTML that does not produce unpredictable results in browsers (Hint:
tables can't go inside paragraphs. Hint: attribute values can't include
unquoted '#' and '%' characters. Hint: DOCTYPEs are your friend. Hint: not
closing <TD> and <TR> tags can result in unpredictable behavior in some
versions of Netscape, at least.)
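Pulling those hints together (a minimal sketch - the elements are standard HTML, but the page, the CGI path, and the image are all hypothetical):

```html
<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<!-- DOCTYPEs are your friend: declare the DTD you validate against. -->
<HTML>
<HEAD><TITLE>Robot-safe navigation</TITLE></HEAD>
<BODY>
<P>Plain text goes in paragraphs.</P>
<!-- Tables sit between paragraphs, never inside them. -->
<TABLE>
<TR><TD>Cell one</TD><TD>Cell two</TD></TR>
<!-- Close every TD and TR; some Netscape versions misrender
     tables with unclosed cells. -->
</TABLE>
<!-- Quote attribute values containing '#' or '%':
     BGCOLOR="#FFFFFF", not BGCOLOR=#FFFFFF. -->
<!-- An INPUT TYPE=image gives a clickable graphic inside a POST
     form, so a virtual tree gets good visual layout while staying
     invisible to link-following robots. -->
<FORM METHOD="POST" ACTION="/cgi-bin/navigate">
<INPUT TYPE="image" SRC="/images/map.gif" NAME="nav" ALT="Site map">
</FORM>
</BODY>
</HTML>
```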

-- 
Benjamin Franz