Re: It's not only robots we have to worry about ...

Rob Hartill (robh@imdb.com)
Thu, 26 Dec 1996 17:57:26 +0000 (GMT)


Benjamin Franz wrote:

>You are overreacting to NetJet.

I don't think so, not after what I've had to deal with from some NetJet users.
It might look like an overreaction to most sites, but remember that these are
early days for "accelerators". The number of users grows every week, and
critical mass can't be too far away. There will come a point where prefetching
becomes a significant overhead for servers to deal with.

My biggest fear is that either Microsoft or Netscape will look at the
short-term gains and use prefetching as a way to get a bigger market
share. That really would spell the end of the web as we know it. The idea
of accelerators doesn't scale.

I think the only long-term solution is to put some form of "terms and
conditions" headers into HTTP that clients agree to when talking to a
server; any client that doesn't agree to the conditions is denied
access. This could be useful for guiding robots, but it's personalised
agents that are the biggest problem on the horizon.
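To make the idea concrete, here is a minimal sketch of how such a negotiation
might look on the server side. The header names ("X-Server-Conditions",
"X-Accept-Conditions") and the condition token "no-prefetch" are purely
hypothetical illustrations, not part of HTTP or of any existing proposal;
this is just one way the agree-or-be-denied step could be wired up.

    # Hypothetical sketch only: the header names and the "no-prefetch"
    # condition are invented for illustration.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    CONDITIONS = "no-prefetch"   # e.g. "do not prefetch links from this site"

    class ConditionsHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # The client is expected to echo the server's conditions back
            # to signal that it agrees to abide by them.
            agreed = self.headers.get("X-Accept-Conditions", "")
            accepted = [c.strip() for c in agreed.split(",") if c.strip()]
            if CONDITIONS not in accepted:
                # Client did not agree: advertise the conditions and deny access.
                self.send_response(403)
                self.send_header("X-Server-Conditions", CONDITIONS)
                self.end_headers()
                self.wfile.write(b"Access denied: conditions not accepted.\n")
                return
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"Hello, well-behaved client.\n")

    if __name__ == "__main__":
        HTTPServer(("", 8000), ConditionsHandler).serve_forever()

A prefetching client would then have to read the 403 and its advertised
conditions, decide whether it can comply, and retry with the agreement
header set; anything that can't or won't agree simply stays out.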

_________________________________________________
This message was sent by the robots mailing list. To unsubscribe, send mail
to robots-request@webcrawler.com with the word "unsubscribe" in the body.
For more info see http://info.webcrawler.com/mak/projects/robots/robots.html