Re: an article... (was: Re: Standard?)

Nigel Rantor (wiggly@deepsea.sys.web-uk.net)
Mon, 2 Dec 1996 03:34:23 +0000 (GMT)


On Fri, 29 Nov 1996, Greg Fenton wrote:

:I am wondering if anyone has read the article:
:www.yahoo.com/headlines/961127/entertainment/stories/online_livewire_1.html
:
:They are talking about both NetJet and Blaze. Here is a quote from
:that article:
:
: "When someone is reading a Web page, the preloaded browsing feature
: in both programs scans the page for hot links to related pages, then
: downloads those pages to cache memory so they'll load faster if the
: user decides to view them."
:
:Now, if this becomes an enhanced feature for web-browsers, then
:I suspect that rogue-robots will not be the worst of our problems ...
:Web-users will.

Well, to be honest, this seems like the next logical step for browser
developers; I'm just surprised it hasn't happened sooner. Taking the
webmaster's position, though, why not limit simultaneous connections, or
connections per [insert time period here], from the same IP address? If
browser vendors start implementing features that can spam servers, then
server developers should fight back if it becomes necessary. In the end
it is the server that decides whether or not to serve a particular
client, and how much bandwidth to allocate to that client.

Let's expect an Apache module RSN.
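
In the meantime, here's a rough sketch (in Python, purely for
illustration; obviously not a real Apache module) of the per-IP
throttling idea above. The window size, request cap, and function name
are all made up:

    import time
    from collections import defaultdict, deque

    WINDOW_SECONDS = 10    # hypothetical time period
    MAX_REQUESTS = 20      # hypothetical per-IP cap within that window

    recent = defaultdict(deque)   # ip -> timestamps of recent requests

    def allow_request(ip, now=None):
        """True if this IP is still under its per-window quota."""
        if now is None:
            now = time.time()
        q = recent[ip]
        # Drop timestamps that have fallen outside the window.
        while q and now - q[0] > WINDOW_SECONDS:
            q.popleft()
        if len(q) >= MAX_REQUESTS:
            return False   # the server chooses not to serve this client
        q.append(now)
        return True

The same bookkeeping would sit naturally in the server's accept/request
path, which is exactly where an Apache module could hook in.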

As for my earlier posting and the replies I have read so far: I kind of
agree. I'm gonna think about it, talk to the person responsible for the
system that was to use its services, and then decide whether or not there
is any justification for it.

Nige

+--------------------------------------------------------------------+
| Nigel A Rantor | WEB Ltd |
| e-Mail: nigel@mail.bogo.co.uk | The Pall Mall Deposit |
| | 124-128 Barlby Road |
| Tel: 0181-960-3050 | London W10 6BL |
+--------------------------------------------------------------------+
| She lies and says she's in love with him, |
| Can't find a better man, |
| Better Man - Pearl Jam |
+--------------------------------------------------------------------+

_________________________________________________
This message was sent by the robots mailing list. To unsubscribe, send mail
to robots-request@webcrawler.com with the word "unsubscribe" in the body.
For more info see http://info.webcrawler.com/mak/projects/robots/robots.html