RE: Client Robot 'Ranjan'

chris cobb (c-cobb@ix.netcom.com)
Thu, 20 Jun 1996 14:39:04 -0700


Unfortunately, I believe this and other associated problems will likely
give rise to usage rates for most sites, imposed by those who carry
the traffic. I expect these rates would be structured so that the
average user would not incur extra charges, but heavy use (generally
by a business) would cost more. You can already see this kind of
arrangement in phone lines - residential lines run $20 or so, while
business lines are $100 because of the greater expected use.

This would enforce the timeless free-market law that any entity
which consumes resources must produce at least as much value
(i.e. money) to be viable in the long term. Joe Sixpack running a
crawler for fun would be forced either to justify the value of his
activities to society (by getting paid in some way for his efforts) or
to cover the cost of the bandwidth he uses out of his own pocket.
(Perhaps the personal educational value of his crawler is worth
the expense in his mind.)

Of course this all goes against the free-standing nature of the Internet,
especially when I talk about money, but it recognizes fundamental
principles. Hopefully we will always have enough bandwidth to let
each and every one of us engage in the activities of our choice, but
that may not be possible. An interesting quote I read recently
concerned the great demand for ISDN, cable modems, high-speed lines,
etc. It said that as soon as everyone got high-speed access, the whole
system would grind to a halt from the overload.

Maybe, maybe not, but it is an interesting point.

Chris Cobb
----------
From: Benjamin Franz[SMTP:snowhare@netimages.com]
Sent: Thursday, June 20, 1996 5:34 AM
To: robots@webcrawler.com
Subject: Re: Client Robot 'Ranjan'

On Thu, 20 Jun 1996, Kevin Hoogheem wrote:

> >what are the implications of the general public having a Webcrawler?

[snip]
>
> Let's not look at what will happen when everyone has one - that is
> the same question people ask about every product. What will happen
> when everyone can own a car? Oh no, the railroads will go out of
> business. What will happen when someone makes the first personal
> computer? Oh no, mainframes become obsolete.

You have missed the point. It is not about the economic impact of personal
webcrawlers on the big indexes. It is about the problem of umpteen
*MILLION* robots all attempting to access every site on the web. The
small number of robots today makes a noticeable (albeit small) dent in
the overall system loading of websites - one that is more than offset
by their usefulness to sites for resource discovery. But there are only
a double handful of them running. If as few as two hundred comprehensive
and fast independent webcrawling robots like Scout were set loose, the
traffic they generated would swamp everything else percentage-wise. This
would make them too much of a burden on the websites they index - sites
would begin implementing measures to block all robots except the big,
known ones. A system such as Harvest, where duplication of effort is
minimized through sharing of gathered information, is required to solve
this problem.
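
As an aside, the existing robots exclusion standard already gives a
site a crude way to do that kind of blocking. A minimal robots.txt
along these lines (the "KnownBot" user-agent name is purely
illustrative) turns away everything except one named robot:

    # Allow only the robot identifying itself as "KnownBot";
    # an empty Disallow means nothing is off-limits for that robot.
    User-agent: KnownBot
    Disallow:

    # Every other robot is shut out of the entire site.
    User-agent: *
    Disallow: /

Of course this only works against robots polite enough to honor the
file, which is part of why shared-gathering schemes look attractive.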

-- 
Benjamin Franz