How so? How does Usenet or a list like this one guarantee all
of the features one demands from a well-thought-out
infrastructure, such as:
robust reliable technology
ubiquitous information distribution
guarantee of access and retrieval
scalable predictable performance
reliable directory services*?
Sadly, we have none of this.
Until we do, and far more importantly, until we have a PLAN, a
strategy to reach the time when we can reliably and predictably
serve and retrieve the information we need, the web will
continue to be a mishmash of broken links, unreachable servers,
404 messages and doomed ideas.
Now, perhaps you think this is a harsh view, and perhaps it is.
But the thing that really bothers me about all of this is that
no one seems to be looking to the future. No one seems to want
to think about what happens when network load becomes enough to
force packet-charging or the only players who can afford to
provide access and web service are the longhauls or RBOCs, or
when "some large company" comes on the scene appearing to
provide all of the things we cannot provide, only to hold the
entirety of the net hostage to proprietary technology, or when
the niftiest technologies we can imagine and implement only
serve to hasten the end of the ride.
Ok, I see listmembers shaking their heads, wondering what this
has to do with robots and spiders. And that, my friends, is
the saddest thing of all.
</rr>
(* By the way, if you take a look at what Netscape is doing
next, there's an interesting white paper on their site which
talks about a distributed directory service server they appear
to be readying for market. Not that I am saying that this is
bad. In fact, it's excellent, but anyone willing to guess at
what this does--in the ideal case--to the robot business?
Assume 80% market penetration and client-side integration with
the browser. <smile>)
-- </rr> Rob Raisch, The Internet Company