Standard

Joseph Whitmore (whitmore@worldnet.att.net)
Tue, 26 Nov 1996 23:53:56 -0800


I've been following the robots mailing list since October, and given the current
thread, I support HipCrime's position 100%. The robot exclusion standard, while a
nice gesture, seems to be a waste of time.

The way I see it, robots.txt is a convenience for the robot, not a method to
restrict its access. Speaking as a user, I really don't care what burden I place on
a distant server as long as I acquire the information I want. Even if "Obey Robot
Exclusion" were an option on the agent, I wouldn't use it. And why should I? What
is the penalty for not obeying? Perhaps this list needs to refocus on
positive/negative reinforcement as the solution. (People comply with the speed
limit only because they risk expensive tickets.)
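
To make that concrete, here's a minimal sketch (in Python, with hypothetical
names -- nothing here comes from any real robot) of why the standard is purely
advisory: honoring robots.txt is a single client-side check that any robot
author can simply skip.

    # Hypothetical fetcher illustrating that robots.txt compliance is a
    # voluntary, client-side choice -- there is no enforcement mechanism.
    from urllib import robotparser
    from urllib.parse import urljoin
    from urllib.request import urlopen

    def fetch(url, user_agent="ExampleBot", obey_robots=True):
        if obey_robots:
            rp = robotparser.RobotFileParser()
            rp.set_url(urljoin(url, "/robots.txt"))
            rp.read()
            if not rp.can_fetch(user_agent, url):
                return None  # the polite robot backs off
        # An impolite robot simply skips the check above; nothing stops it.
        return urlopen(url).read()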

Although I appreciate the technology of ActiveAgent, I doubt I would ever use it
personally. I don't see much difference between ActiveAgent and junk mail delivered
by the post office. Every day I have to round-file all the junk mailings stuck in my
PO box. On the other hand, it must pay off for the advertisers, because they
continue to send it. I'm sure the mailman (or sysadmin) would prefer to avoid
dealing with what I consider nuisance mail, but he doesn't have that option.

Agent technology is in its infancy, and ActiveAgent is just a simple primer on the
types of agents that will be available in the not-too-distant future. In fact,
ActiveAgent will pale in comparison to future intelligent agent technology.

My real goal in subscribing to the list was the hope that we could discuss robot
technology and how it can be expanded, not limited (contextual agents, alife-based
agents, filtering agents, etc.).

Perhaps I need to reemphasize the point that most programmers/users don't care about
the strain placed on web servers, nor do they care about making life easier for a
sysadmin. If I bring your server to its knees, I really don't care -- you should
have protected yourself. If you think life is difficult now, just wait until
intelligent user-agents are released to the general public.
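
As a sketch of what "protecting yourself" could look like, here is a
hypothetical per-client throttle in Python; the names and thresholds are
illustrative assumptions, not any particular server's configuration.

    # Sliding-window rate limiter: a server-side defense that needs no
    # cooperation from the robot at all.
    import time
    from collections import defaultdict, deque

    WINDOW_SECONDS = 10
    MAX_REQUESTS = 20  # at most 20 requests per client per window

    _history = defaultdict(deque)

    def allow_request(client_ip, now=None):
        now = time.time() if now is None else now
        hits = _history[client_ip]
        while hits and now - hits[0] > WINDOW_SECONDS:
            hits.popleft()  # forget requests outside the window
        if len(hits) >= MAX_REQUESTS:
            return False  # over the limit: refuse or delay the request
        hits.append(now)
        return True

Even a crude check like this works without the robot's cooperation, which is
exactly the point: the server, not a gentlemen's agreement, does the limiting.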

A client of mine, a government agency, received over 3,000 1 MB MIME emails through
their web page's mailto link over a three-day holiday, which crashed their internal
mail system. (Some hacker with a grudge spammed them.) What do you suppose the
solution was?

-- Joe

Joseph Whitmore
Interim, Inc.
Washington, DC

_________________________________________________
This message was sent by the robots mailing list. To unsubscribe, send mail
to robots-request@webcrawler.com with the word "unsubscribe" in the body.
For more info see http://info.webcrawler.com/mak/projects/robots/robots.html