Re: Standard?

John D. Pritchard (jdp@cs.columbia.edu)
Mon, 02 Dec 1996 14:35:53 -0500


Nick said...

> Isn't it much more in the spirit of cooperation on the net to encourage
> development of technologies that solve the problem, instead of hoping that
> the law will do so?
>
> There is tremendous economic incentive for companies to develop smarter and
> smarter e-mail filtering to cope with the junk mail; we're just in a
> temporary gap when the filtering agents haven't caught up with the brute
> force of junk e-mail. A law that cut off the flow would stop the beneficial
> unsolicited e-mail (there are such things!), which one might imagine is the
> baby that's in the bathwater. (To stretch the metaphor unduly, one might
> argue that the baby is drowning in water polluted by spam, but ...)
>
> To bring us back to robots, I think that the implication is that any system
> of making information available for robots and other agents should assume
> that they are going to be able to utilize increasingly sophisticated
> filtering data.

these agents will work on all content types across multiple protocols,
including HTTP and NNTP resources, via directory services like search
engines.

junk e-mail gets filtered out.
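
a minimal sketch of the sort of keyword filter such a user agent might apply to mail; the keyword list, threshold, and function names are purely illustrative, not taken from any real filtering product:

```python
# illustrative keyword-based junk-mail filter; the keyword set and
# scoring threshold are hypothetical, not from any real agent.
SPAM_KEYWORDS = {"free", "money", "opportunity", "act now", "guaranteed"}

def looks_like_junk(subject, body, threshold=2):
    """Count spam keywords in the message; flag it past the threshold."""
    text = (subject + " " + body).lower()
    hits = sum(1 for kw in SPAM_KEYWORDS if kw in text)
    return hits >= threshold

# a filtering user agent would drop or file flagged messages:
messages = [
    ("FREE money opportunity!!!", "act now, guaranteed returns"),
    ("robots mailing list digest", "discussion of filtering agents"),
]
kept = [m for m in messages if not looks_like_junk(*m)]
```

real agents would of course use richer signals than keyword counts, but the consumer-side economics are the same either way.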

the marketing value of narrow casting is lopsided: producer-driven narrow
casting (junk e-mail) is worth far less than consumer-driven narrow casting
via user agents. that means the software market for user agents that filter
mail and support narrow casting is much deeper than the market for producer
agents that spam.

or at least that's the two-sided battle that will ensue between producer
and consumer agent products. in the producer software market, advertisers
want agents that advertise (narrow casting and spamming); the consumer
market wants agents that filter mail and discover or filter internet
resources. you pick how you think it'll go.

-john

_________________________________________________
This message was sent by the robots mailing list. To unsubscribe, send mail
to robots-request@webcrawler.com with the word "unsubscribe" in the body.
For more info see http://info.webcrawler.com/mak/projects/robots/robots.html