Re: Inter-robot Communications - Part II

John D. Pritchard (jdp@cs.columbia.edu)
Tue, 02 Jan 1996 15:28:06 -0500


hi,

i think some of the issues coming up could be resolved with another
approach to the protocol problem. negotiated protocols would be one of
the most important aspects of a robot-to-robot link, so that new
protocols may be created or old ones extended at will.

in the simplest approach, protocols are named and versioned, and these
ident strings can be used for communication setup. this presumes a basic
subset common to all robot protocols, which would of course be stateless,
for transferring the version string. higher-level protocols may want to
be stateful, as some search engines/web sites are now for narrow-casting
or for caching. (www.sony.com: the magic cookies it puts into its forms
expire, or at least appear to "expire"; they've been having lots of
problems lately.)

more interestingly, a minimal (stateless) inter-robot language would also
provide for more sophisticated forms of negotiation. for example, systems
with dynamic local indexing, eg, expiring magic cookies, could negotiate
an expired context using previously saved search info, effectively
restoring the lost context. in such an environment, what to cache is
predetermined: for example, saving the GET strings that carry the search
info.
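
as a sketch of what i mean, assuming the server keys its state on a
magic cookie and the client has kept the GET string that produced that
state (all names here are illustrative):

    import java.util.HashMap;
    import java.util.Map;

    public class ContextCache {
        private final Map<String, String> getByCookie =
            new HashMap<String, String>();

        // remember the GET string that created the server-side context
        void remember(String cookie, String getString) {
            getByCookie.put(cookie, getString);
        }

        // on a "context expired" response, replay the saved GET string so
        // the server can rebuild the lost state instead of failing the
        // search
        String restore(String expiredCookie) {
            return getByCookie.get(expiredCookie);
        }
    }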

this meets the ideas raised by Martijn. as mentioned, there are lots of
ways to do this, which can be negotiated to some degree. downloading a
Java or Safe-perl program is one way, and could be a subset of the
proposed protocol, since various people would have various ideas on how
to do this: for example, what kind of namespace the script enters, or
whether there is a class of scripts with particular init arguments.
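
one possible convention, just to illustrate (the interface and its
methods are invented): the downloaded script enters a restricted
namespace and is handed explicit init arguments rather than ambient
state:

    import java.util.Map;

    public interface RobotExtension {
        // init arguments: the negotiated base protocol ident plus whatever
        // parameters the peer supplied during negotiation
        void init(String baseProtocolIdent, Map<String, String> args);

        // handle one request under the extension's own (sub)protocol
        String handle(String request);
    }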

these things provide for narrow-casting techniques which would be
valuable for "client robots", ie, intelligent agents.

so, i've presented a radical view which would tend to lead the
imagination to prefer just downloading scripts over some anarchy of
protocol extensions. but no one has to support any extension one doesn't
want to. on the other hand, with more and more Java browsers, it's
possible to download Java code (protocols or protocol extensions) into
clients, providing a means for exactly such anarchy.

so, can we provide a framework for this kind of environment? i think
most of it is already provided via HTTP and Java. merging these things
under a common umbrella just serves to solve concurrency problems like
caching and to promote robot interoperability, ie, a more open than
OpenDoc (CORBA, http://www.cilabs.org/) approach to robots.

another approach would have negotiated semantics, ie, laying out a
protocol for doing everything you would ever want to do: a whole new
language.

i think protocol ident strings are useful as a family identifier, but
the string approach would require something more, maybe a concatenation
of extension ident strings, for identifying extensions. this is the
deterministic perspective. a one-degree-less-deterministic perspective
would have extensions negotiated as their methods are required.
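
for example, a family ident followed by a concatenation of extension
idents might look like "IRTP/1.0+cache/1.1+context/0.9" (the format is
an assumption, like the names above):

    public class ExtendedIdent {
        // split a concatenated ident into family ([0]) and extensions
        // (the rest)
        static String[] split(String ident) {
            return ident.split("\\+");
        }

        public static void main(String[] args) {
            for (String part : split("IRTP/1.0+cache/1.1+context/0.9"))
                System.out.println(part);
        }
    }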

the idea of "robustness in failures" as a general model of error catching
and zero or more (degrees of) adaptation in communication can make a mess
of intended versus actual semantics if applied to a protocol namespace.
however, it's required anyway so the question is only to what degree it is
a part of the architecture. von neumann permits it to be a principal
component of his self-reproducing automata. this is the nondeterministic
perspective.
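
one bounded form of this, as a sketch (the Peer interface is
hypothetical): if a negotiated extension method fails, fall back to the
stateless basic subset rather than improvising semantics:

    public class Fallback {
        interface Peer {
            String extended(String request); // negotiated extension method
            String base(String request);     // stateless basic subset
        }

        static String send(Peer peer, String request) {
            try {
                return peer.extended(request);
            } catch (UnsupportedOperationException e) {
                return peer.base(request); // one degree of adaptation only
            }
        }
    }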

-john