External behaviors (was Re: MISC: Inlined Sound Support)

Mike Roberts ([email protected])
Mon, 17 Apr 95 15:49:26 EDT


I have a slightly different proposal for behaviors, which is related to the
debate on sound support, in a way.

On Apr 17, 10:56am, Brian Behlendorf wrote:

> I don't see why the scene description language and the media types
> of inlined data aren't completely orthogonal -a binding might have to be
> described, as Tony has proposed for sound, but exactly what data type
> that sound is doesn't need to be declared anywhere.

Is there any reason why behaviors cannot also be handled in this way (subject
to certain performance issues)? I think of "code as media" anyhow, and imagine
support for multiple languages via inlined code. This code is processed
(compiled, interpreted, played!, etc.) and run inside external language engines,
just the same way as sound data is played in a sound application.

Browsers say "I can handle Java/Smalltalk-VR/Tcl-VR/MOO3.X/etc", as well as
"MIDI/.mp2/etc".
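The negotiation this implies can be sketched in a few lines. Everything here is
hypothetical, a minimal illustration rather than any real VRML mechanism; the
engine and media names are just the examples from above.

```python
# Hypothetical capability negotiation between a VRML browser and a server.
# All names and list contents are illustrative, not from any real spec.

BROWSER_ENGINES = ["Java", "Tcl-VR", "MOO3.X"]   # languages this browser runs
BROWSER_MEDIA = ["MIDI", ".mp2"]                  # media types it can play

def negotiate(offered, supported):
    """Return the first offered type the browser supports, or None."""
    for item in offered:
        if item in supported:
            return item
    return None

# A server offering a behavior in several languages gets the first match:
chosen_engine = negotiate(["Smalltalk-VR", "Tcl-VR"], BROWSER_ENGINES)
chosen_media = negotiate([".mp2", "MIDI"], BROWSER_MEDIA)
print(chosen_engine, chosen_media)  # Tcl-VR .mp2
```

The point is that the browser never needs to understand the code or the sound
itself; it only matches declared types against attached handlers.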

The code fragments defining sets of behaviors are embedded as in-line objects,
possibly along with an event->behavior map which specifies how user interactions
in the browser trigger behaviors in the language engine(s) attached to it.
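An event->behavior map of that kind might look like the following. This is a
hedged sketch: the map keys, engine names, and behavior names are all invented
for illustration.

```python
# Hypothetical event->behavior map: (interaction, target object) pairs
# name the engine and behavior that should run. All entries are invented.

event_behavior_map = {
    ("click", "door"):  ("Tcl-VR", "openDoor"),
    ("click", "radio"): ("Java",   "toggleRadio"),
}

def lookup(event, target):
    """Return (engine, behavior) for this interaction, or None if unmapped."""
    return event_behavior_map.get((event, target))

print(lookup("click", "door"))  # ('Tcl-VR', 'openDoor')
```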

Any external language needs some way to modify the scene graph, which can be
language-specific (<Gets on soapbox> of course, life would be much easier if
language designers had a defined API for the manipulation of the scene graph, as
well as a defined mechanism to get events back from the VRML browser <gets off
soapbox>). Obviously, any external language also has to satisfy certain security
concerns, because you don't really want potentially nasty code downloaded and
run without being fairly sure it is safe (though we do this all the time right
now!).
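The soapbox point can be made concrete with a toy scene-graph API. This is
entirely hypothetical (no such defined API existed for VRML 1.0); the class and
method names are made up to show the shape such an interface might take.

```python
# A minimal, hypothetical scene-graph API that a language engine could be
# handed by the browser. Names (Node, SceneGraph, set_field) are invented.

class Node:
    def __init__(self, name, **fields):
        self.name = name
        self.fields = dict(fields)   # e.g. rotation angle, color, position
        self.children = []

class SceneGraph:
    def __init__(self):
        self.root = Node("root")
        self._index = {"root": self.root}   # name -> node lookup

    def add(self, parent_name, node):
        """Attach a node under an existing parent."""
        self._index[parent_name].children.append(node)
        self._index[node.name] = node

    def set_field(self, name, field, value):
        """The engine-facing mutation entry point."""
        self._index[name].fields[field] = value

    def get_field(self, name, field):
        return self._index[name].fields[field]

scene = SceneGraph()
scene.add("root", Node("door", angle=0))
scene.set_field("door", "angle", 90)        # a behavior opens the door
print(scene.get_field("door", "angle"))     # 90
```

With a shared API like this, a behavior written in any engine can manipulate the
world through the same handful of calls, instead of each engine inventing its
own binding.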

It would be nice to get code passed out of the VRML browser as it is downloading
(sorta like Netscape's GIF handling, but for code), so that the process of
initial compilation/tokenisation can be going on while the behavior code is still
downloading.

The scenario starts to get real sexy if the browser can route messages from the
interaction engine to other external applications (and vice versa). One way it
can do this is by providing a mechanism to trigger "browser events". Browser
events fall into two categories: "physical" events, such as clicks by users on
objects, and "logical" events. When a physical event occurs in the browser, it
is offered to all attached external engines (or the first relevant one, or under
any other number of message distribution schemes). The same goes for logical
events, which are generated by the external engines and don't necessarily have
any user interaction behind them. They are just a mechanism for the external
applications to communicate amongst each other, using the browser as a clearing
house to knit the external apps into a functioning sub-system. For example, a
click could cause a behavior to evaluate in one engine, which might cause a
logical event "PlaySnd" (with associated parameters) in the browser, which would
in turn get distributed to the sound application.
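That click -> behavior -> "PlaySnd" -> sound-app chain can be sketched as below.
It is a toy model of the clearing-house idea, broadcasting every event to all
attached handlers (one of the distribution schemes mentioned above); the event
names and handlers are invented for illustration.

```python
# Hedged sketch of the browser as an event clearing house. A physical
# event is offered to every attached engine; an engine may respond by
# raising a logical event, which the browser routes onward the same way.
# All names (Browser, PlaySnd, the handlers) are invented.

class Browser:
    def __init__(self):
        self.handlers = []          # attached external engines/apps

    def attach(self, handler):
        self.handlers.append(handler)

    def emit(self, event, *args):
        # Broadcast distribution: offer the event to every handler.
        for handler in self.handlers:
            handler(self, event, *args)

played = []                          # record of what the "sound app" played

def behavior_engine(browser, event, *args):
    if event == "click" and args == ("radio",):
        # The behavior evaluates, then raises a logical event.
        browser.emit("PlaySnd", "radio-static.mp2")

def sound_app(browser, event, *args):
    if event == "PlaySnd":
        played.append(args[0])

browser = Browser()
browser.attach(behavior_engine)
browser.attach(sound_app)
browser.emit("click", "radio")       # physical event from the user
print(played)                        # ['radio-static.mp2']
```

Note that neither handler knows the other exists; the browser's routing is the
only glue, which is exactly the clearing-house property described above.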

I think this kind of approach could get us a long way down the road to an
open, flexible system with behaviors, without having to "write the world" on
behaviors.

-- Mike

> Think orthogonally....