Re: Mitsubishi Demo in IEEE Computer

John W. Barrus ([email protected])
Thu, 22 Dec 1994 08:49:54 -0500


Mike Roberts said:
>Yes; another article is there and is probably worth looking at to round
>things out, though it gives little real technical detail; certainly not the
>hairy stuff concerning the world modeling, synching, and underlying
>architecture I would have liked to have seen. It has fewer pretty pictures; a
>subset of the full(er) sequence which is in Computer. MultiMedia arrived with
>me first; I was getting all excited about seeing Computer, and was kinda sad
>when it arrived and I didn't get the detail I wanted to see (though Computer
>did have more pictures - yummy ;) ).
>

It was very difficult to write enough detail about such a large system in
the space allotted to the articles. If you have specific questions, please
ask me, and if I can't answer them myself, I will pass them on to the
appropriate person here at MERL.

>> Worth looking at in color - do not work too hard to see a B&W copy.
>
>It's certainly worth looking at. If it works as they are claiming it is indeed
>impressive - and nails the interactivity aspect very well. Most impressive for
>me was that they are creating "performance" scripts for 'bots using parsed
>voice input ... in essence speaking to the 'bots within the environment,
>and modifying their behavior.

Mike (the robot) has a set of rules about what to do when the world model
is in a particular state. For instance, if you enter a room where Mike is
located, he will come up to you and say "Hello". He also has a routine that
interactors can discuss with him: they can ask him to perform it and suggest
changes. By speaking into a microphone, you can ask Mike to modify the
routine.
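
In rough terms (this is just an illustrative sketch in Python, with names I
have made up for the example -- it is not our actual code), the rule
mechanism looks something like this:

    # A minimal sketch of state-driven behavior rules, assuming a hypothetical
    # WorldState/Rule structure; illustrative only, not MERL's implementation.
    from dataclasses import dataclass, field
    from typing import Callable, Dict, List, Set

    @dataclass
    class WorldState:
        # room name -> names of avatars currently in that room
        occupants: Dict[str, Set[str]] = field(default_factory=dict)

    @dataclass
    class Rule:
        condition: Callable[[WorldState], bool]
        action: Callable[[WorldState], None]

    def make_greet_rule(agent_room: str) -> Rule:
        greeted: Set[str] = set()

        def condition(state: WorldState) -> bool:
            return bool(state.occupants.get(agent_room, set()) - greeted)

        def action(state: WorldState) -> None:
            newcomers = state.occupants.get(agent_room, set()) - greeted
            greeted.update(newcomers)
            for name in sorted(newcomers):
                print(f"Mike: Hello, {name}!")  # the real Mike spoke via DECTalk

        return Rule(condition, action)

    def tick(rules: List[Rule], state: WorldState) -> None:
        # Evaluate each rule against the current world model; fire matches.
        for rule in rules:
            if rule.condition(state):
                rule.action(state)

    # Example: a visitor enters Mike's room and the greeting rule fires.
    world = WorldState(occupants={"lobby": {"visitor1"}})
    tick([make_greet_rule("lobby")], world)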

The underlying, physically based simulation (control system and physical
model) of the two-legged robot ran at about 1 kHz on a separate machine
(the simulation came from Boston Dynamics Inc. and Marc Raibert). Mike's
'brain' could understand spoken language, translate it to text, parse it,
and change internal state or perform actions based on what was
'understood'. He could also respond using DECTalk. All communication
between Mike and the interactors was verbal.
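
To make the flow concrete, here is a toy sketch of that speak -> recognize ->
parse -> act -> respond loop. The keyword "grammar" and function names are
stand-ins invented for the example, not the actual recognizer or parser we
used:

    def parse_command(text: str) -> dict:
        """A toy 'parser' that maps recognized text to an intent."""
        text = text.lower()
        if "perform" in text and "routine" in text:
            return {"intent": "perform_routine"}
        if "faster" in text:
            return {"intent": "modify_routine", "change": "increase_tempo"}
        if "hello" in text:
            return {"intent": "greet"}
        return {"intent": "unknown"}

    def act_and_reply(command: dict, brain_state: dict) -> str:
        """Update internal state or trigger an action; return a spoken reply."""
        intent = command["intent"]
        if intent == "perform_routine":
            brain_state["performing"] = True
            return "Okay, here is my routine."
        if intent == "modify_routine":
            brain_state.setdefault("changes", []).append(command["change"])
            return "Sure, I'll try that."
        if intent == "greet":
            return "Hello!"
        return "I didn't understand that."

    def speak(reply: str) -> None:
        # The real system synthesized replies with DECTalk; print stands in.
        print(f"[DECTalk] {reply}")

    # Example exchange: recognized text arrives, state changes, Mike replies.
    state: dict = {}
    speak(act_and_reply(parse_command("Mike, please perform your routine"), state))
    speak(act_and_reply(parse_command("Can you do it faster?"), state))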

>The world views appear to be synchronised by broadcasting fairly
>comprehensive world-state info at about 10 fps (their absolute minimal
>acceptable frame rate), an approach which I think would not work so well in a
>widely distributed environment with a lot of latency, or one without a whole
>bunch 'o bandwidth .. but which works fine across their lan.

The system described in the article is not very scalable. We think that
scalability is essential, and our future work will bear that out. In a few
months we will have the first version of a new system that addresses the
issue of scalability directly.
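
For readers who want a feel for the broadcast scheme Mike describes, here is
a bare-bones sketch: serialize the whole world state and send it to everyone
roughly 10 times per second. The UDP broadcast socket and JSON encoding are
assumptions made for the example (the articles don't specify the wire
format); it illustrates why the approach is fine on a LAN but breaks down
over high-latency or low-bandwidth links:

    import json
    import socket
    import time

    BROADCAST_ADDR = ("255.255.255.255", 5005)  # hypothetical port
    PERIOD = 1.0 / 10.0                         # ~10 state updates per second

    def broadcast_world_state(get_state, iterations=None):
        """Send the full world state at a fixed rate until stopped."""
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sent = 0
        while iterations is None or sent < iterations:
            payload = json.dumps(get_state()).encode("utf-8")
            sock.sendto(payload, BROADCAST_ADDR)
            sent += 1
            time.sleep(PERIOD)

    # Example: broadcast a trivial state a few times.
    # broadcast_world_state(lambda: {"tick": time.time()}, iterations=3)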

>This stuff looks like it comes directly from the simulation/mud end of things,
>with interactivity and simulation clearly the driving force in the development
>program.

Simulation and interactivity were both driving factors in the development
of the system, and interaction with software agents was another important
consideration.

>They do not claim to be VR, instead, they claim to be a distributed
>multimedia environment.
>
>I was very impressed ...
>
>-- Mike

If you are interested in reprints, please send e-mail to
[email protected] with your mailing address and a specific request for a
reprint of either or both articles. We won't get the reprints for a few
weeks, so don't expect them right away.

John B.

-------------------------

John Barrus, Research Scientist
Mitsubishi Electric Research Laboratories, Inc.
201 Broadway, Cambridge, MA 02139
voice 1.617.621.7535   fax 1.617.621.7550
[email protected]