Has anyone here looked at the 'TimeWarp' project done at the
University of Bath, UK? This project ran hundreds of simulated
objects (tanks) across tens of computers, and some of their
experience may be useful.
They used the concept of a 'local' time and of a 'global' time.
Their work was non-interactive, so I think interactivity will
complicate matters for us, but the situations being addressed
sound much the same.
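For anyone who hasn't met the idea, here's a minimal sketch of how
local/global virtual time works in Time Warp-style simulation. I
don't know the Bath project's actual design, so all the names and
structure below are my own illustration, not theirs:

    class SimObject:
        def __init__(self, name):
            self.name = name
            self.lvt = 0.0          # Local Virtual Time for this object
            self.checkpoints = []   # (time, state) pairs kept for rollback

        def handle(self, event_time, state):
            # Process an event optimistically: checkpoint first, so a
            # late-arriving 'straggler' message can force a rollback.
            self.checkpoints.append((self.lvt, state))
            self.lvt = event_time

        def rollback(self, straggler_time):
            # Undo anything done at or after the straggler's timestamp.
            while self.checkpoints and self.checkpoints[-1][0] >= straggler_time:
                self.lvt, _ = self.checkpoints.pop()

    def gvt(objects):
        # Global Virtual Time: the minimum local time anywhere in the
        # system. State older than this can never be rolled back, so
        # it is safe to commit and its checkpoints can be discarded.
        return min(obj.lvt for obj in objects)

The attraction is that objects run ahead at their own local pace and
only synchronise through GVT, rather than marching in lock-step.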
Also, the Hyper-G people talked at INET'95 about an algorithm,
called p-flood, for distributing updates efficiently to tens of
thousands of servers. (If you're interested, let me know and I'll
dig out the URLs for it.) It had the advantage of being tunable
between update speed and network load, and showed some very fast
95% update times.
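From memory of the talk, the scheme works roughly like the sketch
below: servers sit on a logical ring, and each round every server
holding the update passes it to its ring successor plus p randomly
chosen peers. The code and parameter names are my own approximation,
not the Hyper-G spec:

    import random

    def p_flood_round(n, holders, p):
        # One round: every server that holds the update sends it to
        # its ring successor and to p randomly chosen servers.
        new = set(holders)
        for i in holders:
            new.add((i + 1) % n)                    # ring neighbour
            new.update(random.sample(range(n), p))  # p random peers
        return new

    # How many rounds until 95% of 10,000 servers have an update?
    n, p = 10000, 2
    holders, rounds = {0}, 0
    while len(holders) < 0.95 * n:
        holders = p_flood_round(n, holders, p)
        rounds += 1
    print(rounds, "rounds to reach 95% of", n, "servers")

Raising p buys faster propagation at the cost of extra network load,
which is where the speed/load tunability comes from.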
I'm not sure if either of these is directly usable, but
both are certainly worth a look.
[ Personal note: I like the 'engine'/brain concept, but I think
engines should definitely be strict functions of time (see the
sketch below). ]
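To make that concrete, here's a toy engine in that style; the names
are illustrative, not from any spec. Because the state depends only
on t, every machine that evaluates it at the same timestamp computes
exactly the same pose, so nothing needs to be distributed except t:

    import math

    def spinner_engine(t):
        # Pure function of time: no hidden mutable state anywhere.
        return {"angle": (90.0 * t) % 360.0,
                "height": math.sin(2.0 * math.pi * t)}

    print(spinner_engine(1.25))  # identical on every observer's machine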
I'll shut up again now.
Joel
----
[email protected] | [email protected]
Software Engineer, Educational Technology Service
University of Bristol, UK