Re: COMPRESSION

Tony Godshall ([email protected])
Tue, 30 May 1995 19:13:18 +0000


> > All this outputs normal text... These files can be zipped by GZIP
> > to get basically the same compression ratio as the original VRML file
> > (this is an estimate from my experiments)

Are we speaking of the tokenized files? ASCII should compress
more than the [proposed] tokenized or [proprietary] binary Inventor.

> I think compression schemes like this have the benefit of small size with
> fast parsing. But if parsing speed is not an issue using gzip gives
> excellent results and works without having to define a new format.

I think the ratio of parsing cycles to rendering cycles will lean
heavily toward rendering as our primary local machine bottleneck, and
network transfer as our primary network bottleneck (at least for
those of us using modems). Decompression can be relatively
CPU-efficient with the right algorithms.

Parsers will have to continue to handle the ASCII variant if we are
to remain backwards-compatible, so why add this second level of
complexity? A compression wrapper will take an ASCII scene or a
tokenized scene down to similar file sizes, so why do both?
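To illustrate the point (this sketch is mine, not from the original thread, and the VRML-style scene text is invented for the example): a general-purpose compressor like gzip already squeezes most of the redundancy out of repetitive ASCII scene descriptions.

```python
import gzip

# Hypothetical VRML-style ASCII fragment, repeated to simulate a
# realistically sized scene file (illustrative data only).
ascii_scene = b"""Separator {
    Material { diffuseColor 0.8 0.2 0.2 }
    Transform { translation 1.0 0.0 0.0 }
    Cube { width 2.0 height 2.0 depth 2.0 }
}
""" * 200

compressed = gzip.compress(ascii_scene)
print("original:", len(ascii_scene), "bytes")
print("gzipped: ", len(compressed), "bytes")

# Round-trip check: the wrapper is lossless.
assert gzip.decompress(compressed) == ascii_scene
```

Because the keywords and field names repeat so heavily, the gzipped ASCII ends up in the same ballpark as a tokenized encoding would, which is the argument against defining a second format.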

One remaining issue re compression, though, is that it would be nice
to decompress on the fly, and render the components as they come in.
Thus the browsers ought to have the decompression algorithms built-in.
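Incremental decompression of the kind described above is straightforward with a streaming decompressor; a minimal sketch in Python (my illustration, not part of the original post), feeding gzip data to the decoder in network-sized chunks:

```python
import gzip
import zlib

# A hypothetical compressed scene arriving over the wire.
scene = b"Sphere { radius 1.0 }\n" * 500
compressed = gzip.compress(scene)

# wbits=31 tells zlib to expect the gzip wrapper; decompressobj
# lets us decode incrementally instead of waiting for the whole file.
decomp = zlib.decompressobj(wbits=31)
recovered = bytearray()
for i in range(0, len(compressed), 512):    # simulate 512-byte packets
    chunk = compressed[i:i + 512]
    # Each decoded piece could be handed straight to the parser/renderer.
    recovered += decomp.decompress(chunk)
recovered += decomp.flush()

assert bytes(recovered) == scene
```

A browser with such a decoder built in can start parsing and rendering components as soon as the first packets arrive, rather than buffering the entire file first.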