I recommend that browsers provide some flexibility so that users &
providers aren't constrained. Example: most WWW browsers will
decompress *.Z on the fly. It would be good to handle *.gz
decompression identically. It might be nice if other compression
formats were definable via .mailcap entries, but too many
standards are hard for everyone to keep working across all architectures.
Finally, it would be bad to attempt to enforce any exclusive scheme
(such as 'everyone must always compress' or 'compress on demand').
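To make the "on the fly" idea concrete, here is a minimal sketch in Python (my own illustration, not any particular browser's code; keying off the file extension and shelling out to the classic Unix 'uncompress' tool for *.Z are both assumptions):

```python
import gzip
import io
import subprocess

def open_maybe_compressed(path):
    """Open a file, transparently decompressing *.gz or *.Z on the fly.

    Illustrative sketch only: *.gz is handled by the standard library,
    while *.Z (LZW compress) is delegated to the external 'uncompress'
    tool, since LZW is not in the Python standard library.
    """
    if path.endswith(".gz"):
        return gzip.open(path, "rb")            # gunzip on the fly
    if path.endswith(".Z"):
        # assumption: classic 'uncompress' is on the PATH
        out = subprocess.run(["uncompress", "-c", path],
                             capture_output=True, check=True).stdout
        return io.BytesIO(out)
    return open(path, "rb")                     # plain file, untouched
```

A caller never needs to know whether the file was compressed; it just reads bytes, which is exactly the flexibility argued for above.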
Gavin Bell & others have shown repeatedly that file sizes are approximately equal
for binary and compressed text files. This is pretty intuitive,
since the information content is identical and there are few wasted bits.
Since V.42bis compression is done in modem hardware on the fly, you ought not to see
a performance penalty on precompressed files. Finally, key recurring
themes in network performance are (a) be inclusive rather than exclusive,
(b) keep it simple, and (c) evaluation of running code is what counts;
theory often falls flat when confronted with actual results [so see (a) again].
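The size claim is easy to sanity-check. Here is a rough sketch (the data set, 1000 random floats at four decimal places, is my own illustrative assumption, not anyone's measurements):

```python
import gzip
import random
import struct

# Compare a compressed ASCII encoding of numeric data against a raw
# binary encoding of the same values.  Seeded for reproducibility.
random.seed(42)
values = [random.random() for _ in range(1000)]

ascii_form = " ".join("%.4f" % v for v in values).encode("ascii")
binary_form = struct.pack("<1000f", *values)      # 4 bytes per float
compressed_ascii = gzip.compress(ascii_form)

print(len(ascii_form), len(binary_form), len(compressed_ascii))
```

On a run like this the compressed ASCII comes out well under the raw ASCII size and in the same ballpark as the binary encoding, which is the point: once compression is in the pipeline, the choice of on-disk format matters far less for throughput.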
Recommendation summary: support at least ASCII and ASCII.Z formats to
be consistent with the VRML standard and current Web practice.
Other formats (such as binary) might be feasible for widespread
implementation but will differ very little in effective throughput.
all the best, Don
--
Don Brutzman                  Naval Postgraduate School, Code UW/Br
work 408.656.2149             Monterey, California 93943-5000 USA
fax  408.656.3679
AUV Underwater Virtual World  ftp://taurus.cs.nps.navy.mil/pub/auv/auv.html