<Soapbox>
HTTP/1.0 is a connectionless model and I think it should stay that way; it
works, don't break it. HTTP/2.0, or whatever we end up calling it, can
support persistent connections, encryption of headers, negotiation of
Accepts, multiple GETs, etc., but I don't think we should *break* what we
have today with HTTP/1.0. In fact, what is needed most is to get the
HTTP/1.0 specification out of Internet-Draft status and make it an RFC so
we all have two feet to stand on rather than the "well, that's what X
Mosaic does" argument we so often hear. I do not want to deviate completely
from our de facto standards of today; rather, I want some standards to
point to in the form of an RFC, informational or otherwise. To accomplish
this I think we have to move some of the features described in the current
online HTTP specification to the HTTP/2.0 specification. Then we need to
clear up (edit) some of the language in the current specification to remove
ambiguities. This should not take long; by the July IETF, or the Fall IETF
at the latest, the specification should be set in stone and everybody will
be coding to the same thing.
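For readers less familiar with the protocol, the connectionless model defended above amounts to: open a TCP connection, send one request, read one response, and the server closes the connection. A minimal sketch of the request format and status-line parsing; the function names and header values here are illustrative, not drawn from the message or any spec:

```python
def build_get(path):
    # An HTTP/1.0 request: request line, optional headers, blank line.
    # One request per TCP connection; the server closes after replying.
    return ("GET %s HTTP/1.0\r\n"
            "User-Agent: example/0.1\r\n"
            "\r\n" % path).encode("ascii")

def parse_status_line(line):
    # e.g. b"HTTP/1.0 200 OK" -> ("HTTP/1.0", 200, "OK")
    version, code, reason = line.decode("ascii").split(" ", 2)
    return version, int(code), reason
```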
This argument applies to HTML as well, with the difference that I think
FORMs as currently implemented should be added to what I would call HTML
1.0. SGML compliance may or may not be possible with HTML as it is today;
you would certainly have to be lenient in many cases. Also, it is important
not to delay the HTML RFC to include HTML+, which should be a separate RFC.
I am completely willing to back off from adding new features and toss out
unimplemented features in order to get these RFCs done. However, I will not
support an ongoing munge of what we have today, as has happened with the
URL fiasco. I've talked to a number of people offline who agree that
getting HTTP/1.0 and HTML 1.0 to RFC status is a good idea and, more
importantly, doable within a short timeframe. There are even people
interested in spending time to make it happen rather than just talk about
it :)
Finally, I would like to suggest that if the Web is at all about
interoperability and ubiquity, then the architectural foundation has to be
finished, and soon, or there will be many small Web communities that can't
communicate with one another. This is not a threat; this is a reality
coming our way. So far we've been lucky, but I've seen some of the ways
organizations and companies intend to exploit portions of the Web, and if
you want to continue to get to most, if not all, of the published content,
then we have to give publishers (this includes individuals as well)
something real to base their publishing on, not a moving target or a single
de facto implementation that changes every month.
</Soapbox>
ka
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Kevin Altis                   2111 N.E. 25th.
                              Hillsboro, OR 97124
Media Delivery Laboratory     Email: [email protected]
Intel Corporation JF2-58      Phone: 503-696-8788
                              Fax: 503-696-6067