Web Reliability [was: Session-id and proxies (was: Re: session-id redux) ]

Daniel W. Connolly ([email protected])
Wed, 26 Jul 1995 18:05:29 -0400


In message <[email protected]>, Darren New writes:
>
>But you see, this is a real and true problem, and not something you can
>shrug off. Let's say ...
> Guess who has to deal with
>that customer.
[...]
>A solution that works only when everything goes well isn't going to fly
>on the Internet. ;-)

I'm not sure what you're asking for here: do you mean that everything
has to work even when nobody is playing by the rules? Surely not.

If every possible interaction of conforming implementations of a spec
produces reliable behaviour, then that's about the best you can ask of
a spec.

The problem with web reliability is that we've got incomplete specs
(i.e. there are interactions between implementations that comply with
everything that's in the spec, and things go haywire anyway). For
example, if a spec is silent on how a proxy should treat a header it
doesn't recognize, two perfectly conforming implementations can still
disagree. Plus, we've got implementations with no specs at all.

I'm not talking about standards, necessarily. A standard, by nature,
takes time to develop. But everybody playing this game should write
up some spec for what they're doing before they do it. They should
make a best effort to make the spec complete, and they should submit
the spec for review and possible standardization. This is especially
critical for features that are not "private agreements," i.e.
features that interact with parties outside the administrative
domain of the implementor.

Lord knows we've got a lot of catching up to do in order to get
to the point where there's a good spec for all the web technology.

But catering to every existing implementation in the meantime will
make this process take _even longer_. We have to draw the line
somewhere. I like to draw the line at the point where an
implementation goes against a spec that was available at the time of
implementation. There are exceptions, of course, but that's the
general rule, if you ask me.

There's lots of broken web technology right now. But most existing
web technology -- broken or not -- will be replaced in the next year
or two or five. I'd prefer to keep things simple so that the web
will be much more reliable by then, rather than make exceptions
for every broken piece of software in existence. "Bugward compatibility"
is incredibly expensive -- it's what's creating many of the
existing support nightmares.

For example, at least three engineering organizations have reported
that they developed an HTML parser first, as per the spec, and then
added compatibility hacks: when they found out the state of the HTML
that's really on the web, they reverse-engineered the Mosaic 2.4
parser. Each organization reported that reverse-engineering
the Mosaic parser took twice as long as implementing the original
SGML-based parser, even with all the screwiness of SGML.
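
(To make the contrast concrete, here's a toy sketch in Python --
purely illustrative; nothing below is taken from the Mosaic source or
the HTML spec, and the particular recovery policy shown is my own
guess at the kind of thing a lenient parser does. A strict,
SGML-style parser rejects overlapping markup outright; a
Mosaic-style parser has to guess and press on.)

    import re

    TAG = re.compile(r'<(/?)(\w+)[^>]*>')

    def parse(html, strict=True):
        """Return a trace of tag events; raise on errors in strict mode."""
        stack, events = [], []
        for m in TAG.finditer(html):
            closing, name = m.group(1), m.group(2).lower()
            if not closing:
                stack.append(name)
                events.append('open ' + name)
            elif stack and stack[-1] == name:
                stack.pop()
                events.append('close ' + name)
            elif strict:
                # SGML-style: a mismatched end-tag is simply an error.
                raise ValueError('mismatched </%s>, open tags: %s'
                                 % (name, stack))
            else:
                # Mosaic-style: guess. Implicitly close intervening
                # tags, or drop a stray end-tag, and keep going.
                while stack and stack[-1] != name:
                    events.append('implicit close ' + stack.pop())
                if stack:
                    stack.pop()
                    events.append('close ' + name)
                else:
                    events.append('ignore stray </%s>' % name)
        if stack and strict:
            raise ValueError('unclosed tags at end of input: %s' % stack)
        return events

    doc = '<b><i>overlapped</b></i>'   # common on the web today
    print(parse(doc, strict=False))    # the lenient parser recovers
    try:
        parse(doc, strict=True)
    except ValueError as err:
        print('strict parser rejects it:', err)

The point isn't this particular recovery policy -- it's that Mosaic's
actual policy was never written down anywhere, so each of those
organizations had to rediscover it by experiment.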

So the HTML spec is littered with "notes" that are essentially
documentation on the Mosaic parser. But the HTML language hasn't
changed as a result. There's an uphill battle ahead, but I actually
expect HTML on the web to become conformant over the next year or
so, because in the end, it's worth it for everybody involved.

Daniel W. Connolly "We believe in the interconnectedness of all things"
Research Associate, MIT/W3C PGP: EDF8 A8E4 F3BB 0F3C FD1B 7BE0 716C FF21
<[email protected]> http://www.w3.org/hypertext/WWW/People/Connolly