<DISCLAIMER>
Thought I'd toss out a few thoughts and see if they resonate with
anyone. Rohit asked me in private e-mail a while back for my
perspectives on my / Activerse's and others' efforts to standardize an
event-notification protocol, specifically for presence information
within the IETF process. While I don't want to pick apart that
particular effort too much right now, I thought I might muse a little
bit about the standards process in general and a few other things. This
is kind of high-level and stream of consciousness, so no guarantees at
all that it's particularly relevant or insightful. And I'm absolutely
certain that little if any of this is of any strategic consequence
whatsoever. Finally, apologies for the sparse refs; I'm feeling lazy
this afternoon.
</DISCLAIMER>
It's clear to me that the IETF is collapsing under its own (growing)
weight. With the involvement of so many big players with large, heavily
dollar-motivated agendas, the IETF seems to be moving away from actually
achieving rough consensus and toward serving primarily as a battleground
for commercial turf wars and a venue for ego- and PR-motivated posturing.
I sometimes think
that some of the parties involved have as their primary agenda keeping
things delayed in process and confused long enough to establish de facto
standards.
Where "progress" does appear to be happening, I'm not sure the results
are all that positive. HTTP, for instance, seems to be becoming the
world's trashcan. Everybody and their dog seems to want to throw new
methods and semantics into HTTP. Why? The resulting mess is going to
be scary as hell. The "right" technical solution is to produce
purpose-built application protocols that sit directly on top of TCP and
UDP, with appropriate state machines, semantics, etc. But it's just too
damn difficult to get any commercial momentum outside of HTTP; there's
too much inertia, to say nothing of the resistance within the IETF to
these efforts. Firewalls and other concerns make it very difficult to
succeed outside of the existing protocols, so we try to extend HTTP
and/or tunnel other protocols through it. Fine, leverage HTTP in some
way. (See our own Mark Day's draft [1] for a treatment of the problem of
HTTP envy in the context of presence protocols.) But adding methods doesn't
seem to be the right thing to do, does it? HTTP should remain a simple
request-response protocol; all the other crap we want "it" to do
(access control, authentication, DAV-like stuff, event notification)
really just pollutes the protocol space and ought to live at a higher
level of abstraction if it simply has to be tied to HTTP. (And by the
way, I'm re-convinced that it does have to be tied to HTTP. Let's face
it: HTTP is the new transport layer.)
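To make that concrete: rather than minting a new SUBSCRIBE-style method,
a presence subscription could just ride in the body of an ordinary POST
to some agreed-upon resource. Something like this, say (headers
abbreviated; the element names and URLs are made up, purely to show the
shape of the thing):

    POST /presence/subscriptions HTTP/1.1
    Host: presence.example.com
    Content-Type: text/xml

    <subscribe>
      <resource>http://presence.example.com/users/jb</resource>
      <events>status-change</events>
      <callback>http://watcher.example.com/inbox</callback>
    </subscribe>

HTTP stays a dumb request / response transport; all of the subscription
semantics live in the payload and in conventions the two ends agree on.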
Everybody talks up the notion of an "object web." The CORBA folks want
to position their wares as the solution, but CORBA is far outside of the
spirit of an "Internet" technology in many ways. IIOP is not a
particularly pretty little protocol. Further, IIOP stands outside the
sphere of most of the services and abstractions that our other familiar
protocols and data models build on.
Maybe XML can help somehow. After all, at some level, XML is a sort of
metamodel for arbitrary datamodels, as well as a marshalling format.
Presumably everyone is off busy as beavers creating DTDs for every
interesting application data model under the sun. Now we get into the
same kind of standards bickering and so forth while trying to agree on
schemas as we had in the protocol space. And it doesn't address the
protocol aspects at all; at best it gives us tools for addressing them. At
the end of the day, we're still in a mess, possibly a bigger one ---
we've got a zillion different DTDs, a bunch of versions of each, etc.,
and haven't dealt with protocol issues at all. We *still* really want
to be at a higher level of abstraction, dealing with object graphs and
interfaces and method invocations in an XML-ish and HTTP-friendly way.
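Just to pin down what I mean by a DTD per application datamodel:
presence, say, might come out something like this (entirely made up,
just for flavor), with the DTD playing metamodel and the instance
document playing marshalled data:

    <!ELEMENT presence (user, status)>
    <!ELEMENT user   (#PCDATA)>
    <!ELEMENT status (#PCDATA)>

    <presence>
      <user>jb</user>
      <status>online</status>
    </presence>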
I took another look at XML-RPC [2] a week or so ago, and revised my
opinion on it. I'd sort of dismissed it for various reasons some time
ago. But there are really the seeds of some good ideas in there! It
adds no methods to HTTP in its current implementation, though it could
benefit from one (INVOKE). It's really just a simple remote procedure
call / return protocol sandwiched in under POST, with a standard set of
representations of primitive datatypes and an XMLish way to marshal and
pass data structures. Very cool!
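For anyone who hasn't looked at it, a call on the wire comes out roughly
like this (headers abbreviated, and the method name is one I made up; the
envelope structure is straight out of the spec):

    POST /RPC2 HTTP/1.0
    Content-Type: text/xml

    <?xml version="1.0"?>
    <methodCall>
      <methodName>presence.getStatus</methodName>
      <params>
        <param><value><string>jb</string></value></param>
      </params>
    </methodCall>

and the response is the mirror image:

    <?xml version="1.0"?>
    <methodResponse>
      <params>
        <param><value><string>online</string></value></param>
      </params>
    </methodResponse>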
Problem is, it's still not at a high enough level of abstraction ---
i.e., objects and graphs of objects --- and it doesn't provide the kinds
of support infrastructure we'd want, like first-class interface
definitions to support interface discovery etc. The basic concept could
easily be extended in that direction. It seems to me that if somebody
produced a "generic object model / marshalling format" DTD and an
"object interface definition" DTD for XML that was typewise congruent
with the IDL type system, it would be of tremendous utility and strategic
importance. In conjunction with the invocation / return protocol under
POST or in a new INVOKE method, we'd have most of the building blocks
for all of the things we want to do, and do them at a high level. (We'd
still have a large amount of infrastructure to build, but we'd have the
fundamental building blocks to get the job done, and we'd have a template
/ lessons learned from CORBA.) And we'd largely avoid the morass of
standards processes. Though we'd still eventually have to agree on
object models and interfaces for common applications, in my experience
it's easier to agree on those things than on protocol issues. Finally,
the resulting solution (it seems to me) would simplify implementation
dramatically. The result would be a human-friendly, Web-centric
delivery of all the wonderful stuff that CORBA was supposed to provide
for us. A good thing, surely?
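Off the top of my head, an interface description in that style might
come out something like this (entirely hypothetical, element names and
all, just to give the flavor; the types would mirror the IDL basics):

    <interface name="Presence">
      <operation name="subscribe">
        <param name="watcher" type="string" direction="in"/>
        <result type="boolean"/>
      </operation>
      <operation name="getStatus">
        <param name="user" type="string" direction="in"/>
        <result type="struct"/>
      </operation>
    </interface>

A client could fetch that document over plain HTTP and know everything
it needs to marshal an invocation, which is exactly the kind of
interface discovery support I was wishing for above.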
Just some thoughts,
jb
[1] http://search.ietf.org/internet-drafts/draft-day-envy-00.txt
[2] http://www.xmlrpc.com/