On coordination costs and human factors
In an atypically verbose exposition, Matchmaker Roy Fielding propounded thusly:
> Don't get carried away. You won't find a constraint about "nouns"
> anywhere in my dissertation. It talks about resources, as in
> *re*sources, because that is what we want from a distributed
> hypermedia system (the ability to reuse those information sources
> through the provision of links). ... The really interesting
> services provide many resources.
> Note that this is just another way of restating Reed's law in
> relation to Metcalfe's law.
Similarly, Layer Stripper Bill de Hora recently flirted in this way:
"Some people, when faced with a distributed programming problem, think
'I know, I'll expose a new service interface.'
Now they have n-squared problems."
View Sourcerer Joe Gregorio, almost in passing, shined a light on some
cobwebs (caching and authentication):
"In the process of implementing httplib2 I also discovered some rough
spots in HTTP implementations."
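As a point of reference, the caching half of that story is pleasingly
small from httplib2's side; a minimal sketch, assuming a reachable URL
(this one is a placeholder) and a server that emits sensible caching
headers:

```python
# A minimal sketch of httplib2's caching support. The URL is a
# placeholder; whether the second request is served from cache depends
# on the server emitting sensible Cache-Control/ETag headers.
import httplib2

h = httplib2.Http(".cache")             # responses get cached here
h.add_credentials("user", "secret")     # hypothetical, for guarded resources

resp, content = h.request("http://example.org/", "GET")
print(resp.status, resp.fromcache)      # first fetch: 200 False

resp, content = h.request("http://example.org/", "GET")
print(resp.status, resp.fromcache)      # cache hit if the headers allow it
```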
Putting these together and handwaving as usual...
It strikes me that the first two statements are all about changing the
frame and making economic arguments for REST, namely constraining
system design to resources, identifiers and uniform interfaces in
order to lower coordination costs... ergo the web style as an enabler
of Reed's Law.
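To put rough numbers on de Hora's "n-squared problems" quip (my own
back-of-the-envelope, not anyone's canonical accounting): with n
systems each exposing a bespoke service interface, every pair needs
its own contract, while a uniform interface asks each system to learn
exactly one.

```latex
\underbrace{\binom{n}{2} = \frac{n(n-1)}{2}}_{\text{bespoke pairwise contracts}}
\qquad \text{vs.} \qquad
\underbrace{n}_{\text{one uniform interface}}
% e.g. n = 50 systems: 1225 bespoke contracts versus 50
```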
I hadn't seen such explicit harkening to Metcalfe and Reed in the past
even though there has always been this notion of REST as
incorporating end-to-end principles. I too have argued in this vein,
making a complexity and integration argument for REST.
In a similar thread Benjamin Carlyle put it thusly: "Uniformity is key.
Simplicity at the network level is key. Managing state transparently
as resources is important for its own reasons."
Intuitively, the argument about applying architectural constraints to
get payoffs in terms of leverage has a lot of appeal to engineers. It
seems however that we need some economists to weigh in here with some
options pricing theory or other to give more heft to these arguments.
Often, decisions about software systems are not made by engineers but
rather by financiers and it helps to speak their language.
Instead of Reed's notion of group forming, it seems the argument by
analogy for REST is about application integration and the barriers
that obtain in that sphere.
Now I like the large numbers that we can throw out in these
reformulations. The thing is that Andrew Odlyzko says that both
Metcalfe's and Reed's laws are bunk despite their evident appeal and
ability to dazzle venture capitalists.
In that paper he sets out to get quantitative measurements and hard
data and puts the value of communication networks at n log(n), which is
nothing to sneeze at, better than Sarnoff's linearity but far less
than Metcalfe and Reed's estimates. Now Odlyzko was measuring the
utility of the internet, which encompasses more than the web, and you
can argue following Sam Ruby that the email and peer-to-peer styles,
whether expressed as BitTorrent, Skype or Usenet, are the true
winners in his numbers... Also you could argue with his methodology
and I have my own quibbles, but it's a brave man who takes on
Odlyzko... Thus I'll take his numbers in my handwaving, acknowledging
after all that the web was the key enabler and popularizer of the
internet.
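For a sense of how far apart these valuations sit, a throwaway
calculation (constants and logarithm bases waved away, n = 1000
chosen arbitrarily):

```python
# Back-of-the-envelope comparison of the competing valuations of a
# network of n members (constants and logarithm bases waved away).
import math

def sarnoff(n):  return n                # broadcast: linear in audience
def odlyzko(n):  return n * math.log(n)  # Odlyzko's n log(n) estimate
def metcalfe(n): return n * n            # pairwise connections
def reed(n):     return 2 ** n           # group-forming subsets

n = 1000
for law in (sarnoff, odlyzko, metcalfe, reed):
    print(f"{law.__name__:>8}: {law(n):.3g}")
```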
This gets me to the third quote from Joe, namely the perennial rough
spots and implementation quirks that are our daily bread as engineers
trying to design and produce systems for the web.
We see daily abuse of HTTP and there are annoying glitches with the
libraries and implementations that exist and also with what is
deployed in the real world. Perhaps this is because REST is the web
style rather than a programming model and consequently enforces very
little.
Now it seems to me that this is about the place where theory meets
practice and we get into the realm of pragmatism and leverage. In the
wild we see:
- the difficulties of interoperation
- differing interpretations of specifications, if indeed specs are read at all
- backward compatibility constraints e.g. for leverage and adoption,
HTTP 1.1 had to accommodate some of the pitfalls in HTTP 1.0
- difficulties in authoring structured data
- configuration and deployment issues (MIME types, content negotiation
etc; see the sketch after this list)
- competition with other styles, the web style exists in a marketplace
- vested interests and economic models e.g. limits imposed by shared
hosting providers, asymmetry of some broadband networks etc
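On the configuration front, a trivial probe illustrates the MIME-type
trouble; the feed URL is hypothetical, and the point is only that what
you ask for and what the server labels often disagree:

```python
# A trivial deployment probe: ask for Atom and see what label comes back.
# The URL is hypothetical; misconfigured servers routinely answer with
# text/xml or text/plain, defeating content negotiation downstream.
import urllib.request

req = urllib.request.Request("http://example.org/feed",
                             headers={"Accept": "application/atom+xml"})
with urllib.request.urlopen(req) as resp:
    print("asked for application/atom+xml, got:",
          resp.headers.get("Content-Type", ""))
```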
With this in mind, I wonder if I can come up with a stab towards
Koranteng's postulates on coordination costs:
1. there is a natural dampening factor in the utility of distributed computing
We can use Odlyzko's numbers as the lower bound in practice of network
effects and Reed's law as the theoretical limit (with Metcalfe's
n-squared sitting somewhere in between).
I happen to be reading Graham Greene's The Human Factor and, looking
through some of the issues that hinder adoption, many of them could be
summarized as comprehension or human variability, hence I'll
characterize the issue as the human factor. All that is left is to
augment with some Black-Scholes options thinking and financial
derivatives to package for CEOs.
2. the human factor in technology adoption is sizable and its effect
can be measured. Moreover I would argue that it should be recognized
as an explicit architectural constraint in the design of software
systems.
3. In the realm of distributed computing, this human factor is bounded
by Odlyzko's limit and Reed's law.
Mathematicians can derive the correct coefficient for me... 1 / n log(n)?
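For what it's worth, here is one way to formalize postulates 1 through
3 (strictly my own reading, pending those mathematicians): write U(n)
for the realized utility of a network of n participants, then

```latex
n \log n \;\le\; U(n) \;\le\; 2^n,
\qquad
h(n) = \frac{U(n)}{2^n} \in (0, 1]
```

where h(n) is the dampening factor of postulate 1; if practice really
does sit at Odlyzko's lower bound, the coefficient collapses to
n log n / 2^n.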
The rest as they say is advocacy and implementation details...
We are operating with imperfect specifications, imperfect frameworks
and imperfect implementations. REST as laissez-faire distributed
computing doesn't acknowledge these costs as architectural constraints
but rather seems to go about it by encouraging best practices and
hoping that by existence proof, people will come to it... If you look
at the high-level requirements that have been articulated:
- simple protocols for authoring and data transfer
- textual formats for the protocol and some of the exchanged hypermedia
- sensitivity to user-perceived latency
- Mark Baker's talk about "principled sloppiness" (i.e. the
"must-ignore" rule of extensibility)
We don't tend to enforce many of these things in protocol. I wonder
what other best practices can lower coordination costs and if they can
be encoded in protocol to remove the human factor...
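To make the "must-ignore" point concrete, here's a toy sketch of the
rule as a consumer would apply it; the x: extension namespace and the
entry itself are inventions of mine:

```python
# A toy illustration of "must-ignore": process the vocabulary you know,
# skip foreign markup instead of failing.
import xml.etree.ElementTree as ET

doc = """<entry xmlns="http://www.w3.org/2005/Atom"
                xmlns:x="http://example.org/ext">
  <title>hello</title>
  <x:mood>pensive</x:mood>
</entry>"""

ATOM = "{http://www.w3.org/2005/Atom}"
for child in ET.fromstring(doc):
    if child.tag.startswith(ATOM):          # known vocabulary: handle it
        print(child.tag[len(ATOM):], "=", child.text)
    # anything else is ignored, not rejected -- that is the whole rule
```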
Anyway food for thought...
If anything this enables me to add Reed's insight to my growing
taxonomy of the web style, which some on this list may have come
across before:
there's a tag: REST
there's a slogan: the web style
there's a Holy Book: Architectural Styles and the Design of
Network-based Software Architectures
there's a Reverend: HTTP
there's a choir: the HUHXtable quartet (HTTP, URI, HTML, XML)
there are Four Horsemen: GET, POST, PUT, DELETE
there are prophets: (you know who you are)
there are pillars: Resource Modeling, Idempotency etc
there are priests and tax collectors: the caching and other
intermediaries. Ergo "Render unto Caesar that which is Caesar's"
recast as the notion of "giving visibility to intermediaries".
there are angels and demons: a band of Apaches and various HTTP
libraries which are alternately sources of delight and exasperation
there's a Messiah: the browser (which comes with various pretenders,
Firefoxes, Great Explorers, Viking Operas and Fruity Safaris)
there are red herrings: url opacity etc
there are false gods: WS-*, crusty old architectures of appropriation etc.
there's the wilderness and prodigal children: WebDAV?
there's immaculate conception: the virtuous XML
Applet, ActiveX (some discredited)
there are scrappy offspring: Atom, RSS and Atompub
there are gruesome Philistines: implementation details such as
Structured Data, Character Encoding, Security
there are elevator pitches, Cliff's notes and ballads: Sir Tim's
lullaby of Web Architecture 101 is quite reasonable
there's myrrh and frankincense: the web as conversation
and now there's the promised land: Reed's Law as the land of milk and honey
Parting digression: I noticed last year that Roy produced a white
paper about JSR 170 (Java Content Repository) for his day job (pun
intended) applying principled constraints to the modeling of content.
I've been curious about the surprising inertia behind that
specification, having played with IBM implementations in the past few
years. Perhaps the immaturity of that space is a testament to
some of these arguments about coordination costs applied to the
marketplace of data (relational, object relational, XML, SQL, XPath,
XQuery, ActiveRecord, ODMA, Spring, Hibernate, SDO etc).
I wonder if the Atom store dream is the way to go: rather than
apply the constraint of an API and a language, Java, in a world in
which we have a Tower of Babel of languages and persistence
frameworks, it makes more sense to focus on a wire protocol (as in the
Atom Publishing Protocol) and a wire format (say Atom). In other words
the greater payoff would be not in establishing a programming model
(the JCR) but rather in moving to Atompub, which is agnostic on the
underlying programming model and lowers the coordination costs by
stripping a layer of comprehension from the mix. All this of course is
modulo the quirks of compound documents, media collections etc...
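To make the comparison concrete, here is what that wire-protocol bet
looks like on the wire; a minimal AtomPub creation sketch, with a
hypothetical collection URI and credentials (and httplib2 again doing
the honors):

```python
# A minimal AtomPub creation request, sketched with httplib2. The
# collection URI and credentials are hypothetical; a conforming server
# answers 201 Created with a Location header for the new resource.
import httplib2

entry = """<entry xmlns="http://www.w3.org/2005/Atom">
  <title>coordination costs</title>
  <content>handwaving as usual...</content>
</entry>"""

h = httplib2.Http()
h.add_credentials("user", "secret")
resp, body = h.request(
    "http://example.org/collection", "POST", body=entry,
    headers={"Content-Type": "application/atom+xml;type=entry"})
print(resp.status, resp.get("location"))
```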
Anyway having written this much, it's probably worth blogging at some point...
Koranteng's Toli - http://koranteng.blogspot.com/