[BITS] Week7 readings : Modularity in standards and markets

rohit@godzilla.ICS.uci.edu
Tue, 17 Feb 1998 13:44:46 -0800


---
[As usual, the indented bits are Phil Agre's words, not mine. --RK]

Week 7 / Modularity

An emerging theme in the literature is the interaction between the structure of products and the structure of industries and firms. We will consider the specific case of modularity. Technical people tend to portray modularity as an ahistorical design norm, but this approach cannot tell us the conditions under which markets produce modular systems. We will consider the matter in both its empirical and strategic aspects.

RK: This week, we finally get a little deeper into the internal structure of standards lattices. In practice, there are battlefields of standardization that seem really relevant at a moment in time, but later become "sedimented" in place because the battlefield has moved up a layer or aside to address a 'reverse salient' (i.e. something else is gumming up the works).

Now, Agre is right that I *do* think of modularity as a Good Thing. It's an essential engineering design rule. It doesn't occur to me as a technologist that it's even subject to market ratiocination. Even if you can't sell your product in modules, you have to design it that way to remain sane. Yet, this week's readings help elucidate a lot of latent biases about when a modular market is made visible to consumers, not just product designers: when there is market demand for many configurations; when there is market demand for niches within component-classes; and when there are multiple players in a set of complementary markets.

On the other hand, another major bias that I have going in is that modularity is more often determined by the interconnect than the market. That is, a technologically determined modularity. My automatic impulse is that stereo hi-fi components emerged as much because there was a trivial representation of the common intermediate (analog electrical audio signals) as because of the marketing plays. Necessary, but not sufficient. That's why I'm more amenable to seeing modularity in power tools (common motor, gearing interconnects) than in computer hardware design or educational curricula.

Kim B. Clark, The interaction of design hierarchies and market concepts in technological evolution, Research Policy 14, 1985, pages 235-251.

Clark argues that markets mature from a fluid state toward a more rigid, standardized state in large part through the consolidation of the customers' concept of the product. Examples are drawn from cars and semiconductors.

To be sure, Clark carefully cites previous work that used the chemist's supersaturated->crystallize imagery. The Abernathy-Utterback model, I believe.

The real contribution of this paper in the context of the readings as a whole is its elucidation of a) reverse salients and b) localized technical battles. First, there are several examples of how continued refinement of technology in one part of the hierarchy (e.g. engines) can progress far enough that another component now lags behind (e.g. transmissions). The second, though, helps formalize the insight that the market often acts like a drunk looking for its keys beneath the lamppost. For a long time, the producers will focus on innovation in one aspect: memory architectures, or DTP layout, or speaker design. And then, seemingly suddenly, someone moves the lamppost and all the brightest minds in the industry are worried about graphics chips, color fidelity, and digital audio buses.

[It's also fun to take this class in parallel with King's 230, because a lot of the references to the history of technology are being covered over there.]

Carliss Y. Baldwin and Kim B. Clark, Managing in an age of modularity, Harvard Business Review 75(5), 1997, pages 84-93.

This is a relatively breezy article for managers about the competitive issues that arise as markets evolve toward modularity.

Breezy because it also overstates the applicability of the insight. At some level *every* business unit acts like a module. *Contracts* define modules: whether you call it in-stream outsourcing or modular marketing seems less relevant at this level. They illuminate the metaphor in several example industries ("driver cockpit modules" by GM for MBZ, etc), but I think they could profitably focus only on producers in modular *markets*, which is a stricter subset than producers with modular *processes*. Otherwise, you regress to claiming divisional corporate governance is "modular", too.

Richard N. Langlois and Paul L. Robertson, Networks and innovation in a modular system: Lessons from the microcomputer and stereo component industries, Research Policy 21(4), 1992, pages 297-313.

This is a qualitative analysis of the conditions under which complicated products such as computers and stereo systems are provided as separate modular components, and when they are provided as integrated products. They suggest that modularity is linked with horizontal and vertical disintegration, and they express cautious (and I think far too hopeful) optimism that markets tend toward modularity because of the efficiencies that disintegration brings.

[Tsk, tsk, "disintermediation" is the ISO-standard buzzword nowadays...]

This is my favorite paper of the week. It advances an abstract model of widget-producing networks with composite widgets, but does not stray from the two concrete stories at hand, either. The hi-fi case is the most novel, and left one question unanswered: just what *are* the standards for module interconnect (resistance of the wiring, audio level, ??), especially as compared to later buses (DAT, firewire). Isn't it relevant that the mini-DIN (German std, right?) headphone connector and "RCA jack" became critical interconnects?

[For those who haven't read it, they explain the original evolution of modular hi-fi from the monolithic phonograph market by tracing the acts of niche consumers (classical hi-fi buffs) who actively patched new turntables into their radio amps, adopted new LP formats, and so on. Hi-fi lovers couldn't marshal individual acts to create a market for FM radio, though -- that took much longer to adopt. I'll skip the usual paean to PC architecture.]

"experimentation is amuch more important concern than coordination. And trial-and-error learningn is one forte of a decentralized network. "

Nicholas Economides and Steven C. Salop, Competition and integration among complements, and network market structure, Journal of Industrial Economics 40(1), 1992, pages 105-123.

Economides and Salop provide a mathematical model of complementary products in network markets. The remarkably difficult question is when the products are provided as an integrated unit by the same firm and when they are provided independently in the marketplace.

Normally, I chauvinistically reserve the best paper of the week for the economist's math, but this wasn't as illuminating as I'd hoped. Or perhaps it's just the cross-partials.

They are exploring variants of the Cournot complementary duopoly result -- a single brass monopolist will offer a lower price than a world with zinc and copper monopolists gouging each other at the intermediate stage. Of course, the brass monopolist is still making a surplus over optimal consumer welfare, but he's squeezed out the middleman. From that vast oversimplification we go on to a 2x2 market with two producers of each kind of component and four resultant products. Consider what the returns are when producers of each component type merge (vertical), or of each finished product ("composite good") (horizontal).

Results: prices are lower when complements merge (zinc1 with copper1). Everybody merging may or may not lower prices, though. There are also lower prices if one component-market has the upper hand and can set prices (i.e. only one side is a monopoly).
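The baseline Cournot complements result can be checked numerically. A toy sketch (my own numbers and variable names, not from the paper), assuming linear demand q = a - (p1 + p2) for the composite good and zero marginal costs: independent zinc and copper monopolists converge to a total price of 2a/3, while the integrated brass monopolist charges only a/2 -- and still earns a larger joint profit.

```python
# Toy numeric check of Cournot's complementary-monopoly result.
# Assumptions (mine, for illustration): linear demand q = a - (p1 + p2),
# zero marginal cost for both components.

a = 12.0  # demand intercept (hypothetical)

# Independent zinc/copper monopolists: iterate the best-response
# functions p_i = (a - p_j) / 2 until they converge to Nash equilibrium.
p1 = p2 = 0.0
for _ in range(100):
    p1 = (a - p2) / 2
    p2 = (a - p1) / 2

separate_total = p1 + p2                  # converges to 2a/3 = 8.0
profit_separate = separate_total * (a - separate_total)

# Integrated brass monopolist: choose total price P to maximize P*(a - P),
# which gives P = a/2 in closed form.
P = a / 2                                 # = 6.0
profit_integrated = P * (a - P)

print(separate_total, P)                  # 8.0 vs 6.0: integration is cheaper
print(profit_separate, profit_integrated) # 32.0 vs 36.0: and more profitable
```

So the integrated monopolist undercuts the double-marginalized pair *and* pockets more surplus, which is the "squeezed out the middleman" point above.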

One thing they left for future work was the antitrust implications of tying: what if there is price discrimination between people who buy the composite good brass1 and those who buy zinc1 or copper1 individually? (Why do large OEMs appear to get Windows + Office for less than other OEMs get Windows alone -- is the marginal price of a $600 office suite really $4-$8 above Win95 in an open market?)

Marc H. Meyer and Alvin P. Lehnerd, The Power of Product Platforms: Building Value and Cost Leadership, New York: Free Press, 1997. Chapter 2: Managing Product Platforms.

A product platform is a common core for a whole family of related products. By defining a set of in-house standards, the platform permits design and manufacturing costs to be shared among several products. This chapter discusses the interaction between the structure of a product family and the structure of the market spaces that the various products will address.

This chapter is mainly a case study of Hewlett-Packard's entry into the low-end inkjet market. They decided to create a common platform for economies of scale: a single ink cartridge, common firmware, common mechanicals, and so on. Later evolution generated new platforms by incrementalism (price shaving) and by feature-enhancement (speed, page description language, ink).

Subversive thought: it could just as easily be argued that the inkjet project was an argument against modularity: that the broad R&D required a large company with a common interest and high internal bandwidth: that the "modular" plan they had for platforms could not have been devolved to the market by asking for "modular" suppliers to develop the requisite innovations. That, in short, the economies of scale and of scope required to make a <$500 inkjet possible *at all* argues for *integration*. So, modularity does not always imply disintermediation. Look at Office :-)