The Mythology of Steve Jobs

The deification of Steve Jobs is a truly remarkable sociological phenomenon. There has been good sociological commentary on this already, including a post by Kieran Healy applying a Weberian analysis of charismatic authority to Jobs and a post by Teppo Felin on the social construction of Steve Jobs (also see a post by Shamus Khan on the Foxconn sweatshops that make Apple products).

What I want to add here is an argument that not only is the exaltation of Jobs explicable as a reaffirmation of the American mythology of individualism and free markets, but, more provocatively, the Jobs-as-Great-Man narrative is wrong in assigning so much responsibility for Apple’s ostensibly trailblazing products to a single individual. Against both the American mythology and mainstream economics, technological innovation is better conceived as a collective endeavor.

It is not hard to find mainstream commentators trumpeting the mythology of Steve Jobs. Witness Joe Nocera in a recent New York Times column, complacently chanting the uplifting mantra that Jobs’ accomplishments range “from starting the personal computer industry in his garage to creating a half-dozen of the most iconic consumer products ever invented.”

So much nonsense. I want to propose here a more irreverent interpretation of Jobs’ legacy by making a counterfactual argument: If Steve Jobs had never been born, we would still have products remarkably similar (in functionality, user-friendliness and coolness) to the iPod, iPhone and iPad. Let me elaborate.

The notion of the Great Man has been central to American ideology for many decades. Indeed, sociologist Orrin Klapp wrote about it in an article on “Hero Worship in America” in a 1949 issue of the American Sociological Review. Klapp noted that cultural heroes often generate curiosity about themselves by remaining distant from the public, generating a shroud of ignorance that provides the basis for the formation of legends.

Jobs fit this pattern well, cultivating distance and mystique. Part of his carefully controlled image was that of a man who micromanaged every aspect of his business. Much of this may be true. But he simply could not have done it all himself.

Apple currently has around 47,000 full-time equivalent employees and an additional 2,800 full-time equivalent temporary employees and contractors. While it is unclear how many of these work in R&D, Apple spent $1.8 billion on R&D in 2010.

Certainly Jobs was a brilliant designer, and iPhones and iPads remain the coolest of their class. But the simple fact is that these products resulted from massive amounts of research and development by an unknown number of researchers and developers working in teams, building on existing technologies that were the result of the collective efforts of tens, perhaps hundreds of thousands of people working in the electronics and telephone industries for decades.

Taking a longer, more sociological view, it appears that Jobs caught a massive wave of ongoing technological innovation at just the right time. Sure, his team added important elements of style and made key investments in the products emerging from this technical tsunami. But there were MP3 players before the iPod, and the idea for a smart phone has been around at least since Gene Roddenberry’s original Star Trek series, with later Star Trek series introducing versions of e-readers and tablet computers. By my account, this would suggest that Jobs was not so much the Great Man who revolutionized technology as a clever designer who was in the right place at the right time to be a bit – and only a bit – ahead of the curve.

The sociological upshot is that technological innovation is generally the outcome of the collective labor of many individuals working together in teams, and in social contexts that foster the dissemination of ideas (as AnnaLee Saxenian argued in her book Regional Advantage on Silicon Valley). And, moreover, innovation generally happens not because but in spite of market forces. As Richard Lester and Michael Piore have shown in their book on innovation, critical technical innovations tend to be produced when groups of individuals are able to work together in places that are protected from market forces. The canonical examples are the corporate labs of General Electric, Xerox, IBM or, indeed, Apple and Google, which are able to invest in basic long-term R&D.

If the Mac came out of Jobs’ garage, it is the exception that proves the rule. It did not “start the personal computer industry.” In 1968, while Jobs was still in junior high school, Hewlett-Packard introduced its 9100A personal computer. In 1973 the Alto workstation computer debuted at Xerox’s Palo Alto Research Center. That same year, the Micral N microcomputer, the first non-kit commercial computer with a microprocessor, went to market.

It was not until 1976 that Jobs and Stephen Wozniak, the actual engineer, released their first personal computer, and not until 1984 that Apple shipped the Macintosh, with its graphical user interface and mouse – but both the GUI and the mouse were developed by Xerox at its Palo Alto Research Center, sheltered from market competition in a completely different world from Steve Jobs’ garage. Moving forward 25 years, it took billions in investment to produce the iPad. If it wasn’t Apple that produced the first viable smart phone and tablet computer, it would have been some other multinational corporation with deep enough pockets to fund the research.