“I mean, his whole thing of knowing exactly what he’s going to say, but up on stage saying it in such a way that he is trying to make you think he’s thinking it up right then …” Gates just laughs.
Making the “1984” ad with Steve was a pirate enterprise for creative director Clow, art director Brent Thomas, and Steve Hayden, who wrote the copy. Steve didn’t let the board see the ad until a couple of days before the Super Bowl, and they were horrified. Directed by Blade Runner’s Ridley Scott, the sixty-second spot features a lone woman, in color, running through a sea of gray men and women listening obediently to a huge talking head nattering threateningly from an enormous screen about the enlightened potential of absolute conformity. As the ad nears its end, the woman hurls the large hammer she’s been carrying and smashes the screen. A simple line follows: “On January 24th, Apple Computer will introduce Macintosh. And you’ll see why 1984 won’t be like ‘1984.’ ” Sculley got cold feet and told Chiat\Day to sell off the expensive Super Bowl ad space it had purchased.

…

Others found the experience exhilarating, but not something they’d want to repeat, and left Apple to find a less stressful employment environment. And then there was the small group of folks who loved it so much they stuck around, ready to do whatever it would take all over again, in order to work in the rarefied, exhilarating, and charged atmosphere that Steve created when he was running the show. When the job was over, Steve had the signatures of the forty-six key players on the team engraved on the inside of every Mac. Even people working on the Apple II found Steve’s performance inspiring. “We used to say that the Mac people had God on their side,” said one only half jokingly.
THE DEBUT OF the Macintosh established Steve as a master showman. Between the famous “1984” ad, which played just once, during the Super Bowl broadcast on January 22, 1984, and the Mac’s official presentation at the Flint Auditorium on the campus of Cupertino’s De Anza College on January 24, 1984, Steve transformed expectations of what a product introduction could be.

…

NOW STEVE FACED the challenge of delivering on this promise within the gnawing confines of Apple. It would be a staggeringly ambitious project—one that no one at Apple but Steve could have imagined, and one that no one but he could have made so maddeningly complicated. The long road had many detours and would be pockmarked with collateral damage, but it would eventually lead to the introduction of the Macintosh computer in 1984.
After that visit to Xerox PARC, Steve completed what had been a slow abandonment of the Apple III development. The more he realized that the machine was simply a modest renovation of the Apple II, the more his attention wandered. Now he turned away completely, with the intention of applying what he’d learned at PARC to another computer already under development at Apple. This machine was specifically designed for Fortune 500 companies that required heavy-duty networked computing to accomplish tasks that were significantly more data-intensive than anything that could be handled by the Apple II or even the Apple III.

“So I had this prejudice that computers were things that stapled you and punched you,” Leary recalled.72
The military-run, prohibitively expensive, all-controlling IBM supercomputer was the epitome of both big business and big government. IBM was “Big Brother,” as Leary saw it. This lingering image is what Apple mocked with such ingenuity in its famous one-minute 1984 Super Bowl ad, directed by Ridley Scott. An athletic blonde woman in T-shirt and shorts is seen charging past storm troopers, right into the heart of power, carrying that most iconic of tools, a sledgehammer. Then she throws the hammer, smashing the oppressor’s larger-than-life image: “On January 24th, Apple Computer will introduce Macintosh,” the video concludes, “and you’ll see why 1984 won’t be like ‘1984.’”73 The personal computer had become the ultimate power tool of liberation.
Leary purchased his first personal computer in early 1983. “I’ve learned so much about drugs and the brain in the last six months from working with a personal computer,” he told the audience at the Julia Morgan Theatre in Berkeley in July 1983.

For Christmas that year, Gibson finally bought an Apple II at a discount. The machine’s successor model, the Macintosh, had been launched so effectively nearly one year earlier with the legendary cyberpunk ad “1984,” but the older Apple II was still a best-selling device.
When Gibson booted up the machine at home and got ready to use it, he was shocked by the computer’s mundane mechanical makeup. “Here I’d been expecting some exotic crystalline thing, a cyberspace deck or something, and what I’d gotten was something with this tiny piece of a Victorian engine in it, like an old record player.”46 The science fiction writer called up the store to complain. What was making this noise? The operator told him it was normal; the disk drive was simply spinning in the box that was the Apple II. Gibson’s ignorance about computers, he recounted, had allowed him to romanticize technology.

Adobe Systems was formed by John Warnock and Charles Geschke,
who pioneered laser printing technology at Xerox PARC in the late
1970s. In 1982, when Xerox had failed to market the technology,
Warnock and Geschke started their own company.61 Adobe grew rapidly,
supplying a software technology known as PostScript for manufacturers of
laser printers and for the Apple Macintosh. That the Macintosh was subsequently able to dominate the high-end desktop publishing market was
due largely to Adobe’s technology. By 1984, half of Adobe’s income came
from Apple royalties. By the late 1990s, however, Adobe’s PostScript technology was no longer unique; both Apple and Microsoft had developed
their own systems. Recognizing that its niche in printing software was
evaporating, Adobe made a number of strategic acquisitions in order to
diversify into desktop publishing and electronic document distribution.
Intuit was established in 1983 by Scott Cook, a former Procter &
Gamble brand manager.

…

The concept of a windows-based operating system had originated in
the 1970s at Xerox’s Palo Alto Research Center (PARC),26 where most of
the ideas now standard in a graphical user interface, including overlapping windows, pull-down menus, and point-and-click task selection by
means of a mouse, originated. The work at Xerox PARC had led to the
Xerox Star, announced in May 1981—a failure in the market, primarily
because its price ($40,000) was much too high for a personal computer.
The concept of the graphical user interface was also adopted by
Apple Computer for its Lisa computer, launched in May 1983. Though
universally regarded as a path-breaking product, the Lisa also failed in
the market because of a high price ($16,995). Apple Computer’s second
attempt—the $2,500 Macintosh, launched in January 1984—was much
more successful. The Macintosh’s unique selling point was its user-friendly interface.27 It succeeded in capturing 5–10 percent of the
personal computer market for the next decade. But because it was a
proprietary system, it never attracted as many software and hardware
suppliers as the IBM-compatible PC.

…

In early 1985, Microsoft and IBM had begun joint
development of a new operating system intended to be the long-term
replacement for MS-DOS.
Meanwhile, inside Microsoft, development of Windows continued
under its own momentum. In late 1987, Windows 2.0 was released to modest acclaim. The interface had been polished considerably, and its main
visual elements were almost indistinguishable from those of the Macintosh.
Microsoft had obtained a license from Apple Computer for Windows 1.0
but had not renegotiated it for the new release. Version 2.0 so closely emulated the “look and feel” of the Macintosh that Apple sued for copyright
infringement in March 1988. The Apple-vs.-Microsoft lawsuit consumed
many column-inches of reportage and rattled on for 3 years before a settlement in Microsoft’s favor was reached in 1991.33 So far as can be ascertained, the lawsuit was something of a sideshow that had little bearing on
Microsoft’s or any other company’s technical or marketing strategy.

Macintosh debuted in 1984, targeted to a family market, but it promptly fizzled. Its initial failure followed a remarkably cinematic one-time-only television ad that aired during the 1984 Super Bowl. Later, after the computer was repositioned, the Macintosh achieved a 10 percent market share primarily as a result of its use in desktop publishing and education. The easy learning curve of the Mac’s intuitive GUI desktop made it ideal for use in the classroom among first-time student users who knew nothing about operating systems or command lines. Buying only one Mac per classroom made the computer a very affordable tool, and in this way Jobs’ and Raskin’s invention accessed and influenced an entire generation.

…

The next year, Jobs lost control of the Lisa project at Apple, and after a bitter corporate battle he was reassigned to administer another project—a less powerful computer whose design originated with Apple architect Jef Raskin. Raskin had worked at SRI in the early 1970s when Engelbart’s group was still focusing on problems with the “man and computer” interface. At SRI, Raskin also had extensive contact with the PARC personnel. Jobs now insisted that Raskin’s new desktop computer should have features that were not in the original design, including Engelbart’s mouse.43
Before he left Apple in 1982, Raskin named the new computer after a favorite variety of apple that grew abundantly in the hills around Cupertino. Raskin’s name was really a pun, of course. To most buyers, a Mac computer was a variety of Apple computer, just as a Macintosh was a kind of apple.

…

The machine’s excellent performance relied on more than a megabyte of memory to run an elegant new operating system. Macintosh’s designers pilfered Lisa’s OS and rewrote it in greatly reduced machine code so that it would fit onto a single chip. The Macintosh project was cocooned in a separate building over which Jobs himself hoisted a pirate flag. John Sculley remembered that “Steve’s ‘pirates’ were a handpicked pack of the most brilliant mavericks inside and outside Apple. Their mission . . . was to blow people’s minds and overturn standards . . . The pirates ransacked the company for ideas, parts, and design plans.”45
Lisa turned out to be exactly the overpriced marketing disaster Jobs had predicted. Macintosh quickly became the company’s only hope of survival, and Jobs regained enough influence within Apple to install John Sculley as CEO by late summer. As the elegant Macintosh approached its release date, Jobs got each of the forty-seven members of his team to sign their names inside the molding of the original design for the Macintosh case.46

(Long obsolete, those original Macintoshes are now much sought after by collectors.)
In early 1983, with less than a year to go before the Macintosh launch, Jobs persuaded Sculley to become CEO of Apple. This was seen by commentators as a curious choice because the forty-year-old Sculley had achieved national prominence by masterminding the relaunch of Pepsi-Cola against Coca-Cola in the late 1970s. But behind the move lay Jobs’s vision of the computer as a consumer appliance that needed consumer marketing.
In what was one of the most memorable advertising campaigns of the 1980s, Apple produced a spectacular television advertisement that was broadcast during the Super Bowl on 22 January 1984:
Apple Computer was about to introduce its Macintosh computer to the world, and the commercial was intended to stir up anticipation for the big event. It showed a roomful of gaunt, zombie-like workers with shaved heads, dressed in pajamas like those worn by concentration camp prisoners, watching a huge viewing screen as Big Brother intoned about the great accomplishments of the computer age.

…

Microsoft, which came late to the PDA/smartphone platform business by licensing Windows-based mobile operating systems, had some success in the enterprise market before smartphones became consumer oriented and the touchscreen-based Apple iOS and Android systems rose to dominance.
While Apple’s Macintosh was a technical success at its launch in 1984, it helped Microsoft far more than Apple itself (by showing the dominant operating-system company the way to a user-friendly graphics-based operating system). Apple Computer was struggling as a company in the mid-1980s, and co-founder and Macintosh team leader Steve Jobs lost a boardroom battle, was isolated from Apple’s management, and elected to resign from the firm. In 1985 Jobs formed NeXT, a computer platform development company focused on the educational and business markets. Jobs also acquired the small computer graphics division of Lucasfilm, which became Pixar—the studio’s IPO later made him a billionaire.

…

Suddenly, a tanned and beautiful young woman wearing bright red track clothes sprinted into the room and hurled a sledgehammer into the screen, which exploded into blackness. Then a message appeared: “On January 24, Apple Computer will introduce the Macintosh. And you’ll see why 1984 won’t be like 1984.”
Apple ran the commercial just once, but over the following weeks it was replayed on dozens of news and talk shows. The message was reinforced by a blaze of publicity eventually costing $15 million. There were full-page newspaper advertisements and twenty-page copy inserts in glossy magazines targeted at high-income readers.
Although priced at $2,500—only 15 percent of the cost of the Lisa—sales of the Macintosh after the first flush of enthusiasm were disappointing. Much of the hope for the Macintosh had been that it would take off as a consumer appliance, but it never did. Sculley realized that he had been misled by Jobs and that the consumer appliance idea was ill-conceived:
People weren’t about to buy $2,000 computers to play a video game, balance a checkbook, or file gourmet recipes as some suggested.

April 27: Xerox unveils the Star workstation, the commercial offspring of the Alto and other PARC technology, at a Chicago trade show to wide acclaim.
August 24: IBM unveils the Personal Computer, forever altering the commercial landscape of office computing and making the Star obsolete.
May: Apple introduces the Lisa, a personal computer with a graphical interface based on principles developed at PARC.
September 19: Bob Taylor resigns from PARC under pressure. Within a few months many of the center’s top computer engineers and scientists will resign in sympathy.
January: Apple introduces the Macintosh, the popular successor to the Lisa and the most influential embodiment of the PARC personal computer, with a striking
“1984”-style television commercial during the Super Bowl.
INTRODUCTION
The Time Machine
It was April in California’s Santa Clara Valley, a fine time to be changing the world.
Very late one night in 1973 a small group assembled inside the office of an electronics engineer named Charles P.

…

That sign appeared one day in 1979, when a Silicon Valley legend in the making walked through PARC’s front door.
CHAPTER 23
Steve Jobs Gets His Show and Tell
Thus we come to Steven P. Jobs.
The Apple Computer cofounder’s visit to PARC, from which he reputedly spirited off the ideas that later made the Apple Macintosh famous, is one of the foundation legends of personal computing, as replete with drama and consequence as the story of David and Goliath or the fable of the mouse and the lion with an injured paw. It holds enough material to serve the mythmaking of not one corporation but two, Xerox and Apple. If one seeks proof of its importance, one need look no further than the fact that to this date no two people involved in the episode recollect it quite the same way.
For a chronicler of PARC this presents a unique difficulty.

…

In an unexpected burst of proprietary pride, Xerox turned him down. (The company had already divested its equity in Apple, thus missing out on the computer company’s extraordinary run-up in value at the time of its 1980 initial public stock offering.) Steve Jobs made his offer instead to Tesler, one of Smalltalk’s developers.
Heeding the mysterious tarot, Tesler accepted the job that April.
He would go on to head the Lisa user interface team and to help design the Macintosh, eventually rising to the position of Apple’s chief scientist. The sign he was waiting for had come. PARC’s elitism had begun to seem threadbare, and even a little reactionary.
“I remember once I said to Bob Taylor, ‘You know, I’ve been going to these Homebrew computer meetings and I’ve been talking to people at Apple and hanging out in the personal computer scene.
There’s a lot of smart people out there who are going to run way ahead of PARC in PCs.

As early as 1972, Brand had suggested that
computers might become a new LSD, a new small technology that could be
used to open minds and reform society. During the Super Bowl of 1984,
Apple Computer introduced its Macintosh with a like-minded suggestion.
Its mouse and monitor might have ﬁrst been designed in research institutes
funded by the Defense Department, but in the ad, a lithe blonde woman in
a track suit raced up a theater aisle through row after row of gray-suited
workers and threw a hammer into the maw of Big Brother on the screen.
Thanks to the Macintosh, a voice then intoned, 1984 would not be like 1984.
Like the Merry Pranksters in their bus, the ad implied, the executives of
Apple had unleashed a new technology on Americans that would, if they
only embraced it, make them free.
By 1984 the New Communalist movement had disappeared. Nevertheless, thanks in large part to the entrepreneurship of Stewart Brand and
the networks he assembled, its ideals lived on.

…

The great machines of empire had been miniaturized and turned over to individuals, and so transformed into tools
with which individuals could improve their own lives.
Like many myths, this one contains several grains of truth. The 1970s
did in fact witness the rise of a new form of computing, and Bay area programmers, many with countercultural leanings, played an important
part in that process. And as they were distributed, some of the new
computers—particularly the 1984 Apple Macintosh—were explicitly
marketed as devices one could use to tear down bureaucracies and
achieve individual intellectual freedom. Yet, the notion that the
counterculture gave rise to personal computing and computer networking
obscures the breadth and complexity of the actual encounter between the two
worlds. As Stewart Brand’s migrations across the 1960s suggest, New Communalist visions of consciousness and community had become entangled with
the cybernetic theories and interdisciplinary practices of high-technology
research long before computers were miniaturized or widely interlinked.

…

The hacker ethic helped make hackers particularly appealing to Stewart
Brand and Kevin Kelly. Soon after Levy had shown them his book, Brand
and Kelly got in touch with members of the hacking community, including
Lee Felsenstein; Bill Budge, a software author; Andy Hertzfeld, a key
member of Apple’s Macintosh development team; and Doug Carlston,
founder and president of Broderbund Software Inc. Together they invited
some four hundred self-described hackers to pay ninety dollars each to join
them, the Whole Earth crew, and about twenty mainstream journalists for
a three-day weekend in November 1984 at Fort Cronkhite, a former army
base in the Marin Headlands just across the Golden Gate Bridge from San
Francisco.
At one level, the event was a master stroke of networking. Having been
alerted to the existence of a new and potentially inﬂuential community by
a member of their own Whole Earth network (Levy), Brand and Kelly
reached out to that community and entrepreneurially extended and diversiﬁed their own networks.

The advertisement closes with the text, “On January 24th, Apple Computer will introduce Macintosh. And you’ll see why 1984 won’t be like 1984.”19
Recounting the story of the ad, industry journalist Adelia Cellini noted, “Apple wanted the Mac to symbolize the idea of empowerment, with the ad showcasing the Mac as a tool for combating conformity and asserting originality. What better way to do that than have a striking blond athlete take a sledgehammer to the face of that ultimate symbol of conformity, Big Brother?”20 In many ways, this ad represents the apex of the personal computer revolution. The anti-institutional counterculture nerd ethos of the 1960s and 1970s had grown into the personal computer industry—an industry large enough by this time that it could now buy an advertisement during the Super Bowl. Yet the advertisement it used to make a splash still paradoxically evoked a deep strain of radical individualism and personal expression in the midst of conformity.

…

A liberationist ethic also became entrenched in the overt marketing of personal computing devices, most famously in a classic television commercial, Apple’s 1984 spot.
Following the rousing success of Apple’s first two home computer models, Steve Jobs wanted to do something big to roll out its third model, the Macintosh personal computer. He hired Ridley Scott, who two years earlier had directed the sci-fi classic Blade Runner, to make the commercial.18 The result was a powerful and intense ad that referenced the dystopian future of George Orwell’s classic novel 1984. In the ad, a young woman breaks into a large auditorium where a crowd of mindless automatons sit listening to a giant screen of a speaking man, presumably Big Brother. The woman, representing the Macintosh (she has a sketch of the Mac on her tank top), smashes the screen.

It shows Big Brother projected on a screen, addressing lines of workers. These skinhead drones wear identical uniforms. Into the grey nightmare bursts an attractive young woman. She wears orange shorts and a white tank top. She is carrying a hammer! Police in riot gear run after her. As Big Brother announces ‘We shall prevail’, the heroine hurls the hammer at him. The screen explodes in a blaze of light; the workers are open-mouthed. A voice announces smoothly: ‘On January 24th, Apple Computer will introduce Macintosh. And you’ll see why 1984 won’t be like 1984.’
The 60-second advert was screened to nearly 100 million Americans during the Super Bowl, and was subsequently hailed as one of the best ever. Isaacson writes: ‘Initially the technologists and hippies didn’t interface well. Many in the counterculture saw computers as ominous and Orwellian, the province of the Pentagon and the power culture.’

…

That was still to come.
10
DON’T BE EVIL
Silicon Valley, California
Summer 2013
‘Until they become conscious, they will never rebel.’
GEORGE ORWELL,
1984
It was an iconic commercial. To accompany the launch of the Macintosh in 1984, Steve Jobs created an advert that would captivate the world. It would take the theme of George Orwell’s celebrated dystopian novel and recast it – with Apple as Winston Smith. His plucky company would fight the tyranny of Big Brother.
As Walter Isaacson recounts in his biography of Jobs, the Apple founder was a child of the counterculture. He practised Zen Buddhism, smoked pot, walked around barefoot and pursued faddish vegetarian diets. He embodied the ‘fusion of flower power and processor power’. Even as Apple grew into a multi-billion dollar corporation, Jobs continued to identify with computing’s early subversives and long-haired pioneers – the hackers, pirates, geeks and freaks that made the future possible.

…

The commercial asserted the opposite – that computers were cool, revolutionary and empowering, instruments of self-expression. The Macintosh was a way of asserting freedom against an all-seeing state.
Almost 30 years later, following Jobs’s death in 2011, an NSA analyst came up with a smirking rejoinder. He prepared a top-secret presentation and, to illustrate the opening slide, he pulled up a couple of stills from Jobs’s commercial – one of Big Brother, the other of the blonde heroine with the hammer and the orange shorts.
Under the heading ‘iPhone Location Services’ he typed:
‘Who knew in 1984 …’
The next slide showed the late Jobs, holding up an iPhone.
‘… that this would be Big Brother …’
A third slide showed crowds of whooping customers celebrating after buying the iPhone 4; one fan had inked the name on his cheek.

Walter Thompson, where he stayed until Eastman Kodak, a client, recruited him to be its director of marketing. Then, in 1983, John Sculley, recently appointed the CEO of Apple, heard about Campbell from a relative and began courting him for the job of vice president of marketing. He clinched the sale by demonstrating for Campbell the revolutionary Macintosh computer, which Apple would introduce in 1984. “It would be pretty unusual today to hire a football coach to be your VP of sales,” Sculley later told a reporter. “But what I was looking for was someone who could help develop Apple into an organization.” Campbell took over sales as well as marketing just months after he joined Apple, and set about firing the consultants and most of a sales force that “wore polyester pants and gold chains.” He said he replaced them with recent college graduates, half of them women, and all hungry to succeed.

…

Steve Jobs and Apple are wave makers; companies like Dell—or Quincy Smith’s CBS and Irwin Gotlieb’s GroupM—attempt to ride the wave; newspapers crash into them. The Apple wave started with the Apple II, which launched the PC era in 1977; followed in 1984 by the Macintosh, with its innovative graphical user interface; followed by Pixar studios, which transformed movie animation; followed by the iPod and iTunes and the iPhone. It’s probably safe to say that Intel and HP created waves. Ditto Amazon. There are those who say Microsoft doesn’t qualify because it rode the waves others invented, but it is inarguable that it has thrived for three decades and changed computing. It is much too soon to know whether companies like Facebook, YouTube, Twitter, or Wikipedia will have a lasting impact.
It is not too early, however, to call Google a wave maker.

…

Campbell’s boldness appealed to the ever-rebellious Steve Jobs. The two men bonded. By 1984, said Campbell, “Sculley and Jobs were going at each other already.” Although Jobs had recruited Sculley to bring professional management to Apple, Jobs came to think Sculley was more interested in marketing, including marketing himself, than in Apple products; Sculley believed Jobs wanted an acolyte, not a CEO. Nevertheless, Campbell earned the rare distinction of being able to both befriend Jobs and command Sculley’s respect. Before Sculley succeeded in pushing Jobs out of Apple in 1985, Campbell warned him it would be a huge mistake. Tensions flared between Campbell and Sculley, and in 1987 Campbell was put in charge of Apple’s Claris software division, with the intention of spinning it off as a private company with Campbell at the helm.

I focused my studies at the University of Southern California on building companies and creating new technologies,
and ran Liberty Software out of my dorm room. The lessons I
learned as an entrepreneur were pivotal, as were those I learned
working for somebody else. In 1984, I had a summer job at
Apple writing some of the ﬁrst native assembly language for the
Macintosh. I had the opportunity to work on the most exciting
and important project at Apple, and it was like getting paid to go
to Disneyland. There were fruit smoothies in the refrigerators, a
motorcycle in the lobby, and shiatsu massages.
The very best part was being able to witness Steve Jobs
walking around, motivating the developers. Steve’s leadership
created the energy and spirit in the ofﬁce. Apple encouraged
the ‘‘think different’’ mind-set throughout its entire organization. We even had a pirate ﬂag on the roof. That summer, I
discovered that it was possible for an entrepreneur to encourage
revolutionary ideas and foster a distinctive culture.
That lesson became even more obvious when I returned
to Apple for a second summer internship as a technical sales
support person with an Apple partner.

…

Events proved to be an effective way to maximize the viral
effect.
Play #28: Build Street Teams and
Leverage Testimony
Although I had been inspired by the customer energy that Steve
Jobs had built around the Macintosh, the idea for cultivating
a group of salesforce.com enthusiasts did not come from the
technology community; it emanated from the hip-hop community. A friend introduced me to MC Hammer, who visited our
San Francisco ofﬁce (wearing a business suit, not the trademark
Hammer pants) and shared his ‘‘Street Team’’ concept: that of
building local networks of people to back you. At the time, I
didn’t know how salesforce.com Street Teams would work in
action, but I thought that MC Hammer was a creative genius
and that this unconventional idea was worth investigating.
Our City Tour program served as a vehicle to extend the
salesforce.com message, ignite passion behind the idea, and help
us build salesforce.com Street Teams to get customers out and
selling for us on a local level.

…

Oracle had about two hundred people when I started, and
the fast-growing company prized the efforts of young people
and rewarded them. Founder and CEO Larry Ellison regularly
walked the halls to chat with employees. (I usually took these
opportunities to share my enthusiasm for Macs.) Soon after I
sent Larry a note asking when Oracle would be on the Macintosh
and included a business plan about how to make us successful
in the Apple market, Larry made me the director of Oracle’s
Macintosh division.
Being responsible for the division that created software for
personal computers was an amazing opportunity. Then, after
Tom Siebel, the executive who ran direct marketing, resigned
and recommended me as his replacement, I inherited an even
more exciting and formative role.
It was Larry’s vision that inspired me. He wanted me to
create an ‘‘electronic village’’ and the next generation of sales
and marketing using state-of-the-art electronic conferencing
technology, software systems, and multimedia.

While Apple clearly had to be financially successful, its more fundamental purpose was to innovate, invent, and lead an entire cultural revolution that
everyone there saw leaping from those keyboards and screens with silicon brains. Apple's 1984 Super Bowl commercial, where the free-thinking individual charges through the faceless, gray crowd to shatter the tyranny of Big Brother, was gospel, not hype, throughout the Apple organization. All the people I met there, passionate young people, truly believed they were changing the world, not selling computers.
I took a mental health day and rode my bicycle mile after mile through the backcountry of Marin County. Seventy-five miles later, I still didn't know what to do. The next morning, I walked into work, puzzling through the options. I can still remember sitting in my office at the end of the hall and looking down the long corridor.

…

Then the competition would look sufficiently like Apple to erode Apple's margins and back it into a corner of its own making, with declining share and profit.
Along with many others inside Apple, I was a strong proponent of licensing the Macintosh operating system in order to preempt Microsoft in setting the standard for user-friendly computing. After all, it was Apple's birthright, its overriding
mission. It would mean cannibalizing our own model, sacrificing margins for volume and market share, but it seemed better than circling the wagons and defending an ever-declining piece of the PC business. Apple's general counsel, my boss, asked me to develop a licensing plan for the Mac operating system, with safeguards for protecting Apple's basic interests.
In a first step toward a new strategy, a colleague and I were assigned to negotiate a license of the Mac look and feel to Apollo Computer in Massachusetts, one of the leading manufacturers of workstations at the time.

…

Her ambivalence and Lenny's focus on the formula over the mission brought to mind my experience at Apple, specifically one of the most pivotal negotiations I was involved in there, which was reported only recently for the first time.
Apple's big idea had been Computing for the Rest of Us. But the company increasingly found itself hostage to the margins and quarterly results generated by its business model, which was built around premium hardware. Its share of the PC business was limited as it became addicted to selling computers at much higher margins and prices than its competition. Its intuitive, friendly interface was the justification for those margins, but that business model and Apple's position were threatened by none other than Microsoft. In 1986 we had all seen Windows 1.0, and while it posed no threat to the Macintosh operating environment at the time, we understood what it meant.

Jobs went back to Cupertino and called a board meeting, saying he had to build a new computer based on the PARC architecture and that it should not be backward-compatible with the existing Apple II. The board thought he was crazy, but Jobs applied his charisma—his “reality distortion field”—and got his way. Xerox got its Apple shares, and in December of 1980, Apple went public at $22 per share. Xerox’s holdings were instantly worth millions.
The first version of a computer using the PARC architecture, the Lisa, was a commercial failure, but when Jobs introduced the Macintosh in an iconic advertisement that aired during the 1984 Super Bowl, the long-awaited vision of the future arrived. The tragedy for Xerox was that two years later, the Xerox CFO sold all its Apple stock. Imagine what it would have meant to the company if it had held on to 5 percent of Apple, which would now be worth about $32 billion.

…

Less than two minutes into the San Francisco demo, Engelbart said, “If in your office you as an intellectual worker were supplied with a computer display backed up by a computer that was alive for you all day and was instantly responsive to every action you have, how much value could you derive from that?” Engelbart had built a working prototype of what we today would easily recognize as a contemporary Internet device—fifteen years before the introduction of the Apple Macintosh.
The next year Engelbart took a team from the Stanford Research Institute to the Lama Foundation commune, north of Taos, New Mexico. It was Stewart Brand who suggested that Lama might provide an atmosphere, as John Markoff wrote, “to create a meeting of the minds between the NLS researchers and the counterculture community animated by the Whole Earth Catalog.” The land outside Taos was full of alternative communities—Morningstar East, Reality Construction Company, the Hog Farm, New Buffalo, and the Family, to name a few.

…

In 1985, after the debut of the Macintosh, Microsoft quickly introduced Windows, an operating system that totally mimicked the Macintosh. Whatever advantage Apple had was quickly extinguished, and Steve Jobs was forced out of the company.
Jobs immediately set out for revenge on his old company by building a new computer called NeXT. Not long after that, a twenty-nine-year-old English engineer, Tim Berners-Lee, took up a position at the Conseil Européen pour la Recherche Nucléaire (CERN). The Internet at this point was purely an academic research network linking physicists around the world and allowing them to share research documents, and CERN was the largest European node of the network. Finding documents was getting increasingly dicey as the network got more popular, so Berners-Lee began to work on the concept of hypertext as a way for researchers to link directly to other documents in their references.

But creating a mythology around that product is, especially in the early stages, as important to selling it as anything else.
Traditional marketing campaigns are important too, of course, and Apple has run plenty of iPhone ads. There hasn’t been a truly classic iPhone spot or campaign, on the level of the famous Ridley Scott–directed “Big Brother” ad that introduced the Macintosh during the 1984 Super Bowl, the “Think Different” ads that reminded audiences that the Apple brand was associated with geniuses and world-changers in the late 1990s, the earbuds-and-silhouette campaign that created an efficient aesthetic shorthand for iPod cool in the early 2000s, or even the “I’m a Mac,” “I’m a PC” ads that played off Windows-based computers’ reputation for being buggy and lame.
The closest thing the iPhone has to a classic is probably the “There’s an App for that” campaign in 2008.

…

They helped prove that user interface design, long derided as dull—the province of grey user settings and drop-down menus; “knobs and dials” as Christie puts it—was ripe for innovation. As Bas and Imran’s stars rose inside Apple, they started casting around for new frontiers.
Fortunately, they were about to find one.
While training to be a civil engineer in Massachusetts, Brian Huppi idly picked up Steven Levy’s Insanely Great. The book documents how in the early 1980s Steve Jobs separated key Apple players from the rest of the company, flew a pirate flag above their department, and drove them to build the pioneering Macintosh. Huppi couldn’t put it down. “I was like, ‘Wow, what would it be like to work at a place like Apple?’” At that, he quit his program and went back to school for mechanical engineering. Then he heard Jobs was back at the helm at Apple—serendipity. Huppi landed a job as an input engineer there in 1998.
He was put to work on the iBook laptop, where he got to know the Industrial Design group, whose profile had already begun to rise under its head, Jonathan Ive.

…

It would be ideal for a trackpad as well as a touchscreen tablet; an idea long pursued but never perfected in the consumer market—and one certainly interesting to the vets of the Newton (which had a resistive touch screen) who still hoped to see mobile computing take off.
And it wouldn’t be the first time a merry band of Apple inventors plumbed another organization for UI inspiration. In fact, Silicon Valley’s premier Prometheus myth is rife with parallels: In 1979, a young squad of Apple engineers, led by Steve Jobs, visited the Xerox Palo Alto Research Center and laid eyes on its groundbreaking graphical user interface (GUI) boasting windows, icons, and menus. Jobs and his band of “pirates” borrowed some of those ideas for the embryonic Macintosh. When Bill Gates created Windows, Jobs screamed at him for stealing Apple’s work. Gates responded coolly: “Well, Steve, I think there’s more than one way of looking at it. I think it’s more like we both had this rich neighbor named Xerox and I broke into his house to steal the TV set and found out that you had already stolen it.”

Writing his .plan updates was becoming increasingly laborious because, as
Carmack knew, everyone seemed to be hanging so much on what he said.
“Some of you,” he finally typed, “are busy getting all bent out of shape about
this.”
Carmack was talking about the gaming community’s reaction to id’s announcement that the first test of their next game, Quake III Arena, would be
released for Macintosh, not Windows. In the gaming world, this was usually
as big as the controversies got. But while Carmack turned his attention to his
plan, describing the pros and cons of the new Macintosh systems, he couldn’t
avoid the other controversy. Finishing his update, he pushed himself away
from his desk and walked down the hall to get a Diet Coke and a snack.
“Hey,” he said, passing the police officers in his lobby, “do you guys want
anything to eat?”
The cops were the most obvious sign of Columbine’s impact on id.

…

A new one had begun.
THIRTEEN
Deathmatch
In a dark room pulsing with blood red shadows, Stevie “Killcreek” Case
sat at her computer, twitching her body as if she were repeatedly and
intentionally sticking her toe in a light socket. “Doh!” she yelped, leaping her soldier on the screen through a static-filled teleporter gate, only to see
him rematerialize in an unanticipated blizzard of nails. Or, as she described
the style of this particular death, “Telefragged!”
It was January 1997, minutes away from the online gaming underground’s
unofficial Super Bowl. Like the few dozen others convulsing throughout this
University of Kansas flophouse, Stevie–an ebullient twenty-year-old with a
short brown bob–had been practicing two sleepless nights for the match between her team, Impulse 9, and their rivals, who had driven eight hours from
Michigan, the Ruthless Bastards. Their contests were part of the burgeoning
international subculture of clans: organized groups of gamers who played–
and lived–Quake.

…

The one he liked best
soon became his girlfriend, a popular, intelligent, and outgoing daughter of a
respected officer. She had him buy button-down shirts, wear nice jeans and
contacts. After years of being beaten down by his father and his stepfather,
Romero was finally getting recognition.
At sixteen, Romero was just as eager to have success with his games.
After eight months of rejections, the good news came on March 5, 1984, from
an Apple magazine called InCider. An editor, weary from a recent trip to
Mardi Gras, wrote that the magazine had decided to publish the code for
Romero’s Scout Search, a low-resolution maze game in which the player–
represented by a single dot–had to gather all his scouts–more dots–before
being attacked by a grizzly bear–another dot. It didn’t look great, but it was
fun to play. Romero would be paid $100.

Calibrating the right level of openness is undoubtedly one of the most complex as well as one of the most critical decisions that a platform business must make.4 The decision affects usage, developer participation, monetization, and regulation. It’s a challenge that Steve Jobs struggled with throughout his career. In the 1980s, he got it wrong by choosing to keep the Apple Macintosh a closed system. Competitor Microsoft opened its less elegant operating system to outside developers and licensed it to a host of computer manufacturers. The resulting flood of innovation enabled Windows to claim a share of the personal computer market that dwarfed Apple’s. In the 2000s, Jobs got the balance right: he opened the iPhone’s operating system, made iTunes available on Windows, and captured the lion’s share of the smartphone market from rivals like Nokia and BlackBerry.5
Jobs liked to recast the open/closed dilemma as a choice between “fragmented” and “integrated,” terms that subtly skewed the debate in favor of a closed, controlled system.

…

In some cases, both the platform manager and the platform sponsor can be either a single company or a group of companies—with further implications for issues of control and openness.10
Figure 7.3 illustrates four models for managing and sponsoring platforms. In some cases, a single firm both manages and sponsors the platform. We call this the proprietary model. For example, the hardware, software, and underlying technical standards for the Macintosh operating system and mobile iOS are all controlled by Apple.
Sometimes a group of firms manages the platform while one firm sponsors it. This is the licensing model. Google, for example, sponsors the “stock” Android operating system, but it encourages a number of hardware firms to supply devices that connect consumers to the platform. These device makers, including Samsung, Sony, LG, Motorola, Huawei, and Amazon, are licensed by Google to manage the interface between producers and consumers.

What about a Google-ized version of Apple’s Safari browser? Jobs bonded especially with Brin; both lived in Palo Alto, and the pair would take long walks around the town and up in the hills … current and future kings of the Valley, inventing the future.
In August 2006, Jobs invited Eric Schmidt to sit on Apple’s board of directors, which included Google board member Arthur Levinson, CEO of Genentech; and Bill Campbell, Google’s corporate coach. Al Gore sat on Apple’s board, while he was the self-described “virtual advisory board” at Google. Intel CEO Paul Otellini, who was on Google’s board, had started supplying the chips for Macintosh computers. There was so much overlap that it was almost as if Apple and Google were a single company.
Smart phones seemed to be the logical nexus of the unofficial partnership.

…

Michael Brin also talked about his son in Tom Howell, “Raising an Internet Giant,” University of Maryland Diamondback; and Adam Tanner, “Google Co-founder Lives Modestly, Émigré Dad Says,” USA Today, April 6, 2004; and Mark Malseed, “The Story of Sergey Brin,” Moment, February 2007. Malseed expanded on his research in The Google Story.
15 “Suppose all the information” Tim Berners-Lee, Weaving the Web (New York: HarperBusiness, 2000), p. 4.
15 The web’s pedigree I give a detailed account of the work of Bush, Engelbart, and Atkinson in Insanely Great: The Story of Macintosh, the Computer That Changed Everything (New York: Penguin, 1994), and discuss Nelson’s work in Hackers: Heroes of the Computer Revolution (New York: Doubleday, 1984).
16 personalized movie ratings Sergey Brin, résumé at http://infolab.stanford.edu/~sergey/.
17 “Why don’t we use the links” Page and Brin spoke to me in 2002 about developing the early search engine, a subject we also discussed in conversations in 1999, 2001, and 2004.
17 “The early versions of hypertext” Battelle, The Search, p. 72.
20 “For thirty years” Carolyn Crouch et al., “In Memoriam: Gerald Salton, March 8, 1927–August 28, 1995,” Journal of the American Society for Information Science 47(2), 108; “Salton Dies; Was Leader in Information Retrieval Field,” Computing Research Association website.
20 the web was winning I looked at the state of web search in “Search for Tomorrow,” Newsweek, October 28, 1996.
21 “The idea behind PageRank” John Ince, “The Lost Google Tapes,” a series of interviews with Google.

…

The timeline continued to the work of Douglas Engelbart, whose team at the Stanford Research Institute devised a linked document system that lived behind a dazzling interface that introduced the metaphors of windows and files to the digital desktop. Then came a detour to the brilliant but erratic work of an autodidact named Ted Nelson, whose ambitious Xanadu Project (though never completed) was a vision of disparate information linked by “hypertext” connections. Nelson’s work inspired Bill Atkinson, a software engineer who had been part of the original Macintosh team; in 1987 he came up with a link-based system called HyperCard, which he sold to Apple for $100,000 on the condition that the company give it away to all its users. But to really fulfill Vannevar Bush’s vision, you needed a huge system where people could freely post and link their documents.
By the time Berners-Lee had his epiphany, that system was in place: the Internet. While the earliest websites were just ways to distribute academic papers more efficiently, soon people began writing sites with information of all sorts, and others created sites just for fun.

For example, in order to maintain its position as the Olympic network, the National Broadcasting Company (NBC)
invested $3.55 billion for television rights to the three Summer and two
Winter Olympiads between 2000 and 2008.35 Broadcast rights for the Super
Bowl also represent a signiﬁcant part of the shared $17.6 billion eight-year
contract signed by the NFL and ABC/ESPN, CBS, and Fox in 1998. Having
effectively purchased the American population’s sporting attention, it is subsequently leased for exorbitant sums to corporate advertisers. Jon Mandel of
Grey Advertising noted that “When you think that virtually half the country’s watching the Super Bowl . . . this makes a hell of a statement.”36
Mandel was referring to Super Bowl 22 in 1997, which tied for the third-most-watched television show in U.S. history. Hence, in 1999 the popular
appeal of the Super Bowl spectacle enabled Fox to charge $1.6 million for
each of the thirty-second advertisement spots (of which there were fifty-eight in total), a figure that is expected to rise to $1.9 million per spot for
2000.37 Similarly, NBC charges elevated advertising rates for its near three
weeks of prime-time Olympic coverage, making its television rights a highly
proﬁtable investment (see later discussion of the Olympic Games).

…

At its zenith, the “Wintel”
monopoly so dominated that the two companies claimed half the proﬁts of
the entire PC industry, reducing PC makers to what one journalist called “a
value-added reseller for Intel and Microsoft.”28
Wintel’s dominance cannot be attributed simply to consumer choice or
technological superiority. Both companies gained their footholds in the PC
market when IBM chose them to supply the microprocessor and operating
system for its PCs in the early 1980s. By undercutting prices for the rival
Macintosh computer, IBM (and the low-cost PC clone makers that followed
it) grabbed the bulk of the market for business PCs. Throughout the 1980s,
Apple offered a more user-friendly graphical interface than Microsoft, and
at several points Intel’s competitors offered faster chips.29 But the large installed base of Wintel users created growing network effects, whereby the
greater the number of users of a communication technology, the more valuable it becomes to each of them, because users can share information with
more people.

…

For TV and radio, sport gets consumers in front of
their sets to hear and see commercials; in effect, TV and radio rent their
viewers’ and listeners’ attention.”32
Despite recent declines in television ratings caused by an ever-fragmenting
media culture, sporting “mega-events”33—such as the Super Bowl, the Olympic Games, the NBA Finals, the MLB World Series, and the FIFA World
Cup Final—continue to represent some of a dwindling number of collective
media experiences that provide a thread of commonality (regardless of how
ephemeral) in the life of a nation. For instance, of the ten largest audiences
for shows on American television, nine have been sport-related: seven involved Super Bowl programming, and two were of coverage of the 1994
Winter Olympics (the Nancy Kerrigan and Tonya Harding skate-off ). The
remaining one, in ninth place overall, was the 1983 M*A*S*H special that
concluded the long-running comedy series.34
The clamor for audience ratings has led to network television moguls’
perpetual engagement in a circus of spiraling bidding wars for the exclusive
rights to these high-proﬁle events.

Even so, he knew how to charm and chat, and he could set his earnestness aside. “It’s so rare,” the Duchess of Orléans declared happily, “for intellectuals to be smartly dressed, and not to smell, and to understand jokes.”
Leibniz was greatly impressed by a demonstration of “a Machine for walking on water,” which was apparently akin to this arrangement of inflatable pants and ankle paddles.
Today we slap the word genius on every football coach who wins a Super Bowl, but both Newton and Leibniz commanded intellectual powers that dazzled even their enemies. If their talents were on a par, their styles were completely different. In his day-to-day life, as well as in his work, Leibniz was always riding off boldly in all directions at once. “To remain fixed in one place like a stake in the ground” was torture, he remarked, and he acknowledged that he “burned with the desire to win great fame in the sciences and to see the world.”

…

Some art historians believe that Vermeer’s Astronomer and his Geographer both depict Leeuwenhoek, but no one has been able to prove that Leeuwenhoek and Vermeer ever met.
26 The microscope that Leeuwenhoek used on that fateful night was put up for auction in April 2009. The winning bidder paid $480,000.
27 As one of Pythagoras’s followers told the tale, the story began when Pythagoras listened to the sound of hammering as he walked by a blacksmith’s shop. As the blacksmith struck the same piece of iron with different hammers, some sounds were harmonious, others not. The key, Pythagoras found, was whether the weights of the hammers happened to be in simple proportion. A twelve-pound hammer and a six-pound hammer, for instance, produced notes an octave apart.
28 Augustine did not explain why God did not make the world in 28 days (1 + 2 + 4 + 7 + 14) or 496 days or various other possibilities.
29 A prime number is one that can’t be broken down into smaller pieces.

…

JUST CRAZY ENOUGH
301 Molière long ago made fun: Thomas Kuhn famously cited Molière in The Structure of Scientific Revolutions, p. 104.
302 “We are all agreed that your theory is crazy”: Bohr made the remark to Wolfgang Pauli and added, “My own feeling is that it is not crazy enough.” Dael Wolfle, ed., Symposium on Basic Research (Washington, DC: American Association for the Advancement of Science, 1959), p. 66.
302fn In time, this bewilderment: J. J. MacIntosh, “Locke and Boyle on Miracles and God’s Existence,” p. 196.
303 “He claims that a body attracts”: Brown, “Leibniz-Caroline Correspondence,” p. 273.
303 “Mysterious though it was”: John Henry, “Pray do not Ascribe that Notion to me: God and Newton’s Gravity,” in Force and Popkin, eds., The Books of Nature and Scripture, p. 141.
303 “even if an angel”: Brown, “Leibniz-Caroline Correspondence,” p. 291.
304 If the sun suddenly exploded: Brian Greene, The Elegant Universe (New York: Norton, 1999), p. 56.
305 “so great an absurdity”: Westfall, Never at Rest, p. 505.
305 “To tell us that every Species”: From the end of Opticks, quoted in Kuhn, The Copernican Revolution, p. 259.
306 “as if it were a Crime”: Westfall, Never at Rest, p. 779.
306 “Ye cause of gravity”: Ibid., p. 505.
306 “I have not been able to discover”: Cohen’s translation of the Principia, p. 428.

I said, “We have large scale assessment programs that happen only once a year and I need a tremendous amount of computing power. You know what that computing power does the rest of the year?”
Yourdon: Just gathers dust.
Wakeman: Right, it does nothing but burn up electricity.
Yourdon: Yeah. And it’s amazing how many situations there are like that. I think of the Oscars or the Olympics.
Wakeman: Super Bowl.
Yourdon: Super Bowl. Yeah. On and on and on. Christmas shopping season for most of the retail industry.
Wakeman: Mm-hmm.
Yourdon: Yeah, it is, it is amazing to think about it. In terms of futures, I’ve got a related kind of social question. This whole issue of the “digital nation,” the Gen X or Y or Z or whatever generation it is that’s grown up with computers, how do you see them impacting what you do here at ETS?

…

The sixteen CIOs interviewed in this book represent hundreds of years of experience. Read what they have to say and benefit from their experience!
New York, NY
Ed Yourdon
June, 2011
Benjamin Fried
CIO, Google Inc.
Benjamin Fried is Chief Information Officer of Google Inc., overseeing the company’s global technology systems. His extensive hands-on experience in technology includes stints as a dBASE II programmer, front-line support manager, Macintosh developer, Windows 1.0 programmer, and UNIX systems programmer. Prior to joining Google, he spent more than 13 years in Morgan Stanley’s technology department, where he rose to the level of Managing Director. During his time there, he led teams responsible for software development technology, web and electronic commerce technologies and operations, and technologies for knowledge workers.
Ben received his degree in Computer Science from Columbia University.

That’s a thought most corporate execs are going to meet with about as much enthusiasm as Dwyane Wade would if he were suddenly faced with undeniable proof that basketball was dead and ice hockey was the only game left.* Yet let’s remember it wasn’t so long ago that the few people who owned home computers used them almost exclusively for word processing and video games. In 1984, you’d get stuffed in your locker for gloating over your new Apple Macintosh; in 2007 you could score a hot date by showing off your new iPhone. Culture changes, and business has to change with it or die.
* * *
Why I Speak in Absolutes
Because if I give you an inch, you’ll run a mile with it. When I said in 1998, “You’re dead if you don’t put your business on the Internet and get in on ecommerce,” was that true? No. But boy, can you imagine trying to be in business in 2010 with zero Web presence?

…

That was culturally unacceptable in my company.
Leadership and Culture
Bill Parcells is the best coach of all time. Screw Phil Jackson—I could have won a few championships with Jordan, Shaq, and Kobe on my teams. Parcells is the greatest coach in history because he went to a rotten New York Giants team and won two Super Bowls; went to the New York Jets, who had won four games in two years, and in two short years got them within one game of the Super Bowl; went to the Patriots, who were one and fifteen, and took them to the Super Bowl; went to Dallas and made them a consistent playoff contender; and then to Miami, where he coached the biggest turnaround in one season in NFL history. He wins through building team morale, hiring the right people, and instilling the right culture. He brings his DNA. In this new world where people can communicate more freely with not just customers, but with employees too, the Bill Parcells style of leadership will become more and more necessary.

…

You might take a walk, duck into a bookstore, or stop in at the retro vinyl shop. If you’re on a fabulous date, you don’t want the night to end, and you’re going to try to find any way you can to keep the conversation and connection going.
Combining traditional and social media can allow you to do the same thing when talking to people about your brand. Denny’s, for example, had a great TV date with its customers during the 2010 Super Bowl. It ran three commercials announcing that for a few hours on the following Tuesday, you could come in for a free Grand Slam breakfast. The ads were funny and creative—chickens freaking out over how many eggs they were going to have to lay for the event—but what a missed opportunity to leverage all the people watching the ads with their laptops open in front of them! All Denny’s had to do was say, “Go to Facebook.com/Denny’s right now, become a fan [an option that was supplanted by the “Like” button], and receive a coupon for an additional free large OJ.”

Is he telling the truth or pretending that he didn't do it, that he just knows of Rochester, the site revealed in Markoff's original article? Or is he revealing a more tantalizing possibility? That there may have been other people involved in the attack on Shimomura.
■ ■ ■
"Mr. Jon," Kevin Mitnick welcomes me hours later and we chat briefly about the Super Bowl. He
enjoyed the commercials, particularly the one with the computerized frogs croaking "Bud-weis-er" in
sequence.
I can hear the first rumblings of a Mitnick belly laugh. "I was thinking of getting in the P link [one of AT&T's satellite phone links] and sending, 'Hi, Shimomura, die with honor' [broadcasting it worldwide to hundreds of millions of Super Bowl viewers]."
Then, suddenly, Mitnick is pissed. "I read that shit [Markoff's Times profile of Shimomura]. He said now he considers it a matter of honor.... Remember I told you that Markoff has an [e-mail] account on Shimomura's system?

…

That's not what Kevin Mitnick and all the other hackers I talk to say. It's not what countless articles in newspapers and magazines say. It's not even what John Markoff used to say. Every cyberspace journalist worth his memory chips knows security on the Internet is an illusion, and always has been.
The Internet is about as safe as a convenience store in East L.A. on Saturday night.
January 29, 1995
It's Super Bowl Sunday, a couple of hours before kickoff, and though I'm not a big football fan, I plan on watching the San Francisco 49ers demolish the San Diego Chargers.
I pick up the phone, thinking it's my friend, the one who's supposed to bring the guacamole, but instead it's Kevin Mitnick. It's been six days since his last call.
"I'm walking along the beach here relaxing," Mitnick bubbles, sounding euphoric.

…

Eric may be a fugitive on the run, but when it comes to his story, he's in total control. He pauses a moment and then coolly orders, "Stand by. There's some movement here. . . ."
Have they already trapped and traced the call? Is the FBI moving in for the bust? Should I hang up?
"It was nothing," Eric deadpans a few seconds later.
I've climbed the steep steps to my cluttered attic office, switched on the lamp, and booted up my Macintosh. I'm in my pajamas.
"So how badly do the feds want you?"
"I think they don't care. Schindler probably does, but I think he realizes he's got a can of worms on his
hands if he finds me. I'm one of the few defendants that has ever had extensive personal phone calls
with Schindler. We've been very much on a first-name
basis for some time. It makes him very nervous now that I'm out here."
■ ■ ■
Eric makes his smooth, Hollywood sales pitch.

And while neither Ford nor Edison died as publicly or as young as Jobs, it’s unlikely the passing of either one of them would ever have been met with the outpourings of grief or mounds of flowers with which Jobs’s death was met, with Apple stores becoming impromptu gathering places for the stricken masses.
The reason—the thing that distinguished Jobs from the other greats—was that he understood something, and that something was the people around him. His very first Apple products may have followed the beige box model that was the computer standard at the time, but there was something primally appealing about them, too. There was the little rainbow apple that was the company’s logo—a tiny icon that didn’t invite you to work with the machine as much as play with it. The beige box soon gave way to a white box—a small, streamlined thing that you wanted to look at, to display, not just use. When the first little Macintosh came along—just a year before Jobs left the company—the central image in all of the ads was the computer itself with the loopy, cursive word “Hello” written on the screen.

…

You can be a National Football League flameout like Ryan Leaf, the number two player picked in the 1998 draft, who was out of the league entirely by 2002 after four years of indifference, poor play and multiple ugly public incidents including an infamous moment, caught on videotape, in which he stood over a frightened-looking sportswriter who had apparently asked the wrong question, screaming, “Just fucking don’t talk to me, all right? Knock it off!” Or you can be Super Bowl winner Peyton Manning, picked number one in the same year Leaf was drafted, who has spent the better part of two decades winning games and smashing records and whose only scandalous moment in his long career was . . . well, never mind. There never was one.
So what makes the difference between a Palin and a Rubio, a Leaf and a Manning, a Sheen and, say, a Michael J. Fox—whose nice-guy image was established back in the days of his star turns on TV’s Family Ties and in the Back to the Future movies, and whose grace in battling Parkinson’s disease has simply confirmed the high opinion most of the world had of him?

…

He stood before the newsreel cameras and the cheering reporters and did what most people do in that situation, which was to begin handing out thanks—to the university, the foundation that funded his work, the drug companies that manufactured the test vaccine, the dozens of children who volunteered to take the earliest formulation of it, the hundreds of thousands who later stepped forward as guinea pigs for the final version, nearly anyone who had even brushed up against the enterprise in the long years that had preceded that day. And yet somehow—oversight, nerves, an overweening ego at last showing itself (his critics preferred that explanation)—he never mentioned a single member of his lab team, the people who had done more than anyone else to make the vaccine a reality. It would be like a Super Bowl–winning coach acknowledging everyone but the players, a victorious general thanking everyone but the foot soldiers. For the stunned lab workers sitting in the audience, the day turned instantly to ash.
“Young man,” legendary CBS newsman Edward R. Murrow said to Salk when he buttonholed him in the back of the hall after the presentation, “a great tragedy has befallen you. You’ve lost your anonymity.”

That was going to be my next stop in life. I had done Wall Street, and I was going to do the White House next.”
1984
On January 24, Apple Computer will introduce Macintosh. And you’ll see why 1984 won’t be like 1984.… BANK SECURITIES UNITS MAY UNDERWRITE BONDS … It’s morning again in America, and under the leadership of President Reagan, our country is prouder and stronger and better. Why would we ever want to return to where we were less than four short years ago?… I had a job, I had a girl / I had something going mister in this world / I got laid off down at the lumberyard / Our love went bad, times got hard … TAMPA SEES GAINS FOR ITS HARD WORK “But those kinds of things can’t do for us, long-term, what a Super Bowl can do. This is a real opportunity for us to show people what a great place this is, that they can come here and not expect to get taken advantage of.” … MISS AMERICA IS ORDERED TO QUIT FOR POSING NUDE … You’re judged by performance.

…

The words appeared on billboards, bumper stickers, and T-shirts, and who could doubt that they would prove true when Tampa had a new international airport, it had the 1984 Super Bowl, it had the NFL Buccaneers, it had the eleven million square feet of the Westshore business and shopping district, it had sunshine and beaches, and it was growing as fast as anywhere in the country? Fifty million new people came to Florida every year, and since the sunshine and beaches weren’t going anywhere, Tampa would continue to grow, and by growing, become great.
It grew and grew. It grew in order to grow. It grew throughout the eighties, in good economic times and bad, when pro-growth conservatives ran the Hillsborough County Commission and when pro-planning progressives ran the county commission. It grew throughout the nineties, when Tampa Bay got the NHL Lightning and the major league Devil Rays, plus another Super Bowl. After the millennium it grew like gangbusters.

…

Between the midseventies and the early nineties, the personal computer had spawned countless hardware and software companies in Silicon Valley, and in other high-tech centers around the country; during the seventies and eighties the population of San Jose doubled, approaching a million, and by 1994 there were 315 public companies in the Valley. But none of the newer ones had been as important as Hewlett-Packard, Intel, or Apple. In the years since the Macintosh, the computer industry had seen more consolidation than innovation, and the undisputed winner was in Seattle.
The most important Silicon Valley company to come along since Apple was originally called Mosaic, started in 1994 by Jim Clark, a former Stanford professor and founder of Silicon Graphics, and Marc Andreessen, a University of Illinois graduate who, at twenty-two, had just the year before developed the first graphical browser for the World Wide Web. In 1995, the year that the last restrictions on commercial use of the Internet were lifted, their company went public as Netscape, headquartered south of Stanford in Mountain View.

After online criticism, some Humans at Uber decided to offer free rides and to refund people who had paid (Sullivan, 2014).
‡ Notably, an even larger organization—the NFL—recognizes and subscribes to this same piece of advice. In an interview with economist Alan B. Krueger, the NFL’s VP for public relations, Greg Aiello, explained that his organization takes a “long-term strategic view” toward ticket pricing, at least for the Super Bowl. Even though the high demand for Super Bowl tickets might justify significantly higher prices (and short-term profits—he calculates the profit increase as on the same scale as all advertising revenues), the organization intentionally keeps these prices reasonable in order to foster its “ongoing relationship with fans and business associates” (Krueger, 2001).
15
Fairness Games
One question was very much on the minds of Danny, Jack, and me while we were doing our fairness project.

…

Likewise in the draft, when a team falls in love with a certain player they are just sure that every other team shares their view. They try to jump to the head of the line before another team steals their guy.
5. Present bias. Team owners, coaches, and general managers all want to win now. For the players selected at the top of the draft, there is always the possibility, often illusory, as in the case of Ricky Williams, that the player will immediately turn a losing team into a winner or a winning team into a Super Bowl champion. Teams want to win now!
So our basic hypothesis was that early picks were overvalued, meaning that the market for draft picks did not satisfy the efficient market hypothesis. Fortunately, we were able to get all the data we needed to rigorously test this hypothesis.
The first step in our analysis was just to estimate the market value of picks. Since picks are often traded, we could use the historical trade data to estimate the relative value of picks.
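That estimation step is described above only in words. As a minimal, purely illustrative sketch of the idea, one could fit a simple two-parameter value curve so that the two sides of each historical trade roughly balance; the trade data and the functional form below are hypothetical assumptions for illustration, not the authors' actual model or data.

```python
# A toy sketch (not the authors' actual specification): infer a relative
# value curve for draft picks from historical pick-for-pick trades.
# We assume value(pick) = exp(-lam * (pick - 1) ** beta), so pick 1 is
# worth 1.0, and choose (lam, beta) so both sides of each trade balance.
import itertools
import math

# Hypothetical trade data: (picks surrendered, picks received).
trades = [
    ([1], [3, 35]),
    ([10], [20, 45, 80]),
    ([5, 150], [4]),
    ([32], [50, 60, 100]),
]

def value(pick, lam, beta):
    """Relative value of a draft slot; the first overall pick is worth 1.0."""
    return math.exp(-lam * (pick - 1) ** beta)

def imbalance(lam, beta):
    """Sum of squared value differences across both sides of every trade."""
    total = 0.0
    for gave, got in trades:
        diff = sum(value(p, lam, beta) for p in gave) - \
               sum(value(p, lam, beta) for p in got)
        total += diff ** 2
    return total

# Coarse grid search; real work would use a proper optimizer and many trades.
grid_lam = [i / 100 for i in range(1, 60)]
grid_beta = [i / 100 for i in range(40, 120)]
lam, beta = min(itertools.product(grid_lam, grid_beta),
                key=lambda p: imbalance(*p))

print(f"fitted lam={lam:.2f}, beta={beta:.2f}")
for pick in (1, 10, 32, 100, 224):
    print(f"pick {pick:>3}: relative value {value(pick, lam, beta):.3f}")
```

With enough real trades, the fitted curve shows how steeply the market discounts later picks, which is what the rest of the analysis compares against players' actual performance.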

…

The following year, he did not return to the top form he had shown as a rookie, and the Redskins had a terrible season, so bad that the 2014 first-round pick the Redskins had given the Rams turned out to be the second pick in that draft, so giving up that pick turned out to be very expensive. (Recall that it was a number two pick that the Redskins had originally traded up to get.) The 2014 season was also a disappointing one for RG3. In hindsight, another player named Russell Wilson, who was not picked until the third round, appears to be better and less injury-prone than RG3. During his three years in the NFL, Wilson has led his team to the Super Bowl twice, winning once.
Of course, one should not judge a trade using hindsight, and the Redskins were certainly unlucky that Griffin suffered from injuries. But that is part of the point. When you give up a bunch of top picks to select one player, you are putting all your eggs in his basket, and football players, like eggs, can be fragile.§
Our relationship with the Redskins did not last very long, but we soon found that another team (whose identity shall remain confidential) was interested in talking to us about draft strategy.

But given his disagreeable tendencies, Jobs was exactly the kind of person who could be confronted. Dubinsky knew that Jobs respected those who stood up to him and was open to new ways of doing things. And she wasn’t speaking up for herself; she was advocating for Apple.
By virtue of her willingness to challenge an idea she viewed as wrong, Dubinsky landed a promotion. She was not alone. Starting in 1981, the Macintosh team had begun granting an annual award to one person who challenged Jobs—and Jobs promoted every one of them to run a major division of Apple.
Comparing Carmen Medina’s and Donna Dubinsky’s experiences raises fundamental questions about the best way to handle dissatisfaction. In the quest for originality, neglect isn’t an option. Persistence is a temporary route to earning the right to speak up.

…

Although he was often exasperated by his procrastination, da Vinci realized that originality could not be rushed. He noted that people of “genius sometimes accomplish most when they work the least, for they are thinking out inventions and forming in their minds the perfect idea.”*
The Discipline to Delay
Procrastination turns out to be a common habit of creative thinkers and great problem solvers. Consider winners of the Science Talent Search, which is known as the “Super Bowl of Science” for high school seniors in the United States. A team led by psychologist Rena Subotnik interviewed these elite performers more than a decade after their victories, when they were in their early thirties, asking whether they procrastinated on routine and creative tasks, as well as in social life and health behavior. More than 68 percent admitted procrastinating in at least two of the four domains.

…

Thanks to globalization, social media, and rapid transportation and communication technologies, we have more mobility than ever before. Given these advantages, if you’re unhappy in your job and it’s easy to move, why pay the price of speaking up?
In Hirschman’s view, exit is bad for originality. But Donna Dubinsky’s experience casts exit in a different light. After winning the distribution battle at Apple, she landed a senior position in international sales and marketing at Claris, one of Apple’s software subsidiaries. Within a few years, her group accounted for half of all of Claris’s sales. When Apple refused to spin Claris off as an independent company in 1991, Dubinsky was so frustrated with the lack of opportunity for impact that she quit. She jetted to Paris for a yearlong sabbatical and took up painting, contemplating ways to contribute to a bigger mission. When she met an entrepreneur named Jeff Hawkins, she decided that his startup, Palm Computing, was the next big wave of technology, and accepted a position as CEO.

It’s scary how few people actually get that.” As Black Swan author Nassim Taleb put it in his suitably titled book, Fooled by Randomness, “Nowhere is the problem of induction more relevant than in the world of trading—and nowhere has it been as ignored!” Thus the occasional overzealous yet earnest public claim of economic prediction based on factors like women’s hemlines, men’s necktie width, Super Bowl results, and Christmas day snowfall in Boston.
The culprit that kills learning is overlearning (aka overfitting). Overlearning is the pitfall of mistaking noise for information, assuming too much about what has been shown within data. You’ve overlearned if you’ve read too much into the numbers, led astray from discovering the underlying truth.
Decision trees can overlearn like nobody’s business.
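To make that concrete, here is a hedged sketch (not from the book) using scikit-learn on synthetic, deliberately noisy data: an unconstrained tree memorizes the training set, noise and all, and then scores worse on held-out data than a depth-limited tree. All names and parameters below are illustrative.

```python
# Illustrative only: an unconstrained decision tree "overlearns" noise,
# while a depth-limited tree generalizes better on held-out data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Synthetic data: two informative features, three irrelevant ones,
# and 20% of labels flipped to simulate noise.
X = rng.normal(size=(2000, 5))
y = ((X[:, 0] + X[:, 1] > 0) ^ (rng.random(2000) < 0.2)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.5, random_state=0)

for depth in (None, 3):
    tree = DecisionTreeClassifier(max_depth=depth, random_state=0)
    tree.fit(X_train, y_train)
    print(f"max_depth={depth}: "
          f"train acc={tree.score(X_train, y_train):.2f}, "
          f"test acc={tree.score(X_test, y_test):.2f}")

# Typical result: the unlimited tree hits ~1.00 on the training data but
# trails the shallow tree on the test set -- it has mistaken noise for
# information, which is exactly what overlearning means.
```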

…

—HAL, the intelligent computer from 2001: A Space Odyssey (1968)
Science fiction almost always endows AI with the capacity to understand human tongues. Hollywood glamorizes a future in which we chat freely with the computer like a well-informed friend. In Star Trek IV: The Voyage Home (1986), our heroes travel back in time to a contemporary Earth and are confounded by its primitive technology. Our brilliant space engineer Scotty, attempting to make use of a Macintosh computer, is so accustomed to computers understanding the spoken word that he assumes its mouse must be a microphone. Patiently picking up the mouse as if it were a quaint artifact, he jovially beckons, “Hello, computer!”
2001: A Space Odyssey’s smart and talkative computer, HAL, bears a legendary, disputed connection in nomenclature to IBM (advance each of HAL’s letters one position in the alphabet and you get IBM); however, author Arthur C.

The speaker was a senior leader of the US Department of Defense. The topic was why he thought cybersecurity and cyberwar were so important. And yet, when he could only describe the problem as “all this cyber stuff,” he unintentionally convinced us to write this book.
Both of us are in our thirties and yet still remember the first computers we used. For a five-year-old Allan, it was an early Apple Macintosh in his home in Pittsburgh. Its disk space was so limited that it could not even fit this book into its memory. For a seven-year-old Peter, it was a Commodore on display at a science museum in North Carolina. He took a class on how to “program,” learning an entire new language for the sole purpose of making one of the most important inventions in the history of mankind print out a smiley face.

…

An attacker could do this either by depriving users of a system that they depend on (such as how the loss of GPS would hamper military units in a conflict) or by merely threatening the loss of a system, known as a “ransomware” attack. Examples of such ransoms range from small-scale hacks on individual bank accounts all the way to global blackmail attempts against gambling websites before major sporting events like the World Cup and Super Bowl.
Beyond this classic CIA triangle of security, we believe it is important to add another property: resilience. Resilience is what allows a system to endure security threats instead of critically failing. A key to resilience is accepting the inevitability of threats and even limited failures in your defenses. It is about remaining operational with the understanding that attacks and incidents happen on a continuous basis.

…

This is the category that uses the type of ransomware attacks we read about earlier. The victim has to weigh the potential cost of fighting a well-organized attack versus paying off the potential attacker. Websites with time-dependent business models, such as seasonal sales, are particularly vulnerable. One study reported that, “In 2008, online casinos were threatened with just such an [extortion] attack, timed to disrupt their accepting wagers for the Super Bowl unless the attackers were paid 40,000 dollars.”
Of course, gambling itself is illegal in many jurisdictions, making it just one of many illicit activities that have extended into cyberspace. What makes these activities relevant to cybersecurity is that their virtualization challenges territorial definitions. Some activities, such as the distribution of pedophilic images, are widely condemned around the world whether in a physical magazine or a website.

The morning after they arrived at the Valemont lodge, Wences, Briger, and the rest of the men climbed into a red-and-white Bell 212 helicopter sitting just outside the lodge and lifted off toward the high white peaks, for a day of heli-skiing. In the afternoon, the group returned to the lodge and sat around in the expansive common room, an enormous fire crackling away. This was not a crowd to chat about kids and the upcoming Super Bowl. The men had dedicated their lives to making money and Pete pressed them to present their best investment ideas.
“Pete, I told you, I’m interested in Bitcoin,” Wences said when his turn came to talk. “It hasn’t changed.”
Wences drew the group in with an explanation of the basic notion of a new kind of network that could allow people to move money anywhere in the world, instantaneously—something that these financiers, who were frequently moving millions between banks in different countries, could surely appreciate.

…

CHAPTER 4
April 2010
Laszlo Hanyecz, a Hungarian-born twenty-eight-year-old software architect who lived in Florida, heard about Bitcoin from a programming friend he’d met on Internet relay chat, known as IRC. Assuming it was some scam, Laszlo poked around to figure out who was secretly making money. He soon realized there was an interesting and high-minded experiment going on and decided to explore further.
He began by buying some coins from NewLibertyStandard and then building software so that the Bitcoin code could run on a Macintosh. But like many good coders, Laszlo approached a new project with a hacker’s mind-set, probing where he might break it, in order to test its robustness. The obvious vulnerability here was the system for creating, or mining, Bitcoins. If a user threw a lot of computing power onto the network, he or she could win a disproportionate amount of the new Bitcoins. Although Satoshi Nakamoto had designed the mining process so that the hash function contest would become harder if computers were winning the mining race more frequently than every ten minutes, those users with the most powerful computers still had a much better chance of winning a majority of the coins.*
Until now, no one had an incentive to throw lots of computing power into mining, given that Bitcoins were worth essentially nothing.
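For readers who want to see the shape of the contest Laszlo was probing, here is a heavily simplified proof-of-work sketch, a toy under stated assumptions rather than Bitcoin's actual code: a miner grinds through nonces until a double SHA-256 hash falls below a target, and the protocol periodically shrinks the target (raising difficulty) when blocks arrive faster than the ten-minute goal. Bitcoin performs this retargeting every 2,016 blocks; the header bytes and numbers below are made up for illustration.

```python
# Simplified proof-of-work sketch -- not the real Bitcoin implementation.
# A miner grinds nonces until sha256(sha256(header)) falls below a target;
# a smaller target means more expected hashes, i.e. higher difficulty.
import hashlib
import time

def mine(header: bytes, target: int) -> tuple[int, int]:
    """Return (winning nonce, number of hashes tried)."""
    nonce = 0
    while True:
        digest = hashlib.sha256(
            hashlib.sha256(header + nonce.to_bytes(8, "little")).digest()
        ).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce, nonce + 1
        nonce += 1

# Easy demonstration target: roughly the top 18 bits must be zero.
target = 1 << (256 - 18)
start = time.time()
nonce, tries = mine(b"toy block header", target)
print(f"found nonce {nonce} after {tries} hashes in {time.time() - start:.2f}s")

# Difficulty adjustment (Bitcoin does this every 2,016 blocks): if blocks
# arrived faster than the 10-minute goal, shrink the target proportionally,
# which raises the expected work for the next period.
expected_seconds = 2016 * 600
actual_seconds = 2016 * 450          # hypothetical: blocks came in too fast
new_target = target * actual_seconds // expected_seconds
print(f"target shrinks by a factor of {target / new_target:.2f}")
```

The point of the sketch is the asymmetry Laszlo saw: whoever can try the most nonces per second wins a proportionally larger share of new coins, which is exactly the incentive that had not yet mattered while Bitcoins were worth essentially nothing.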

Lewis avoided explaining to his young daughter why her friend’s father, a commercial real estate developer, had a license plate on his car that consisted of six letters: FU NCNB. When Lewis tried to join the Young Presidents Organization in Dallas, he was sponsored by Roger Staubach, the beloved ex-quarterback of the Dallas Cowboys. It didn’t matter. The city’s love for Staubach, a star player who had led its team to Super Bowl victories, was outweighed by its leaders’ visceral hatred of Lewis and his Charlotte colleagues. After two years of such vehement rejection—at a time when the bank’s retail business in the Lone Star State was booming—Lewis came close to snapping. He and his wife, Donna, were having dinner one evening at a fashionable restaurant when Lewis overheard a remark from a neighboring table about NCNB.

…

ONE TEAM, SHARED VALUES, SHARED FUTURE
1 But on January 15, word started leaking out: “Bank of America to Get Billions in U.S. Aid; Sides Finalizing Terms for Fresh Bailout Cash,” by Dan Fitzpatrick, Damian Paletta, and Susanne Craig, The Wall Street Journal, Jan. 15, 2009.
2 On this morning, he got no further than the Financial Times: “Merrill Delivered Bonuses Before BofA Deal,” by the author and Julie MacIntosh, Financial Times, Jan. 22, 2009.
CHAPTER 21. THE BOSTON MAFIA
1 not only had Lewis been consulting with the bank’s largest investors: “BofA Faces Pressure to Split Top Roles,” by Dan Fitzpatrick and Joann S. Lublin, The Wall Street Journal, April 17, 2009.
2 When Lewis returned from vacation after Labor Day: “In U.S. Regulators, Lewis Met His Match,” by Carrick Mollenkamp and Dan Fitzpatrick, The Wall Street Journal, Nov. 10, 2009.
3 But information started leaking to the media: “Bank of America Can’t Sign New CEO,” by Dan Fitzpatrick and Joanne S.

…

In an emergency situation such as the one involving Lehman, Paulson and Geithner had assumed that the British would be supportive and allow the Barclays acquisition of Lehman to proceed, since it would benefit capital markets around the world. But the British weren’t playing along.
Around 10:30, Paulson and Geithner broke the news to the Wall Street executives on the first floor. The group had been in the process of hammering out a $30 billion pool to support the Barclays deal, but the update from the regulators changed the dynamic in the room. Paulson returned upstairs, while Geithner urged the bankers to keep at it, in case the Barclays deal did come off. But he made it clear that Lehman was not going to be bailed out with federal money.
Kraus and Kelly kept running into executives from Goldman Sachs who wanted to know when the Merrill due diligence team was going to be ready to work.

"This is a Sony MFD-2DD microfloppy, double-sided, double-density, 135TPI, probably formatted for 800K. What's supposed to be on it?"
"We're not sure, but probably an encipherment algorithm."
"Ah! Russian communications systems? The Sovs getting sophisticated on us?"
"You don't need to know that," O'Day pointed out.
"You guys are no fun at all," the man said as he slid the disk into the drive. The computer to which it was attached was a new AppleMacintosh IIx, each of whose expander slots was occupied by a special circuit board, two of which the technician had personally designed. O'Day had heard that he'd work on an IBM only if someone put a gun to his head.
The programs he used for this task had been designed by other hackers to recover data from damaged disks. The first one was called Rescuedata. The operation was a delicate one. First the read heads mapped each magnetic zone on the disk, copying the data over to the eight-megabyte memory of the IIx and making a permanent copy on the hard drive, plus a floppy-disk copy.

…

"Going on my experience, not his, I'd say it's real gray, Dan. Davidoff's good - I mean, he's really good in front of a jury - but so's the defense guy, Stuart. The local DEA hates his guts, but he's an effective son of a bitch. The law is pretty muddled. What'll the judge say? Depends on the judge. What'll the jury say - depends on what the judge says and does. It's like putting a bet down on the next Super Bowl right now, before the season starts, and that doesn't even take into account what'll happen in the U.S. Court of Appeals after the trial's over in District Court. Whatever happens, the Coasties are going to get raped. Too bad. No matter what, Davidoff is going to tear each of 'em a new asshole for getting him into this mess."
"Warn 'em," Murray said. He told himself that it was an impulsive statement, but it wasn't.

…

Members of the senior executive service did not take vows of poverty and chastity, however - and obedience was also a sometime thing.
"I promised the American people that we'd do something about this problem," the President observed crossly. "And we haven't accomplished shit."
"Sir, you cannot deal with threats to national security through police agencies. Either our national security is threatened or it is not." Cutter had been hammering that point for years. Now, finally, he had a receptive audience.
Another grunt: "Yeah, well, I said that, too, didn't I?"
"Yes, Mr. President. It's time they learned a lesson about how the big boys play." That had been Cutter's position from the beginning, when he'd been Jeff Pelt's deputy, and with Pelt now gone it was his view that had finally prevailed.
"Okay, James. It's your ball. Run with it.