2009: Apple’s Macintosh celebrates 25th birthday

Steve Jobs, photographed in 1984 at the annual Apple Computer show, then hosted at the Flint Center in Cupertino. At Jobs' right is then-Apple CEO John Sculley, who is leaning on the Apple "Lisa" personal computer, which preceded the original Macintosh, on which Jobs is leaning. The 1984 annual meeting was the predecessor of what would become the MacWorld show the next year.

The Apple Macintosh, born at the hands of renegade engineers in the early 1980s, changed the relationship between human and keyboard.

The Mac, which turns 25 this week, has consistently been an industry pioneer of new technology, including the graphical user interface, speech, Wi-Fi and video.

“Apple redefined the computer beyond crunching ones and zeros. It made a technology lifestyle a reality,” said Jupiter Research analyst Michael Gartenberg.

More than two decades after the original Mac engineers toiled away in buildings flying pirate flags under the direction of Apple co-founder Steve Jobs, the Macintosh now sits at the center of Apple’s digital universe.

The company’s ability to seamlessly match hardware with software, such as its popular iLife photo and video programs, is unparalleled in the industry. As a result, Macs are gaining market share again. Last quarter, Apple garnered about 8 percent of the U.S. market, according to Gartner. Part of the Mac’s growing popularity is tied to the re-engineering of its operating system to accommodate the trendsetting iPhone and Apple TV.

From the beginning, the Mac had a larger-than-life personality.

The company announced the Macintosh’s arrival in Ridley Scott’s watershed commercial that featured a lone runner defeating Big Brother — also known as IBM. It aired Jan. 22, 1984, during the third quarter of Super Bowl XVIII.

Two days later, Jobs, wearing a suit and bow tie, strode onto the stage at the Flint Center at De Anza College in Cupertino, recited a few verses from Bob Dylan’s “The Times They Are A-Changin’,” and then pulled the first Macintosh out of its canvas carrying case.

Jobs inserted a floppy disk, and the small computer began to speak: “Hello, I am Macintosh. It sure is great to get out of that bag! Unaccustomed as I am to public speaking, I’d like to share with you a maxim I thought of the first time I met an IBM mainframe: Never trust a computer that you can’t lift!”

“There wasn’t a person in the room who didn’t think this was history happening,” recalled Richard Doherty, analyst with the Envisioneering Group, who was there.

The Mac’s graphical user interface, or GUI, set the computer apart from everything else. In the late 1970s, Jobs discovered the GUI, which had been developed at Xerox’s Palo Alto Research Center. The breakthrough made the computer much easier to master for the average person, who could simply click on a file to view it rather than memorizing text-based commands. It was first used in the company’s 1983 Lisa personal computer, whose $9,995 price tag doomed it commercially.

The 1984 Macintosh, priced at $2,495, was the first computer to popularize the revolutionary new interface. It came with a mouse — one of the first computers to have one — a built-in monitor and a more durable hard-case floppy disk. Two applications, MacPaint, which allowed users to draw, and MacWrite, a basic word processing program, were bundled with the computer. Its compact design was a stark contrast to the big and boxy competing systems sold by IBM and others.

“The Macintosh was like this string hanging down from the future,” recalled longtime Silicon Valley technology forecaster Paul Saffo.

In a click of a mouse, the Mac opened up the world of personal computing to hundreds of millions of people.

“We had a feeling this new style of computer would be the way of the world,” said Apple co-founder Steve Wozniak, who at the time oversaw the Apple II, the company’s first mass-produced computer.

Jobs brought Adobe’s PostScript, which enabled printers to reproduce a computer screen’s text and graphics, into the Macintosh environment. A year later, Apple introduced the LaserWriter, one of the first laser printers. PageMaker software, created by Seattle-based Aldus, which eventually merged with Adobe, was released in July 1985.

“It was the birth of desktop publishing,” said Tim Bajarin, president of Creative Strategies.

The Mac’s creation occurred amid company turf wars and chaotic working conditions. In May 1985, Jobs was stripped of his day-to-day duties at Apple, and he resigned his position as chairman several months later. Soon after that, he founded NeXT, a new computer company in Redwood City.

For a time during Jobs’ 12-year absence, the Macintosh lost its innovative edge.

During the mid-1990s, the un-Microsoft computer teetered on the edge of irrelevance, barely capturing 3 percent of the market. For a short time, Apple even licensed its operating system to makers of Mac clones, something Jobs had rejected years earlier.

In 1996, Apple acquired NeXT, whose software would become the foundation of a new Mac operating system. Jobs returned to Apple the following year. Shortly after the Second Coming of Jobs, the Mac revived the company from its near-death experience with the 1998 launch of the sleek all-in-one iMac desktop. Four years later, Jobs laid out a 10-year vision in which the Mac would be the center of a new digital lifestyle — music, video, networking.

Jobs always envisioned the computer as something more than a slab of technology.

During the spring of 1982, he took his entire team to see an exhibition of Louis Comfort Tiffany’s work because, Jobs said at the time, Tiffany was an artist who had learned how to mass-produce his art, recalled Andy Hertzfeld, one of the main authors of the original Macintosh software, whose reminiscences are at www.folklore.org.

When the first Macs rolled off the production line, each team member’s signature was engraved on the inside of the machine’s plastic case.

“I never thought the Mac would last even half as long as it has,” said Hertzfeld, who will gather privately this week with other original Macintosh creators to celebrate the computer’s birth.

“It did not evolve steadily — it was more like Stephen Jay Gould’s notion of ‘punctuated equilibrium’ where there are sudden great lurches forward, like switching processors or operating systems, followed by relatively calm periods,” said Hertzfeld, who now works at Google. “But even after all the major changes, the system somehow managed to retain its distinct personality.”