Robyn Miller, one half of the pair of brothers who created the adventure game known as Myst with their small studio Cyan, tells a story about its development that’s irresistible to a writer like me. When the game was nearly finished, he says, its publisher Brøderbund insisted that it be put through “focus-group testing” at their offices. Robyn and his brother Rand reluctantly agreed, and soon the first group of guinea pigs shuffled into Brøderbund’s conference room. Much to its creators’ dismay, they hated the game. But then, just as the Miller brothers were wondering whether they had wasted the past two years of their lives making it, the second group came in. Their reaction was the exact opposite: they loved the game.

So would it be forevermore. Myst would prove to be one of the most polarizing games in history, loved and hated in equal measure. Even today, everyone seems to have a strong opinion about it, whether they’ve actually played it or not.

Myst‘s admirers are numerous enough to have made it the best-selling single adventure game in history, as well as the best-selling 1990s computer game of any type in terms of physical units shifted at retail: over 6 million boxed copies sold between its release in 1993 and the dawn of the new millennium. In the years immediately after its release, it was trumpeted at every level of the mainstream press as the herald of a new, dawning age of maturity and aesthetic sophistication in games. Then, by the end of the decade, it was lamented as a symbol of what games might have become, if only the culture of gaming had chosen it rather than the near-simultaneously-released Doom as its model for the future. Whatever the merits of that argument, the hardcore Myst lovers remained numerous enough in later years to support five sequels, a series of novels, a tabletop role-playing game, and multiple remakes and remasters of the work which began it all. Their passion was such that, when Cyan gave up on an attempt to turn Myst into a massively-multiplayer game, the fans stepped in to set up their own servers and keep it alive themselves.

And yet, for all the love it’s inspired, the game’s detractors are if anything even more committed than its proponents. For a huge swath of gamers, Myst has become the poster child for a certain species of boring, minimally interactive snooze-fest created by people who have no business making games — and, runs the spoken or unspoken corollary, played by people who have no business playing them. Much of this vitriol comes from the crowd who hate any game that isn’t violent and visceral on principle.

But the more interesting and perhaps telling brand of hatred comes from self-acknowledged fans of the adventure-game genre. These folks were usually raised on the Sierra and LucasArts traditions of third-person adventures — games that were filled with other characters to interact with, objects to pick up and carry around and use to solve puzzles, and complicated plot arcs unfolding chapter by chapter. They have a decided aversion to the first-person, minimalist, deserted, austere Myst, sometimes going so far as to say that it isn’t really an adventure game at all. But, however they categorize it, they’re happy to credit it with all but killing the adventure genre dead by the end of the 1990s. Myst, so this narrative goes, prompted dozens of studios to abandon storytelling and characters in favor of yet more sterile, hermetically sealed worlds just like its own. And when the people understandably rejected this airless vision, that was that for the adventure game writ large. Some of the hatred directed toward Myst by stalwart adventure fans — not only fans of third-person graphic adventures, but, going even further back, fans of text adventures — reaches an almost poetic fever pitch. A personal favorite of mine is the description deployed by Michael Bywater, who in previous lives was himself an author of textual interactive fiction. Myst, he says, is just “a post-hippie HyperCard stack with a rather good music loop.”

After listening to the cultural dialog — or shouting match! — which has so long surrounded Myst, one’s first encounter with the actual artifact that spurred it all can be more than a little anticlimactic. Seen strictly as a computer game, Myst is… okay. Maybe even pretty good. It strikes this critic at least as far from the best or worst game of its year, much less of its decade, still less of all gaming history. Its imagery is well-composited and occasionally striking, its sound and music design equally apt. The sense of desolate, immersive beauty it all conveys can be strangely affecting, and it’s married to puzzle-design instincts that are reasonable and fair. Myst‘s reputation in some quarters as impossible, illogical, or essentially unplayable is unearned; apart from some pixel hunts and perhaps the one extended maze, there’s little to really complain about on that front. On the contrary: there’s a definite logic to its mechanical puzzles, and figuring out how its machinery works through trial and error and careful note-taking, then putting your deductions into practice, is genuinely rewarding, assuming you enjoy that sort of thing.

At the same time, though, there’s just not a whole lot of there there. Certainly there’s no deeper meaning to be found; Myst never tries to be about more than exploring a striking environment and solving intricate puzzles. “When we started, we wanted to make a [thematic] statement, but the project was so big and took so much effort that we didn’t have the energy or time to put much into that part of it,” admits Robyn Miller. “So, we decided to just make a neat world, a neat adventure, and say important things another time.” And indeed, a “neat world” and “neat adventure” are fine ways of describing Myst.

Depending on your preconceptions going in, actually playing Myst for the first time is like going to meet your savior or the antichrist, only to find a pleasant middle-aged fellow who offers to pour you a cup of tea. It’s at this point that the questions begin. Why does such an inoffensive game offend so many people? Why did such a quietly non-controversial game become such a magnet for controversy? And the biggest question of all: why did such a simple little game, made by five people using only off-the-shelf consumer software, become one of the most (in)famous money spinners in the history of the computer-games industry?

We may not be able to answer all of these whys to our complete satisfaction; much of the story of Myst surely comes down to sheer happenstance, to the proverbial butterfly flapping its wings somewhere on the other side of the world. But we can at least do a reasonably good job with the whats and hows of Myst. So, let’s consider now what brought Myst about and how it became the unlikely success it did. After that, we can return once again to its proponents and its detractors, and try to split the difference between Myst as gaming’s savior and Myst as gaming’s antichrist.

Rand Miller

Robyn Miller

If nothing else, the origin story of Myst is enough to make one believe in karma. As I wrote in an earlier article, the Miller brothers and their company Cyan came out of the creative explosion which followed Apple’s 1987 release of HyperCard, a unique Macintosh authoring system which let countless people just like them experiment for the first time with interactive multimedia and hypertext. Cyan’s first finished project was The Manhole. Published in November of 1988 by Mediagenic, it was a goal-less software toy aimed at children, a virtual fairy-tale world to explore. Six months later, Mediagenic added music and sound effects and released it on CD-ROM, making it the first entertainment product ever to appear on that medium. The next couple of years brought two more interactive explorations for children from Cyan, published on floppy disk and CD-ROM.

Even as these were being published, however, the wheels were gradually coming off of Mediagenic, thanks to a massive patent-infringement lawsuit they lost to the Dutch electronics giant Philips and a whole string of other poor decisions and unfortunate events. In February of 1991, a young bright spark named Bobby Kotick seized Mediagenic in a hostile takeover, reverting the company to its older name of Activision. By this point, the Miller brothers were getting tired of making whimsical children’s toys; they were itching to make a real game, with a goal and puzzles. But when they asked Activision’s new management for permission to do so, they were ordered to “keep doing what you’ve been doing.” Shortly thereafter, Kotick announced that he was taking Activision into Chapter 11 bankruptcy. After he did so, Activision simply stopped paying Cyan the royalties on which they depended. The Miller brothers were lost at sea, with no income stream and no relationships with any other publishers.

But at the last minute, they were thrown an unexpected lifeline. Lo and behold, the Japanese publisher Sunsoft came along offering to pay Cyan $265,000 to make a CD-ROM-based adult adventure game in the same general style as their children’s creations — i.e., exactly what the Miller brothers had recently asked Activision for permission to do. Sunsoft was convinced that there would be major potential for such a game on the upcoming generation of CD-ROM-based videogame consoles and multimedia set-top boxes for the living room — so convinced, in fact, that they were willing to fund the development of the game on the Macintosh and take on the job of porting it to these non-computer platforms themselves, all whilst signing over the rights to the computer version(s) to Cyan for free. The Miller brothers, reduced by this point to a diet of “rice and beans and government cheese,” as Robyn puts it, knew deliverance when they saw it. They couldn’t sign the contract fast enough. Meanwhile Activision had just lost out on the chance to release what would turn out to be one of the games of the decade.

But of course the folks at Cyan were as blissfully unaware of that future as those at Activision. They simply breathed sighs of relief and started making their game. In time, Cyan signed a contract with Brøderbund to release the computer versions of their game, starting with the Macintosh original.

Myst certainly didn’t begin as any conscious attempt to re-imagine the adventure-game form. Those who later insisted on seeing it in almost ideological terms, as a sort of artistic manifesto, were often shocked when they first met the Miller brothers in person. This pair of plain-spoken, baseball-cap-wearing country boys were anything but ideologues, much less stereotypical artistes. Instead they seemed a perfect match for the environs in which they worked: an unassuming two-story garage in Spokane, Washington, far from any centers of culture or technology. Their game’s unique personality actually stemmed from two random happenstances rather than any messianic fervor.

One of these was — to put it bluntly — their sheer ignorance. Working on the minority platform that was the Macintosh, specializing up to this point in idiosyncratic children’s software, the Miller brothers were oddly disengaged from the computer-games industry whose story I’ve been telling in so many other articles here. By their own account, they had literally never even seen any of the contemporary adventure games from companies like LucasArts and Sierra before making Myst. In fact, Robyn Miller says today that he had only played one computer game in his life to that point: Infocom’s ten-year-old Zork II. Rand Miller, being the older brother, the first mover behind their endeavors, and the more technically adept of the pair, was perhaps a bit more plugged-in, but only a bit.

The other circumstance which shaped Myst was the technology employed to create it. This statement is true of any game, but it becomes even more salient here because the technology in question was so different from that employed by other adventure creators. Myst is indeed simply a HyperCard stack — the “post-hippie” is in the eye of the beholder — gluing together pictures generated by the 3D modeler StrataVision. During the second half of its development, a third everyday Macintosh software package made its mark: Apple’s QuickTime video system, which allowed Myst‘s creators to insert snippets of themselves playing the roles of the people who previously visited the semi-ruined worlds you spend the game exploring. All of these tools are presentation-level tools, not conventional game-building ones. Seen in this light, it’s little surprise that so much of Myst is surface. At bottom, it’s a giant hypertext done in pictures, with very little in the way of systems of any sort behind it, much less any pretense of world simulation. You wander through its nodes, in some of which you can click on something, which causes some arbitrary event to happen. The one place where the production does interest itself in a state which exists behind its visuals is in the handful of mechanical devices found scattered over each of its landscapes, whose repair and/or manipulation form the basis of the puzzles that turn Myst into a game rather than an unusually immersive slideshow.

In making Myst, each brother fell into the role he was used to from Cyan’s children’s projects. The brothers together came up with the story and world design, then Robyn went off to do the art and music while Rand did the technical plumbing in HyperCard. One Chuck Carter helped Robyn on the art side and Rich Watson helped Rand on the programming side, while Chris Brandkamp produced the intriguing, evocative environmental soundscape by all sorts of improvised means: banging a wrench against the wall or blowing bubbles in a toilet bowl, then manipulating the samples to yield something appropriately other-worldly. And that was the entire team. It was a shoestring operation, amateurish in the best sense. The only thing that distinguished the boys at Cyan from a hundred thousand other hobbyists playing with the latest creative tools on their own Macs was the fact that Cyan had a contract to do so — and a commensurate quantity of real, raw talent, of course.

Ironically, given that Myst was treated as such a cutting-edge product at the time of its release, in terms of design it’s something of a throwback — a fact that becomes less surprising when one considers that its creators’ experience with adventure games stopped in the early 1980s. A raging debate had once taken place in adventure circles over whether the ideal protagonist should be a blank slate, imprintable by the player herself, or a fully-fleshed-out role for the player to inhabit. The verdict had largely come down on the side of the latter as games’ plots had grown more ambitious, but the whole discussion had passed the Miller brothers by.

So, with Myst we were back to the old “nameless, faceless adventurer” paradigm which Sierra and LucasArts had long since abandoned. Myst actively encourages you to imagine that it really is you there in its world. The story begins when you open a mysterious book here on our world, whereupon you get sucked into an alternate dimension and find yourself standing on the dock of a deserted island. You soon learn that you’re following a trail first blazed by a father and his two sons, all of whom had the ability to hop about between dimensions — or “ages,” as the game calls them — and alter them to their will. Unfortunately, the father is now said to be dead, while the two brothers have each been trapped in a separate interdimensional limbo, each blaming the other for their father’s death. (These themes of sibling rivalry have caused much comment over the years, especially in light of the fact that each brother in the game is played by one of the real Miller brothers. But said real brothers have always insisted that there are no deeper meanings to be gleaned here…)

You can access four more worlds from the central island just as soon as you solve the requisite puzzles. In each of them, you must find a page of a magical book. Putting the pages together, along with a fifth page found on the central island, allows you to free the brother of your choice, or to do… something else, which actually leads to the best ending. This last-minute branch to an otherwise unmalleable story is a technique we see in a fair number of other adventure games wishing to make a claim to the status of genuinely interactive fictions. (In practice, of course, players of those games and Myst alike simply save before the final choice and check out all of the endings.)

For all its emphasis on visuals, Myst is designed much like a vintage text adventure in many ways. Even setting aside its explicit maze, its network of discrete, mostly empty locations resembles the map from an old-school text adventure, where navigation is half the challenge. Similarly, its complex environmental puzzles, where something done in one location may have an effect on the other side of the map, smack of one of Infocom’s more cerebral, austere games, such as Zork III or Spellbreaker.

This is not to say that Myst is a conscious throwback; the nature of the puzzles, like so much else about the game, is as much determined by the Miller brothers’ ignorance of contemporary trends in adventure design as by the technical constraints under which they labored. Among the latter was the impossibility of even letting the player pick things up and carry them around to use elsewhere. Utterly unfazed, Rand Miller coined an aphorism: “Turn your problems into features.” Thus Myst‘s many vaguely steam-punky mechanical puzzles, all switches to throw and ponderous wheels to set in motion, are dictated as much by its designers’ inability to implement a player inventory as by their acknowledged love for Jules Verne.

And yet, whatever the technological determinism that spawned it, this style of puzzle design truly was a breath of fresh air for gamers who had grown tired of the “use this object on that hotspot” puzzles of Sierra and LucasArts. To their eternal credit, the Miller brothers took this aspect of the design very seriously, giving their puzzles far more thought than Sierra at least tended to do. They went into Myst with no experience designing puzzles, and their insecurity about this aspect of their craft was perhaps their ironic saving grace. Before they even had a computer game to show people, they spent hours walking outsiders through their scenario Dungeons & Dragons-style, telling them what they saw and listening to how they tried to progress. And once they did have a working world on the computer, they spent more hours sitting behind players, watching what they did. Robyn Miller, asked in an interview shortly after the game’s release whether there was anything he “hated,” summed up their commitment to consistent, logical puzzle design and world-building (in Myst, the two are largely one and the same) thusly:

Seriously, we hate stuff without integrity. Supposed “art” that lacks attention to detail. That bothers me a lot. Done by people who are forced into doing it or who are doing it for formula reasons and monetary reasons. It’s great to see something that has integrity. It makes you feel good. The opposite of that is something I dislike.

We tried to create something — a fantastic world — in a very realistic way. Creating a fantasy world in an unrealistic way is the worst type of fantasy. In Jurassic Park, the idea of dinosaurs coming to life in the twentieth century is great. But it works in that movie because they also made it believable. That’s how the idea and the execution of that idea mix to create a truly great experience.

Taken as a whole, Myst is a master class in designing around constraints. Plenty of games have been ruined by designers whose reach exceeded their core technology’s grasp. We can see this phenomenon as far back as the time of Scott Adams: his earliest text adventures were compact marvels, but quickly spiraled into insoluble incoherence when he started pushing beyond what his simplistic parsers and world models could realistically present. Myst, then, is an artwork of the possible. Managing inventory, with the need for a separate inventory screen and all the complexities of coding this portable object’s interactions with that other thing in the world, would have stretched HyperCard past the breaking point. So, it’s gone. Interactive conversations would have been similarly prohibitive with the technology at the Millers’ fingertips. So, they devised a clever dodge, showing the few characters that exist only as recordings, or through one-way screens where you can see them, but they can’t see (or hear) you; that way, a single QuickTime video clip is enough to do the trick. In paring things back so dramatically, the Millers wound up with an adventure game unlike any that had been seen before. Their problems really did become their game’s features.

For the most part, anyway. The networks of nodes and pre-rendered static views that constitute the worlds of Myst can be needlessly frustrating to navigate, thanks to the way that the views prioritize aesthetics over consistency; rotating your view in place sometimes turns you 90 degrees, sometimes 180 degrees, sometimes somewhere in between, according to what the designers believed would provide the most striking image. Orienting yourself and moving about the landscape can thus be a confusing process. One might complain as well that it’s a slow one, what with all the empty nodes which you must move through to get pretty much anywhere — often just to see if something you’ve done on one side of the map has had any effect on something on its other side. Again, a comparison with the twisty little passages of an old-school text adventure, filled with mostly empty rooms, does strike me as thoroughly apt.

On the other hand, a certain glaciality of pacing seems part and parcel of what Myst fundamentally is. This is not a game for the impatient. It’s rather targeted at two broad types of player: the aesthete, who will be content just to wander the landscape taking in the views, perhaps turning to a walkthrough to be able to see all of the worlds; and the dedicated puzzle solver, willing to pull out paper and pencil and really dig into the task of understanding how all this strange machinery hangs together. Both groups have expressed their love for Myst over the years, albeit in terms which could almost convince you they’re talking about two entirely separate games.

So much for Myst the artifact. What of Myst the cultural phenomenon?

The origins of the latter can be traced to the Miller brothers’ wise decision to take their game to Brøderbund. Brøderbund tended to publish fewer products per year than their peers at Electronic Arts, Sierra, or the lost and unlamented Mediagenic, but they were masterful curators, with a talent for spotting software which ordinary Americans might want to buy and then packaging and marketing it perfectly to reach them. (Their insistence on focus testing, so confusing to the Millers, is proof of their competence; it’s hard to imagine any other publisher of the time even thinking of such a thing.) Brøderbund published a string of products over the course of a decade or more which became more than just hits; they became cultural icons of their time, getting significant attention in the mainstream press in addition to the computer magazines: The Print Shop, Carmen Sandiego, Lode Runner, Prince of Persia, SimCity. And now Myst was about to become the capstone to a rather extraordinary decade, their most successful and iconic release of all.

Brøderbund first published the game on the Macintosh in September of 1993, where it was greeted with rave reviews. Not a lot of games originated on the Mac at all, so a new and compelling one was always a big event. Mac users tended to conceive of themselves as the sophisticates of the computer world, wearing their minority status as a badge of pride. Myst hit the mark beautifully here; it was the Mac-iest of Mac games. Macworld magazine’s review is a rather hilarious example of a homer call. “It’s been polished until it shines,” wrote the magazine. Then, in the next paragraph: “We did encounter a couple of glitches and frozen screens.” Oh, well.

Helped along by press like this, Myst came out of the gates strong. By one report, it sold 200,000 copies on the Macintosh alone in its first six months. If correct or even close to correct, those numbers are extraordinary; they’re the numbers of a hit even on the gaming Mecca that was the Wintel world, much less on the Mac, with its vastly smaller user base.

Still, Brøderbund knew that Myst‘s real opportunity lay with those selfsame plebeian Wintel machines which most Mac users, the Miller brothers included, disdained. Just as soon as Cyan delivered the Mac version, Brøderbund set up an internal team — larger than the Cyan team which had made the game in the first place — to do the port as quickly as possible. Importantly, Myst was ported not to bare MS-DOS, where almost all “hardcore” games still resided, but to Windows, where the new demographics which Brøderbund hoped to attract spent all of their time. Luckily, the game’s slideshow visuals were possible even under Windows’s sluggish graphics libraries, and Apple had recently ported their QuickTime video system to Microsoft’s platform. The Windows version of Myst shipped in March of 1994.

And now Brøderbund’s marketing got going in earnest, pushing the game as the one showcase product which every purchaser of a new multimedia PC simply had to have. At the time, most CD-ROM based games also shipped in a less impressive floppy-disk-based version, with the latter often still outselling the former. But Brøderbund and Cyan made the brave choice not to attempt a floppy-disk version at all. The gamble paid off beautifully, furthering the carefully cultivated aspirational quality which already clung to Myst, now billed as the game which simply couldn’t be done on floppy disk. Brøderbund’s lush advertisements had a refined, adult air about them which made them stand out from the dragons, spaceships, and scantily-clad babes that constituted the usual motifs of game advertising. As the crowning touch, Brøderbund devised a slick tagline: Myst was “the surrealistic adventure that will become your world.” The Miller brothers scoffed at this piece of marketing-speak — until they saw how Myst was flying off the shelves in the wake of it.

So, through a combination of lucky timing and precision marketing, Myst blew up huge. I say this not to diminish its merits as a puzzle-solving adventure game, which are substantial, but simply because I don’t believe those merits were terribly relevant to the vast majority of people who purchased it. A parallel can be drawn with Infocom’s game of Zork, which similarly surfed a techno-cultural wave a decade before Myst. It was on the scene just as home computers were first being promoted in the American media as the logical, more permanent successors to the videogame-console fad. For a time, Zork, with its ability to parse pseudo-natural-English sentences, was seen by computer salespeople as the best overall demonstration of what a computer could do; they therefore showed it to their customers as a matter of course. And so, when countless new computer systems went home with their new owners, there was also a copy of Zork in the bag. The result was Infocom’s best-selling game of all time, to the tune of almost 400,000 copies sold.

Myst now played the same role in a new home-computer boom. The difference was that, while the first boom had fizzled rather quickly when people realized of what limited practical utility those early machines actually were, this second boom would be a far more sustained affair. In fact, it would become the most sustained boom in the history of the consumer PC, stretching from approximately 1993 right through the balance of the decade, with every year breaking the sales records set by the previous one. The implications for Myst, which arrived just as the boom was beginning, were titanic. Even long after it ceased to be particularly cutting-edge, it continued to be regarded as an essential accessory for every PC, to be tossed into the bags carried home from computer stores by people who would never buy another game.

Myst had already established its status by the time the hype over the World Wide Web and Windows 95 really lit a fire under computer sales in 1995. It passed the 1-million-copy mark in the spring of that year. By the same point, a quickie “strategy guide” published by Prima, ideal for the many players who just wanted to take in its sights without worrying about its puzzles, had passed an extraordinary 300,000 copies sold — thus making its co-authors, who’d spent all of three weeks working on it, the two luckiest walkthrough authors in history. Defying all of the games industry’s usual logic, which dictated that titles sold in big numbers for only a few months before fizzling out, Myst‘s sales just kept accelerating from there. It sold 850,000 copies in 1996 in the United States alone, then another 870,000 copies in 1997. Only in 1998 did it finally begin to flag, posting domestic sales of just 540,000 copies. Fortunately, the European market for multimedia PCs, which lagged a few years behind the American one, was now also burning bright, opening up whole new frontiers for Myst. Its total retail sales topped 6 million by 2000, at least 2 million of them outside of North America. Still more copies — it’s impossible to say how many — had shipped as pack-in bonuses with multimedia upgrade kits and the like. Meanwhile, under the terms of Sunsoft’s original agreement with Cyan, it was also ported by the former to the Sega Saturn, Atari Jaguar, 3DO, and CD-i living-room consoles. Myst was so successful that another publisher came out with an elaborate parody of it as a full-fledged computer game in its own right, under the indelible title of Pyst. Considering that it featured the popular sitcom star John Goodman, Pyst must have cost far more to make than the shoestring production it mocked.

As we look at the staggering scale of Myst‘s success, we can’t avoid returning to that vexing question of why it all should have come to be. Yes, Brøderbund’s marketing campaign was brilliant, but there must be more to it than that. Certainly we’re far from the first to wonder about it all. As early as December of 1994, Newsweek magazine noted that “in the gimmick-dominated world of computer games, Myst should be the equivalent of an art film, destined to gather critical acclaim and then dust on the shelves.” So why was it selling better than guaranteed crowd-pleasers with names like Star Wars on their boxes?

It’s not so difficult to pinpoint some of the other reasons why Myst should have been reasonably successful. It was a good-looking game that took full advantage of CD-ROM, at a time when many computer users — non-gamers almost as much as gamers — were eager for such things to demonstrate the power of their new multimedia wundermachines. And its distribution medium undoubtedly helped its sales in another way: in this time before CD burners became commonplace, it was immune to the piracy that many publishers claimed was costing them at least half their sales of floppy-disk-based games.

Likewise, a possible explanation for Myst‘s longevity after it was no longer so cutting-edge might be the specific technological and aesthetic choices made by the Miller brothers. Many other products of the first gush of the CD-ROM revolution came to look painfully, irredeemably tacky just a couple of years after they had dazzled, thanks to their reliance on grainy video clips of terrible actors chewing up green-screened scenery. While Myst did make some use of this type of “full-motion video,” it was much more restrained in this respect than many of its competitors. As a result, it aged much better. By the end of the 1990s, its graphics resolution and color count might have been a bit lower than those of the latest games, and it might not have been quite as stunning at first glance as it once had been, but it remained an elegant, visually-appealing experience on the whole.

Yet even these proximate causes don’t come close to providing a full explanation of why this art film in game form sold like a blockbuster. There are plenty of other games of equal or even greater overall merit to which they apply equally well, but none of them sold in excess of 6 million copies. Perhaps all we can do in the end is chalk it up to the inexplicable vagaries of chance. Computer sellers and buyers, it seems, needed a go-to game to show what was possible when CD-ROM was combined with decent graphics and sound cards. Myst was lucky enough to become that game. Although its puzzles were complex, simply taking in its scenery was disarmingly simple, making it perfect for the role. The perfect product at the perfect time, perfectly marketed.

In a sense, Myst the phenomenon didn’t do that other Myst — Myst the actual artifact, the game we can still play today — any favors at all. The latter seems destined always to be judged in relation to the former, and destined always to be found lacking. Demanding that what is in reality a well-designed, aesthetically pleasing game live up to the earth-shaking standards implied by Myst‘s sales numbers is unfair on the face of it; it wasn’t the fault of the Miller brothers, humble craftsmen with the right attitude toward their work, that said work wound up selling 6 million copies. Nevertheless, we feel compelled to judge it, at least to some extent, with the knowledge of its commercial and cultural significance firmly in mind. And in this context especially, some of its detractors’ claims do have a ring of truth.

Arguably the truthiest of all of them is the oft-repeated old saw that no other game was bought by so many people and yet really, seriously played by so few of its purchasers. While such a hyperbolic claim is impossible to truly verify, there is a considerable amount of circumstantial evidence pointing in exactly that direction. The exceptional sales of the strategy guide are perhaps a wash; they can be as easily ascribed to serious players wanting to really dig into the game as they can to casual purchasers just wanting to see all the pretty pictures on the CD-ROM. Other factors, however, are harder to dismiss. The fact is, Myst is hard by casual-game standards — so hard that Brøderbund included a blank pad of paper in the box for the purpose of keeping notes. If we believe that all or most of its buyers made serious use of that notepad, we have to ask where these millions of people interested in such a cerebral, austere, logical experience were before it materialized, and where they went thereafter. Even the Miller brothers themselves — hardly an unbiased jury — admit that by their best estimates no more than 50 percent of the people who bought Myst ever got beyond the starting island. Personally, I tend to suspect that the number is much lower than that.

Perhaps the most telling evidence for Myst as the game which everyone had but hardly anyone played is found in a comparison with one of its contemporaries: id Software’s Doom, the other decade-dominating blockbuster of 1993 (a game about which I’ll be writing much more in a future article). Doom indisputably was played, and played extensively. While it wasn’t quite the first running-around-and-shooting-things-from-a-first-person-perspective game, it did become so popular that games of its type were codified as a new genre unto themselves. The first-person shooters which followed Doom in the 1990s were among the most popular games of their era. Many of their titles are known to gamers today who weren’t yet born when they debuted: titles like Duke Nukem 3D, Quake, Half-Life, Unreal. Myst prompted just as many copycats, but these were markedly less popular and are markedly less remembered today: AMBER: Journeys Beyond, Zork Nemesis, Rama, Obsidian. Only Cyan’s own eventual sequel to Myst can be found among the decade’s bestsellers, and even it was a definite case of diminishing commercial returns, despite being a rather brilliant game in its own right. In short, any game which sold as well as Myst, and which was seriously played by a proportionate number of people, ought to have left a bigger imprint on ludic culture than this one did.

But none of this should affect your decision about whether to play Myst today, assuming you haven’t yet gotten around to it. Stripped of all its weighty historical context, it’s a fine little adventure game if not an earth-shattering one, intriguing for anyone with the puzzle-solving gene, infuriating for anyone without it. You know what I mean… sort of a niche experience. One that just happened to sell 6 million copies.

(Sources: the books Myst: Prima’s Official Strategy Guide by Rick Barba and Rusel DeMaria, Myst & Riven: The World of the D’ni by Mark J.P. Wolf, and The Secret History of Mac Gaming by Richard Moss; Computer Gaming World of December 1993; MacWorld of March 1994; CD-ROM Today of Winter 1993. Online sources include “Two Histories of Myst” by John-Gabriel Adkins, Ars Technica‘s interview with Rand Miller, Robyn Miller’s postmortem of Myst at the 2013 Game Developers Conference, GameSpot‘s old piece on Myst as one of the “15 Most Influential Games of All Time,” and Greg Lindsay’s Salon column on Myst as a “dead end.” Michael Bywater’s colorful comments about Myst come from Peter Verdi’s now-defunct Magnetic Scrolls fan site, a dump of which Stefan Meier dug up for me from his hard drive several years ago. Thanks again, Stefan!)


I think [the] Macintosh accomplished everything we set out to do and more, even though it reaches most people these days as Windows.

— Andy Hertzfeld (original Apple Macintosh systems programmer), 1994

When rumors first began to circulate early in 1991 that IBM and Apple were involved in high-level talks about a major joint initiative, most people dismissed them outright. It was, after all, hard to imagine two companies in the same industry with more diametrically opposed corporate cultures. IBM was Big Blue, a bedrock of American business since the 1920s. Conservative and pragmatic to a fault, it was a Brylcreemed bastion of tradition where casual days meant that employees might remove their jackets to reveal the starched white shirts they wore underneath. Apple, on the other hand, had been founded just fifteen years before by two long-haired children of the counterculture, and its campus still looked more like Woodstock than Wall Street. IBM placed great stock in the character of its workforce; Apple, as journalist Michael S. Malone would later put it in his delightfully arch book Infinite Loop, “seemed to have no character, but only an attitude, a style, a collection of mannerisms.” IBM talked about enterprise integration and system interoperability; Apple prattled on endlessly about changing the world. IBM played Lawrence Welk at corporate get-togethers; Apple preferred the Beatles. (It was an open secret that the name the company shared with the Beatles’ old record label wasn’t coincidental.)

Unsurprisingly, the two companies didn’t like each other very much. Apple in particular had been self-consciously defining itself for years as the sworn enemy of IBM and everything it represented. When Apple had greeted the belated arrival of the IBM PC in 1981 with a full-page magazine advertisement bidding Big Blue “welcome, seriously,” it had been hard to read as anything other than snarky sarcasm. And then, and most famously, had come the “1984” television advertisement to mark the debut of the Macintosh, in which Apple was personified as a hammer-throwing freedom fighter toppling a totalitarian corporate titan — Big Blue recast as Big Brother. What would the rumor-mongers be saying next? That cats would lie down with dogs? That the Russians would tell the Americans they’d given up on the whole communism thing and would like to be friends… oh, wait. It was a strange moment in history. Why not this too, then?

Indeed, when one looked a little harder, a partnership began to make at least a certain degree of sense. Apple’s rhetoric had actually softened considerably since those heady early days of the Macintosh and the acrimonious departure of Steve Jobs which had marked their ending. In the time since, more sober minds at the company had come to realize that insulting conservative corporate customers with money to spend on Apple’s pricey hardware might be counter-productive. Most of all, though, both companies found themselves in strikingly similar binds as the 1990s got underway. After soaring to rarefied heights during the early and middle years of the previous decade, they were now being judged by an increasing number of pundits as the two biggest losers of the last few years of computing history. In the face of the juggernaut that was Microsoft Windows, that irresistible force which nothing in the world of computing could seem to defy for long, it didn’t seem totally out of line to ask whether there even was a future for IBM or Apple. Seen in this light, the pithy clichés practically wrote themselves: “the enemy of my enemy is my friend”; “any port in a storm”; etc. Other, somewhat less generous commentators just talked about an alliance of losers.

Each of the two losers had gotten to this juncture by a uniquely circuitous route.

When IBM released the IBM PC, their first mass-market microcomputer, in August of 1981, they were as surprised as anyone by the way it took off. Even as hackers dismissed it as boring and unimaginative, corporate America couldn’t get enough of the thing; a boring and unimaginative personal computer — i.e., a safe one — was exactly what they had been waiting for. IBM’s profits skyrocketed during the next several years, and the pundits lined up to praise the management of this old, enormous company for having the flexibility and wherewithal to capitalize on an emerging new market; a tap-dancing elephant became the metaphor of choice.

And yet, like so many great successes, the IBM PC bore the seeds of its downfall within it from the start. It was a simple, robust machine, easy to duplicate by plugging together readily available commodity components — a process made even easier by IBM’s commitment to scrupulously documenting every last detail of its design for all and sundry. Further, IBM had made the mistake of licensing its operating system from a small company known as Microsoft rather than buying it outright or writing one of their own, and Bill Gates, Microsoft’s Machiavellian CEO, proved more than happy to license MS-DOS to anyone else who wanted it as well. The danger signs could already be seen in 1982, when an upstart company called Compaq released a “portable” version of IBM’s computer — in those days, this meant a computer which could be packed into a single suitcase — before IBM themselves could get around to it. A more dramatic tipping point arrived in 1986, when the same company made a PC clone built around Intel’s hot new 80386 CPU before IBM managed to do so.

In 1987, IBM responded to the multiplying ranks of the clone makers by introducing the PS/2 line, which came complete with a new, proprietary bus architecture, locked up tight this time inside a cage of patents and legalese. A cynical move on the face of it, it backfired spectacularly in practice. Smelling the overweening corporate arrogance positively billowing out of the PS/2 lineup, many began to ask themselves for the first time whether the industry still needed IBM at all. And the answer they often came to was not the one IBM would have preferred. IBM’s new bus architecture slowly died on the vine, while the erstwhile clone makers put together committees to define new standards of their own which evolved the design IBM had originated in more open, commonsense ways. In short, IBM lost control of the very platform they had created. By 1990, the words “PC clone” were falling out of common usage, to be replaced by talk of the “Wintel Standard.” The new standard bearer, the closest equivalent to IBM in this new world order, was Microsoft, who continued to license MS-DOS and Windows, the software that allowed all of these machines from all of these diverse manufacturers to run the same applications, to anyone willing to pay for it. Meanwhile OS/2, IBM’s mostly-compatible alternative operating system, was struggling mightily; it would never manage to cross the hump into true mass-market acceptance.

Apple’s fall from grace had been less dizzying in some ways, but the position it had left them in was almost as frustrating.

After Steve Jobs walked away from Apple in September of 1985, leaving behind the Macintosh, his twenty-month-old dream machine, the more sober-minded caretakers who succeeded him did many of the reasonable, sober-minded things which their dogmatic predecessor had refused to allow: opening the Mac up for expansion, adding much-requested arrow keys to its keyboard, toning down the revolutionary rhetoric that spooked corporate America so badly. These things, combined with the Apple LaserWriter laser printer, Aldus PageMaker software, and the desktop-publishing niche they spawned between them, saved the odd little machine from oblivion. Yet something did seem to get lost in the process. Although the Mac remained a paragon of vision in computing in many ways — HyperCard alone proved that! — Apple’s management could sometimes seem more interested in competing head-to-head with PC clones for space on the desks of secretaries than nurturing the original dream of the Macintosh as the creative, friendly, fun personal computer for the rest of us.

In fact, this period of Apple’s history must strike anyone familiar with the company of today — or, for that matter, with the company that existed before Steve Jobs’s departure — as just plain weird. Quibbles about character versus attitude aside, Apple’s most notable strength down through the years has been a peerless sense of self, which they have used to carve out their own uniquely stylish image in the ofttimes bland world of computing. How odd, then, to see the Apple of this period almost willfully trying to become the one thing neither the zealots nor the detractors have ever seen them as: just another maker of computer hardware. They flooded the market with more models than even the most dutiful fans could keep up with, none of them evincing the flair for design that marks the Macs of earlier or later eras. Their computers’ bland cases were matched with bland names like “Performa” or “Quadra” — names which all too easily could have come out of Compaq or (gasp!) IBM rather than Apple. Even the tight coupling of hardware and software into a single integrated user experience, another staple of Apple computing before and after, threatened to disappear, as CEO John Sculley took to calling Apple a “software company” and intimated that he might be willing to license MacOS to other manufacturers in the way that Microsoft did MS-DOS and Windows. At the same time, in a bid to protect the software crown jewels, he launched a prohibitively expensive and ethically and practically ill-advised lawsuit against Microsoft for copying MacOS’s “look and feel” in Windows.

Apple’s attempts to woo corporate America by acting just as bland and conventional as everyone else bore little fruit; the Macintosh itself remained too incompatible, too expensive, and too indelibly strange to lure cautious purchasing managers into the fold. Meanwhile Apple’s prices remained too high for any but the most well-heeled private users. And so the Mac soldiered on with a 5 to 10 percent market share, buoyed by a fanatically loyal user base who still saw revolutionary potential in it, even as they complained about how many of its ideas Microsoft and others had stolen. Admittedly, their numbers were not insignificant: there were about three and a half million members of the Macintosh family by 1990. They were enough to keep Apple afloat and basically profitable, at least for now, but already by the early 1990s most new Macs were being sold “within the family,” as it were. The Mac became known as the platform where the visionaries tried things out; if said things proved promising, they then reached the masses in the form of Windows implementations. CD-ROM, the most exciting new technology of the early 1990s, was typical. The Mac pioneered this space; Mediagenic’s The Manhole, the very first CD-ROM entertainment product, shipped first on that platform. Yet most of the people who heard the hype and went out to buy a “multimedia PC” in the years that followed brought home a Wintel machine. The Mac was a sort of aspirational showpiece platform; in defiance of the Mac’s old “computer for the rest of us” tagline, Windows was the place where the majority of ordinary people did ordinary things.

The state of MacOS added weight to these showhorse-versus-workhorse stereotypes. Its latest incarnation, known as System 6, had fallen alarmingly behind the state of the art in computing by 1990. Once one looked beyond its famously intuitive and elegant user interface, one found that it lacked robust support for multitasking; lacked any way to address more than 8 MB of memory; lacked the virtual memory that would allow users to open more and larger applications than the physical memory allowed; lacked the memory protection that could prevent errant applications from taking down the whole system. Having been baked into many of the operating system’s core assumptions from the start — MacOS had originally been designed to run on a machine with no hard drive and just 128 K of memory — these limitations were infuriatingly difficult to remedy after the fact. Thus Apple struggled mightily with the creation of a System 7, their attempt to do just that. When System 7 finally shipped in May of 1991, two years after Apple had initially promised it would, it still lagged behind Windows under the hood in some ways: for example, it still lacked comprehensive memory protection.

The problems which dogged the Macintosh were typical of any computing platform that attempts to survive beyond the technological era which spawned it. Keeping up with the times means hacking and kludging the original vision, as efficiency and technical elegance give way to the need just to make it work, by hook or by crook. The original Mac design team had been given the rare privilege of forgetting about backward compatibility — given permission to build something truly new and “insanely great,” as Steve Jobs had so memorably put it. That, needless to say, was no longer an option. Every decision at Apple must now be made with an eye toward all of the software that had been written for the Mac in the past seven years or so. People depended on it now, which sharply limited the ways in which it could be changed; any new idea that wasn’t compatible with what had come before was an ipso-facto nonstarter. Apple’s clever programmers doubtless could have made a faster, more stable, all-around better operating system than System 7 if they had only had free rein to do so. But that was pie-in-the-sky talk.

Yet the most pressing of all the technical problems confronting the Macintosh as it aged involved its hardware rather than its software. Back in 1984, the design team had hitched their wagon to the slickest, sexiest new CPU in the industry at the time: the Motorola 68000. And for several years, they had no cause to regret that decision. The 68000 and its successor models in the same family were wonderful little chips — elegant enough to live up to even the Macintosh ideal of elegance, an absolute joy to program. Even today, many an old-timer will happily wax rhapsodic about them if given half a chance. (Few, for the record, have similarly fond memories of Intel’s chips.)

But Motorola was both a smaller and a more diversified company than Intel, the international titan of chip-making. As time went on, they found it more and more difficult to keep up with the pace set by their rival. Lacking the same cutting-edge fabrication facilities, it was hard for them to pack as many circuits into the same amount of space. Matters began to come to a head in 1989, when Intel released the 80486, a chip for which Motorola had nothing remotely comparable. Motorola’s response finally arrived in the form of the roughly-equivalent-in-horsepower 68040 — but not until more than a year later, and even then their chip was plagued by poor heat dissipation and heavy power consumption in many scenarios. Worse, word had it that Motorola was getting ready to give up on the whole 68000 line; they simply didn’t believe they could continue to compete head-to-head with Intel in this arena. One can hardly overstate how terrifying this prospect was for Apple. An end to the 68000 line must seemingly mean the end of the Macintosh, at least as everyone knew it; MacOS, along with every application ever written for the platform, were inextricably bound to the 68000. Small wonder that John Sculley started talking about Apple as a “software company.” It looked like their hardware might be going away, whether they liked it or not.

Motorola was, however, peddling an alternative to the 68000 line, embodying one of the biggest buzzwords in computer-science circles at the time: “RISC,” short for “Reduced Instruction Set Computer.” Both the Intel x86 line and the Motorola 68000 line were what had been retroactively named “CISC,” or “Complex Instruction Set Computers”: CPUs whose set of core opcodes — i.e., the set of low-level commands by which they could be directly programmed — grew constantly bigger and more baroque over time. RISC chips, on the other hand, pared their opcodes down to the bone, to only those commands which they absolutely, positively could not exist without. This made them less pleasant for a human programmer to code for — but then, the vast majority of programmers were working by now in high-level languages rather than directly controlling the CPU in assembly language anyway. And programs written for them, by whatever method, tended to be bigger — but then, most people by 1990 were willing to trade a bit more memory usage for extra speed. To compensate for these disadvantages, RISC chips could be simpler in terms of circuitry than CISC chips of equivalent power, making them cheaper and easier to manufacture. They also demanded less energy and produced less heat — the computer engineer’s greatest enemy — at equivalent clock speeds. As of yet, only one RISC chip was serving as the CPU in mass-market personal computers: the ARM chip, used in the machines of the British PC maker Acorn, which weren’t even sold in the United States. Nevertheless, Motorola believed RISC’s time had come. By switching to RISC, they wouldn’t need to match Intel in terms of transistors per square millimeter to produce chips of equal or greater speed. Indeed, they’d already made a RISC CPU of their own, called the 88000, in which they were eager to interest Apple.
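To make the trade-off concrete, here is a toy sketch in Python: the same addition performed as one “complex” memory-to-memory instruction versus a sequence of simpler load/store steps. The opcode names and the two-register machine are invented for this illustration; they correspond to no real 68000 or PowerPC instruction set.

```python
# Toy illustration of the CISC-vs-RISC trade-off described above.
# The opcodes and machine model are hypothetical, not real hardware.

def run_cisc(mem):
    # One "complex" instruction adds one memory word directly into another;
    # the CPU performs the load, add, and store internally.
    program = [("ADD_MEM_TO_MEM", 0, 1)]
    for op, src, dst in program:
        if op == "ADD_MEM_TO_MEM":
            mem[dst] = mem[src] + mem[dst]
    return len(program), mem

def run_risc(mem):
    # The same work as explicit steps: only LOAD and STORE touch memory,
    # and arithmetic happens purely between registers.
    regs = [0, 0]
    program = [
        ("LOAD", 0, 0),   # r0 <- mem[0]
        ("LOAD", 1, 1),   # r1 <- mem[1]
        ("ADD", 0, 1),    # r1 <- r0 + r1
        ("STORE", 1, 1),  # mem[1] <- r1
    ]
    for op, a, b in program:
        if op == "LOAD":
            regs[b] = mem[a]
        elif op == "ADD":
            regs[b] = regs[a] + regs[b]
        elif op == "STORE":
            mem[b] = regs[a]
    return len(program), mem

cisc_len, cisc_mem = run_cisc([2, 3])
risc_len, risc_mem = run_risc([2, 3])
assert cisc_mem == risc_mem == [2, 5]  # identical result either way
print(cisc_len, risc_len)              # → 1 4
```

The result is the same, but the RISC-style program needs four instructions where the CISC-style one needs one — the bet being that four simple steps can be made cheaper, cooler, and ultimately faster in silicon than one complex step.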

They found a receptive audience among Apple’s programmers and engineers, who loved Motorola’s general design aesthetic. Already by the spring of 1990, Apple had launched two separate internal projects to study the possibilities for RISC in general and the 88000 in particular. One, known as Project Jaguar, envisioned a clean break with the past, in the form of a brand new computer that would be so amazing that people would be willing to accept that none of their existing software would run on it. The other, known as Project Cognac, studied whether it might be possible to port the existing MacOS to the new architecture, and then — and this was the really tricky part — find a way to make existing applications which had been compiled for a 68000-based Mac run unchanged on the new machine.

At first, the only viable option for doing so seemed to be a sort of Frankenstein’s monster of a computer, containing both an 88000- and a 68000-series CPU. The operating system would boot and run on the 88000, but when the user started an application written for an older, 68000-based Mac, it would be automatically kicked over to the secondary CPU. Within a few years, so the thinking went, all existing users would upgrade to the newer models, all current software would get recompiled to run natively on the RISC chip, and the 68000 could go away. Still, no one was all that excited by this approach; it seemed the worst Macintosh kludge yet, the very antithesis of what the machine was supposed to be.

A eureka moment came in late 1990, with the discovery of what Cognac project leader Jack McHenry came to call the “90/10 Rule.” Running profilers on typical applications, his team found that in the case of many or most of them it was the operating system, not the application itself, that consumed 90 percent or more of the CPU cycles. This was an artifact — for once, a positive one! — of the original MacOS design, which offered programmers an unprecedentedly rich interface toolbox meant to make coding as quick and easy as possible and, just as importantly, to give all applications a uniform look and feel. Thus an application simply asked for a menu containing a list of entries; it was then the operating system that did all the work of setting it up, monitoring it, and reporting back to the application when the user chose something from it. Ditto buttons, dialog boxes, etc. Even something as CPU-intensive as video playback generally happened through the operating system’s QuickTime library rather than inside the application employing it.

All of this meant that it ought to be feasible to emulate the 68000 entirely in software. The 68000 code would necessarily run slowly and inefficiently through emulation, wiping out all of the speed advantages of the new chip and then some. Yet for many or most applications the emulator would only need to be used about 10 percent of the time. The other 90 percent of the time, when the operating system itself was doing things at native speed, would more than make up for it. In due course, applications would get recompiled and the need for 68000 emulation would largely go away. But in the meanwhile, it could provide a vital bridge between the past and the future — a next-generation Mac that wouldn’t break continuity with the old one, all with a minimum of complication, for Apple’s users and for their hardware engineers alike. By mid-1991, Project Cognac had an 88000-powered prototype that could run a RISC-based MacOS and legacy Mac applications together.
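The arithmetic behind that bet is easy to sketch. The time split and speed figures below are hypothetical, chosen only to show the shape of the trade-off — they are not measured numbers from Project Cognac:

```python
# Back-of-envelope arithmetic for the "90/10 Rule" described above.
# All figures are assumed for illustration, not measured data.

os_share, app_share = 0.90, 0.10  # fraction of CPU time in OS vs. application code
native_speedup = 2.0              # assume native PowerPC code runs 2x faster
emulation_slowdown = 0.5          # assume emulated 68000 code runs at half speed

# Time to do a fixed amount of work on the old 68000 Mac, normalized to 1.0.
old_time = os_share + app_share

# On the PowerPC Mac: OS code runs natively (faster), while application
# code runs through the 68000 emulator (slower).
new_time = os_share / native_speedup + app_share / emulation_slowdown

print(f"overall speedup: {old_time / new_time:.2f}x")  # → overall speedup: 1.54x
```

Even with emulated application code running at half its old speed, the machine as a whole comes out ahead under these assumptions, because the natively recompiled operating system dominates the cycle count.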

And yet this wasn’t to be the final form of the RISC-based Macintosh. For, just a few months later, Apple and IBM made an announcement that the technology press billed — sometimes sarcastically, sometimes earnestly — as the “Deal of the Century.”

Apple had first begun to talk with IBM in early 1990, when Michael Spindler, the former’s president, had first reached out to Jack Kuehler, his opposite number at IBM. It seemed that, while Apple’s technical rank and file were still greatly enamored with Motorola, upper management was less sanguine. Having been burned once with the 68000, they were uncertain about Motorola’s commitment and ability to keep evolving the 88000 over the long term.

It made a lot of sense in the abstract for any company interested in RISC technology, as Apple certainly was, to contact IBM; it was actually IBM who had invented the RISC concept back in the mid-1970s. Not all that atypically for such a huge company with so many ongoing research projects, they had employed the idea for years only in limited, mostly subsidiary usage scenarios, such as mainframe channel controllers. Now, though, they were just introducing a new line of “workstation computers” — meaning extremely high-powered desktop computers, too expensive for the consumer market — which used a RISC chip called the POWER CPU that was the heir to their many years of research in the field. Like the workstations it lay at the heart of, the chip was much too expensive and complex to become the brain of Apple’s next generation of consumer computers, but it might, thought Spindler, be something to build upon. And he knew that, with IBM’s old partnership with Microsoft slowly collapsing into bickering acrimony, Big Blue might just be looking for a new partner.

The back-channel talks were intermittent and hyper-cautious at first, but, as the year wore on and the problems both of the companies faced became more and more obvious, the discussions heated up. The first formal meeting took place in February of 1991 or shortly thereafter, at an IBM facility in Austin, Texas. The Apple people, knowing IBM’s ultra-conservative reputation and wishing to make a good impression, arrived neatly groomed and dressed in three-piece suits, only to find their opposite numbers, having acted on the same motivation, sitting there in jeans and denim shirts.

That anecdote illustrates how very much both sides wanted to make this work. And indeed, the two parties found it much easier to work together than anyone might have imagined. John Sculley, the man who really called the shots at Apple, found that he got along smashingly with Jack Kuehler, to the extent that the two were soon talking almost every day. After beginning as a fairly straightforward discussion of whether IBM might be able and willing to make a RISC chip suitable for the Macintosh, the negotiations just kept growing in scale and ambition, spurred on by both companies’ deep-seated desire to stick it to Microsoft and the Wintel hegemony in any and all possible ways. They agreed to found a joint subsidiary called Taligent, staffed initially with the people from Apple’s Project Jaguar, which would continue to develop a brand new operating system that could be licensed by any hardware maker, just like MS-DOS and Windows (and for that matter IBM’s already extant OS/2). And they would found another subsidiary called Kaleida Labs, to make a cross-platform multimedia scripting engine called ScriptX.

Still, the core of the discussions remained IBM’s POWER architecture — or rather the PowerPC, as the partners agreed to call the cost-reduced, consumer-friendly version of the chip. Apple soon pulled Motorola into these parts of the talks, thus turning a bilateral into a trilateral negotiation, and providing the name for their so-called “AIM alliance” — “AIM” for Apple, IBM, and Motorola. IBM had never made a mass-market microprocessor of their own before, noted Apple, and Motorola’s experience could serve them well, as could their chip-fabrication facilities once actual production began. The two non-Apple parties were perhaps less excited at the prospect of working together — Motorola in particular must have been smarting at the rejection of their own 88000 processor which this new plan would entail — but made nice and got along.

Jack Kuehler and John Sculley brandish what they call their “marriage certificate,” looking rather disturbingly like Neville Chamberlain declaring peace in our time. The marriage would not prove an overly long or happy one.

On October 2, 1991 — just six weeks after the first 68040-based Macintosh models had shipped — Apple and IBM made official the rumors that had been swirling around for months. At a joint press briefing held inside the Fairmont Hotel in downtown San Francisco, they trumpeted all of the initiatives I’ve just described. The Deal of the Century, they said, would usher in the next phase of personal computing. Wintel must soon give way to the superiority of a PowerPC-based computer running a Taligent operating system with ScriptX onboard. New Apple Macintosh models would also use the PowerPC, but the relationship between them and these other, Taligent-powered machines remained vague.

Indeed, it was all horribly confusing. “What Taligent is doing is not designed to replace the Macintosh,” said Sculley. “Instead we think it complements and enhances its usefulness.” But what on earth did that empty corporate speak even mean? When Apple said out of the blue that they were “not going to do to the Macintosh what we did to the Apple II” — i.e., orphan it — it rather made you suspect that that was exactly what they meant to do. And what did it all mean for IBM’s OS/2, which Big Blue had been telling a decidedly unconvinced public was also the future of personal computing for several years now? “I think the message in those agreements for the future of OS/2 is that it no longer has a future,” said one analyst. And then, what was Kaleida and this ScriptX thing supposed to actually do?

So much of the agreement seemed hopelessly vague. Compaq’s vice president declared that Apple and IBM must be “smoking dope. There’s no way it’s going to work.” One pundit called the whole thing “a con job. There’s no software, there’s no operating system. It’s just a last gasp of extinction by the giants that can’t keep up with Intel.” Apple’s own users were baffled and consternated by this sudden alliance with the company which they had been schooled to believe was technological evil incarnate. A grim joke made the rounds: what do you get when you cross Apple and IBM? The answer: IBM.

While the journalists reported and the pundits pontificated, it was up to the technical staff at Apple, IBM, and Motorola to make PowerPC computers a reality. Like their colleagues who had negotiated the deal, they all got along surprisingly well; once one pushed past the surface stereotypes, they were all just engineers trying to do the best work possible. Apple’s management wanted the first PowerPC-based Macintosh models to ship in January of 1994, to commemorate the platform’s tenth anniversary by heralding a new technological era. The old Project Cognac team, now with the new code name of “Piltdown Man” after the famous (albeit fraudulent) “missing link” in the evolution of humanity, was responsible for making this happen. For almost a year, they worked on porting MacOS to the PowerPC, as they’d previously done to the 88000. This time, though, they had no real hardware with which to work, only specifications and software emulators. The first prototype chips finally arrived on September 3, 1992, and they redoubled their efforts, pulling many an all-nighter. Thus MacOS booted up to the desktop for the first time on a real PowerPC-based machine just in time to greet the rising sun on the morning of October 3, 1992. A new era had indeed dawned.

Their goal now was to make a PowerPC-based Macintosh work exactly like any other, only faster. MacOS wouldn’t even get a new primary version number for the first PowerPC release; this major milestone in Mac history would go under the name of System 7.1.2, a name more appropriate to a minor maintenance release. It looked so identical to what had come before that its own creators couldn’t spot the difference; they wound up lighting up a single extra pixel in the PowerPC version just so they could know which was which.

Their guiding rule of an absolutely seamless transition applied in spades to the 68000 emulation layer, duly ported from the 88000 to the PowerPC. An ordinary user should never have to think about — should not even have to know about — the emulation that was happening beneath the surface. Another watershed moment came in June of 1993, when the team brought a PowerPC prototype machine to the MacHack, a coding conference and competition. Without telling any of the attendees what was inside the machine, the team let them use it to demonstrate their boundary-pushing programs. The emulation layer performed beyond their most hopeful prognostications. It looked like the Mac’s new lease on life was all but a done deal from the engineering side of things.

But alas, the bonhomie exhibited by the partner companies’ engineers and programmers down in the trenches wasn’t so marked in their executive suites after the deal was signed. The very vagueness of so many aspects of the agreement had papered over what were in reality hugely different visions of the future. IBM, a company not usually given to revolutionary rhetoric, had taken at face value the high-flown words spoken at the announcement. They truly believed that the agreement would mark a new era for personal computing in general, with a new, better hardware architecture in the form of PowerPC and an ultra-modern operating system to run on it in the form of Taligent’s work. Meanwhile it was becoming increasingly clear that Apple’s management, who claimed to be changing the world five times before breakfast on most days, had in reality seen Taligent largely as a hedge in case their people should prove unable to create a PowerPC Macintosh that looked like a Mac, felt like a Mac, and ran vintage Mac software. As Project Piltdown Man’s work proceeded apace, Apple grew less and less enamored with those other, open-architecture ideas IBM was pushing. The Taligent people didn’t help their cause by falling headfirst into a pit of airy computer-science abstractions and staying mired there for years, all while Project Piltdown Man just kept plugging away, getting things done.

The first two and a half years of the 1990s were marred by a mild but stubborn recession in the United States, during which the PC industry had a particularly hard time of it. After the summer of 1992, however, the economy picked up steam and consumer computing eased into what would prove its longest and most sustained boom of all time, borne along on a wave of hype about CD-ROM and multimedia, along with the simple fact that personal computers in general had finally evolved to a place where they could do useful things for ordinary people in a reasonably painless way. (A bit later in the boom, of course, the World Wide Web would come along to provide the greatest impetus of all.)

And yet the position of both Apple and IBM in the PC marketplace continued to get steadily worse while the rest of their industry soared. At least 90 percent of the computers that were now being sold in such impressive numbers ran Microsoft Windows, leaving OS/2, MacOS, and a few other oddballs to divide the iconoclasts, the hackers, and the non-conformists of the world among themselves. While IBM continued to flog OS/2, more out of stubbornness than hope, Apple tried a little bit of everything to stop the slide in market share and remain relevant. Still not entirely certain whether their future lay with open architectures or their own closed, proprietary one, they started porting selected software to Windows, including most notably QuickTime, their much-admired tool for encoding and playing video. They even shipped a Mac model that could also run MS-DOS and Windows, thanks to an 80486 housed in its case alongside its 68040. And they entered into a partnership with the networking giant Novell to port MacOS itself to Intel hardware — a partnership that, like many Apple initiatives of these years, petered out without ultimately producing much of anything. Perhaps most tellingly of all, this became the only period in Apple’s history when the company felt compelled to compete solely on price. They started selling Macs in department stores for the first time, where a stream of very un-Apple-like discounts and rebates greeted prospective buyers.

While Apple thus toddled along without making much headway, IBM began to annihilate all previous conceptions of how much money a single company could possibly lose, posting oceans of red that looked more like the numbers found in macroeconomic research papers than entries in an accountant’s books. The PC marketplace was in a way one of their smaller problems. Their mainframe business, their real bread and butter since the 1950s, was cratering as customers fled to the smaller, cheaper computers that could often now do the jobs of those hulking giants just as well. In 1991, when IBM first turned the corner into loss, they did so in disconcertingly convincing fashion: they lost $2.82 billion that year. And that was only the beginning. Losses totaled $4.96 billion in 1992, followed by $8.1 billion in 1993. IBM lost more money during those three years alone than any other company in the history of the world to that point; their losses exceeded the gross domestic product of Ecuador.

The employees at both Apple and IBM paid the toll for the confusions and prevarications of these years: both companies endured rounds of major layoffs. The layoffs at IBM were the very first in the long history of the company. Big Blue had for decades fostered a culture of employment for life; their motto had always been, “If you do your job, you will always have your job.” This, it was now patently obvious, was no longer the case.

The bloodletting reached both companies’ executive suites within a few months of one another. On April 1, 1993, John Akers, the CEO of IBM, was ousted after a seven-year tenure which one business writer called “the worst record of any chief executive in the history of IBM.” Three months later, following a terrible quarterly earnings report and a drop in share price of 58 percent in the span of six months, Michael Spindler replaced John Sculley as the CEO of Apple.

These, then, were the storm clouds under which the PowerPC architecture became a physical reality.

The first PowerPC computers to be given a public display bore an IBM rather than an Apple logo on their cases. They arrived at the Comdex trade show in November of 1993, running a port of OS/2. IBM also promised a port of AIX — their version of the Unix operating system — while Sun Microsystems announced plans to port their Unix-based Solaris operating system and, most surprisingly of all, Microsoft talked about porting over Windows NT, the more advanced, server-oriented version of their world-conquering operating environment. But, noted the journalists present, “it remains unclear whether users will be able to run Macintosh applications on IBM’s PowerPC” — a fine example of the confusing messaging the two alleged allies constantly trailed in their wake. Further, there was no word at all about the status of the Taligent operating system that was supposed to become the real PowerPC standard.

Meanwhile over at Apple, Project Piltdown Man was becoming that rarest of unicorns in tech circles: a major software-engineering project that is actually completed on schedule. The release of the first PowerPC Macs was pushed back a bit, but only to allow the factories time to build up enough inventory to meet what everyone hoped would be serious consumer demand. Thus the “Power Macs” made their public bow on March 14, 1994, at New York City’s Lincoln Center, in three different configurations clocked at speeds between 60 and 80 MHz. Unlike IBM’s machines, which were shown six months before they shipped, the Power Macs were available for anyone to buy the very next day.

The initial trio of Power Macs.

This speed test, published in MacWorld magazine, shows how all three of the Power Mac machines dramatically outperform top-of-the-line Pentium machines when running native code.

They were greeted with enormous excitement and enthusiasm by the Mac faithful, who had been waiting anxiously for a machine that could go head-to-head with computers built around Intel’s new Pentium chip, the successor to the 80486. This the Power Macs could certainly do; by some benchmarks at least, the PowerPC doubled the overall throughput of a Pentium. World domination must surely be just around the corner, right?

Predictably enough, the non-Mac-centric technology press greeted the machines’ arrival more skeptically than the hardcore Mac-heads. “I think Apple will sell [a] million units, but it’s all going to be to existing Mac users,” said one market researcher. “DOS and Windows running on Intel platforms is still going to be 85 percent of the market. [The Power Mac] doesn’t give users enough of a reason to change.” Another noted that “the Mac users that I know are not interested in using Windows, and the Windows users are not interested in using the Mac. There has to be a compelling reason [to switch].”

In the end, these more guarded predictions proved the most accurate. Apple did indeed sell an impressive spurt of Power Macs in the months that followed, but almost entirely to the faithful. One might almost say that they became a victim of Project Piltdown Man’s success: the Power Mac really did seem exactly like any other Macintosh, except that it ran faster. And even this fact could be obscured when running legacy applications under emulation, as most people were doing in the early months: despite Project Piltdown Man’s heroic efforts, applications like Excel, Word, and Photoshop actually ran slightly slower on a Power Mac than on a top-of-the-line 68040-based machine. So, while the transition to PowerPC allowed the Macintosh to persist as a viable computing platform, it ultimately did nothing to improve upon its small market share. And because the PowerPC MacOS was such a direct and literal port, it still retained all of the shortcomings of MacOS in general. It remained a pretty interface stretched over some almost laughably archaic plumbing. The new generation of Mac hardware wouldn’t receive an operating system truly, comprehensively worthy of it until OS X arrived seven long years later.

Still, these harsh realities shouldn’t be allowed to detract from how deftly Apple — and particularly the unsung coders of Project Piltdown Man — executed the transition. No one before had ever picked up a consumer-computing platform bodily and moved it to an entirely new hardware architecture, much less done it so transparently that many or most users never really had to think about what was happening at all. (There would be only one comparable example in computing’s future. And, incredibly, the Mac would once again be the platform in question: in 2006, Apple would move from the fading PowerPC line to Intel’s chips — if you can’t beat ’em, join ’em, right? — relying once again on a cleverly coded software emulator to see them through the period of transition. The Macintosh, it seems, has more lives than Lazarus.)

Although the briefly vaunted AIM alliance did manage to give the Macintosh a new lease on life, it succeeded in very little else. The PowerPC architecture, which had cost the alliance more than $1 billion to develop, went nowhere in its non-Mac incarnations. IBM’s own machines sold in such tiny numbers that the question of whether Apple would ever allow them to run MacOS was all but rendered moot. (For the record, though: they never did.) Sun Solaris and Microsoft Windows NT did come out in PowerPC versions, but their sales couldn’t justify their existence, and within a year or two they went away again. The bold dream of creating a new reference platform for general-purpose computing to rival Wintel never got off the ground, as it became painfully clear that said dream had been taken more to heart by IBM than by Apple. Only after the millennium would the PowerPC architecture find a measure of mass-market success outside the Mac, when it was adopted by Nintendo, Microsoft, and Sony for use in videogame consoles. In this form, then, it finally paid off for IBM; far more PowerPC-powered consoles than even Macs were sold over the lifetime of the architecture. PowerPC also eventually saw use in other specialized applications, such as satellites and planetary rovers employed by NASA.

Success, then, is always relative. But not so the complete lack thereof, as Kaleida and Taligent proved. Kaleida burned through $200 million before finally shipping its ScriptX multimedia-presentation engine years after other products, most notably Macromedia’s Director, had already sewn up that space; it was disbanded and harvested for scraps by Apple in November of 1995. Taligent burned through a staggering $400 million over the same period of time, producing only some tepid programming frameworks in lieu of the revolutionary operating system that had been promised, before being absorbed back into IBM.

There is one final fascinating footnote to this story of a Deal of the Century that turned out to be little more than a strange anecdote in computing history. In the summer of 1994, IBM, having by now stopped the worst of the bleeding and settled into their new life as a smaller, far less dominant company, offered to buy Apple outright for a premium of $5 over their current share price. In IBM’s view, the synergies made sense: the Power Macs were selling extremely well, which was more than could be said for IBM’s PowerPC models. Why not go all in?

Ironically, it was those same healthy sales numbers that scuppered the deal in the end. If the offer had come a year earlier, when a money-losing Apple was just firing John Sculley, they surely would have jumped at it. But now Apple was feeling their oats again, and by no means entirely without reason; sales were up more than 20 percent over the previous year, and the company was once more comfortably in the black. So, they told IBM thanks, but no thanks. The same renewed taste of success also caused them to reject serious inquiries from Philips, Sun Microsystems, and Oracle. Word had it that new CEO Michael Spindler was convinced not only that the Power Mac had saved Apple, but that it had fundamentally altered their position in the marketplace.

The following year revealed how misguided that thinking really was; the Power Mac had fixed none of Apple’s fundamental problems. That year it was Microsoft who cemented their world domination instead, with the release of Windows 95, while Apple grappled with the reality that almost all of those Power Mac sales of the previous year had been to existing members of the Macintosh family, not to the new customers they so desperately needed to attract. What happened now that everyone in the family had dutifully upgraded? The answer to that question wasn’t pretty: Apple plunged off a financial cliff as precipitous in its own way as the one which had nearly destroyed IBM a few years earlier. Now, nobody was interested in acquiring them anymore. The pundits smelled the stink of death; it’s difficult to find an article on Apple written between 1995 and 1998 which doesn’t include the adjective “beleaguered.” Why buy now when you can sift through the scraps at the bankruptcy auction in just a little while?

Apple didn’t wind up dying, of course. Instead a series of improbable events, beginning with the return of prodigal-son Steve Jobs in 1997, turned them into the richest single company in the world — yes, richer even than Microsoft. These are stories for other articles. But for now, it’s perhaps worth pausing for a moment to think about an alternate timeline where the Macintosh became an IBM product, and the Deal of the Century that got that ball rolling thus came much closer to living up to its name. Bizarre, you say? Perhaps. But no more bizarre than what really happened.

(Sources: the books Insanely Great: The Life and Times of Macintosh by Steven Levy, Apple Confidential 2.0: The Definitive History of the World’s Most Colorful Company by Owen W. Linzmayer, Infinite Loop: How the World’s Most Insanely Great Computer Company Went Insane by Michael S. Malone, Big Blues: The Unmaking of IBM by Paul Carroll, and The PowerPC Macintosh Book by Stephan Somogyi; InfoWorld of September 24 1990, October 15 1990, December 3 1990, April 8 1991, May 13 1991, May 27 1991, July 1 1991, July 8 1991, July 15 1991, July 22 1991, August 5 1991, August 19 1991, September 23 1991, September 30 1991, October 7 1991, October 21 1991, November 4 1991, December 30 1991, January 13 1992, January 20 1992, February 3 1992, March 9 1992, March 16 1992, March 23 1992, April 27 1992, May 11 1992, May 18 1992, June 15 1992, June 29 1992, July 27 1992, August 3 1992, August 10 1992, August 17 1992, September 7 1992, September 21 1992, October 5 1992, October 12 1992, October 19 1992, December 14 1992, December 21 1992, December 28 1992, January 11 1993, February 1 1993, February 22 1993, March 8 1993, March 15 1993, April 5 1993, April 12 1993, May 17 1993, May 24 1993, May 31 1993, June 21 1993, June 28 1993, July 5 1993, July 12 1993, July 19 1993, August 2 1993, August 9 1993, August 30 1993, September 6 1993, September 27 1993, October 4 1993, October 11 1993, October 18 1993, November 1 1993, November 15 1993, November 22 1993, December 6 1993, December 13 1993, December 20 1993, January 10 1994, January 31 1994, March 7 1994, March 14 1994, March 28 1994, April 25 1994, May 2 1994, May 16 1994, June 6 1994, June 27 1994; MacWorld of September 1992, February 1993, July 1993, September 1993, October 1993, November 1993, February 1994, and May 1994; Byte of November 1984. Online sources include IBM’s own corporate-history timeline and a vintage IBM lecture on the PowerPC architecture.)