David Yoffie, professor and strategy expert at the Harvard Business School, talks about Apple Computer with equal parts of detachment, frustration, admiration, and amusement. And he knows whereof he speaks: He was a close advisor to John Sculley in the early 1990s and was well connected in the high-tech community in general. Today, he sits on Intel’s board, so he knew sooner than most of the world that in the spring of 2005, Steve Jobs would startle the Apple community once again by embracing Intel chips for the next generation of Apple computers.

In fact, Yoffie himself had tried to broker exactly the same marriage more than a decade earlier. In his role as wise counselor, he arranged a meeting between Intel’s Andy Grove and John Sculley, hoping that a face-to-face encounter with the legendary Grove would finally force Sculley to come to terms with some harsh realities. “There is no way,” Grove told Sculley, as the grim slides rolled by, one after another, “that IBM can make the microprocessors you need. Only Intel can do that. Look at our range of products versus theirs. Look at our volume versus theirs. It’s not even close.” Sculley was clearly shaken by Grove’s presentation, which included graphic summaries of Intel’s enormous chip volume. He promised Yoffie that he would think hard about what he had just heard.

A week later, Yoffie had dinner with a group of senior Apple managers, who—although they hadn’t been in the room with Sculley and Grove—had since seen a version of Grove’s presentation. “Intel must be selling a lot of microprocessors into toasters,” one of the managers said offhandedly.

Yoffie was puzzled. “What do you mean?”

“Well,” responded the manager, “God forbid that they’re selling them all into computers! If they were, well, that would be bad news for us!”

But of course all (or almost all) of those microprocessors were going into computers—millions and millions of Wintel computers—and Apple was already en route to its marginal position in the PC industry.

Which leads us to the first of six strategic lessons that Yoffie draws from the long and turbulent saga of Apple Computer:

■ Figure out the true competitive landscape.

This, of course, is easier said than done. But more than most companies, Apple was insulated from reality. What happened when Apple managers and technical experts actually went out and talked to customers? They had a love-fest. See no Wintel, speak no Wintel, hear no Wintel. Somehow, you have to break through the adoration of those closest to you—or those with a stake in the status quo—and figure out what reality looks like. If someone brings Andy Grove in for a chat, take good notes.

■ Don’t bank on being “the best.”

Business history is littered with the corpses of companies that counted on being the absolute best in their business and letting the market come to them. Being the best is not sufficient insurance against being overwhelmed by the second-best, especially if that other player has higher volume and lower prices.

■ Standards will prevail, especially in high tech.

This lesson grows out of the previous one. Crown Cork & Seal, a venerable old manufacturer of bottle-sealing devices, got away with being “only the best” for decades in part because there were no external standards driving that industry. Not true for high tech, where consumers want the benefit of compatibility and interoperability. Standards favor the high-volume players (Wintel) and punish the oddballs (Apple).

■ Timing is everything.

Yoffie points out that Apple actually figured out almost everything that it needed to do, strategically; it just acted on that understanding two or three years too late, time after time after time. In 1985, Bill Gates told Sculley that Apple had to license its software, in a hurry, if it wanted to stay competitive. Ten years later, Michael Spindler took the first baby steps toward permitting cloning. Both Sculley and Spindler understood the appeal and power of the consumer marketplace; neither moved fast enough or effectively enough toward it.

■ Competitive advantages go away over time.

Let’s say your competitive advantage is the world’s easiest-to-use operating system (ETU/OS). That’s great (assuming that it’s not the kind of oddball that lives too far outside the realm of standards, as described above). But let’s also assume that there’s somebody out there— like, say, Microsoft—that’s steadily closing the ETU/OS gap. Now let’s also assume that every year, almost the entire population of relevant consumers is getting more and more comfortable with technological complexity. As Boston-based IT guru Seth Miller points out, even if you can maintain the ETU/OS gap (no easy job, with the barking hounds of Gates on your heels!), that advantage is simply going to become less important over time.

What is the lesson of this lesson? Innovation is critical. We began this chapter with a quote from John Sculley, to the effect that the best way to predict the future is to invent it. Exactly right, but the inventing has to take place in the company’s wheelhouse, both in terms of production and distribution. People are eagerly awaiting the Apple cell phone, but Yoffie, for one, thinks Steve Jobs will shy away from plunging full-bore into the cell phone business, because the distribution side of cell phones is nightmarishly complicated.

A related lesson is that the smart company innovates across the value chain. For years, Apple turned up its nose at the low end of the PC field—and got killed as a result. For about a year after the introduction of the iPod, it looked like Apple might be making the same mistake again: inventing only high-end solutions for Mac users. All of a sudden, though, Apple attacked the other 98 percent of the computer world by bringing out a Windows-compatible iPod. At the same time, it grabbed the low-end MP3 player market (with the flash-memory-based Shuffle) and also introduced moderately priced iPods to cover the middle market segments. Meanwhile, it looked back at the computer business, realized its past mistakes, and brought out the Mac Mini.

■ Success begets success.

Except for a few unshaven contrarians on the fringes of the stock market and the occasional mobster who figures out how to rig a horse race, nobody bets on a loser. When Apple looked like it was sinking beneath the waves in 1995–96, developers stopped developing and consumers stopped consuming. When first the iMac and then the iPod caught fire, people started thinking that just maybe the Little Kingdom would survive, and that buying a Mac might just be a good bet.