This would be fair, considering his company was about to announce a sudden 90% plunge in profits. So it's understandable that, when I asked him about Nvidia's recent coup, getting Apple to swap out an Intel product for the GeForce 9400M chipset, he said with more than a hint of disdain, "You're obviously a Mac user." Here's a guy who is used to making judgments, and to making them quickly.


But when I told him I also built my desktop with an Intel Core 2 Duo Wolfdale chip, he reversed his decision. Laughing, he said, "You're alright for a kid that wears black Keds." This wasn't his first reference to my sneakers—they were Adidas, actually—and it wasn't his last either.

At 69, he is definitely one of the oldest guys running a powerhouse innovation company like Intel, and when he's sitting there in front of you, he conveys an attitude that he's seen it all. He hung up his lab coat for a tailored suit long ago, but talking to him, you can still tell that his degree from Stanford isn't some MBA, but a PhD in materials science. Nerdspeak flows easily out of his mouth, and he closes his eyes while calmly making a point, like a college professor. At the same time, you get a sense of the agitation within. After all, he'll be the first to tell you that in business, he still lives by the mantra of his Intel CEO predecessor Andy Grove: "Only the paranoid survive."

In the end, I really liked the guy. He's tough but fair, like an Old Testament king. Here are excerpts from our conversation, chip guru to chip fanboy, about vanquishing your competition, the limitations of clock speed, the continuing rage of the multi-core race and how to stay paranoid in your golden years.

What's the endgame of the multi-core arms race? Is there one?
If everything works well, they continue to get Moore's Law from a compute power standpoint. [But] you need software solutions to go hand-in-hand with the hardware solutions...There's a whole software paradigm shift that has to happen.

How involved is Intel in the software side of making that happen?
Probably the best measure is that if you look at the people we hire each year, we still hire more software engineers than hardware engineers.

Where do you see Larrabee, Intel's in-development, dedicated high-end GPU, taking you?
The fundamental issue is that performance has to come from something other than gigahertz... We've gotten to the limit we can, so you've got to do something else, which is multiple cores, and then it's either just partitioning solutions between cores of the same type or partitioning solutions between heterogeneous cores on the same chip.


You see, everybody's kind of looking at the same thing, which is, 'How do I mix and match a CPU- and a GPU-type core, or six of these and two of those, and how do you have the software solution to go hand-in-hand?'

So what do you think of the competition coming from Nvidia lately?
At least someone is making very verbal comments about the competition anyway.

Do you see Nvidia as more of a competitor than AMD? How do you see the competitive landscape now?
We still operate under the Andy Grove scenario that only the paranoid survive, so we tend to be paranoid about where competition comes from any direction. If you look at the Intel history, our major competitor over the years has been everybody from IBM to NEC to Sun to AMD to you-name-it. So the competition continually changes, just as the flavor of technology changes.

As visualization becomes more important—and visualization is key to what you and consumers want—then is it the CPU that's important, or the GPU, or what combination of the two and how do you get the best visualization? The competitive landscape changes daily. Nvidia is obviously more of a competitor today than they were five years ago. AMD is still a competitor.

Would you say the same competitive philosophy applies to the mobile space?
Two different areas, obviously. The netbook is really kind of a slimmed down laptop. The Atom processor takes us in that space nicely from a power/performance standpoint. Atom allows you to go down farther in this kind of fuzzy area in between netbooks, MIDs [mobile internet devices] and smartphones. The question there is, 'What does the consumer want?'

The issue is, 'What is the ultimate device in that space?' ...Is it gonna be an extension of the internet coming down, or is it gonna be an upgrowth of the cellphone coming up?

Are you planning on playing more directly in phones, then?
Those MIDs look more and more like smartphones to me...All they need to do is shrink down a little bit and they're a damn good smartphone. They have the capability of being a full-internet-functionality smartphone as opposed to an ARM-based one—maybe it looks like the internet you're used to, or maybe it doesn't.

Intel and Microsoft "won" the PC Revolution. There's a computer on basically every office desk in the country. What's beyond that? Mobile, developing countries?
Well, it's a combination. There's an overriding trend toward mobility for convenience. We can shrink the capability down to put it in a mobile form factor, and the cost is not that much more than a desktop, point one. Point two, if you go to the emerging economies where you think that mobile might be lacking, really the only way to get good broadband connectivity in most of the emerging markets is not with wired connectivity or fixed point connectivity, it's gonna be broadband wireless and that facilitates mobile in emerging markets as well.

So where does that take Intel going in the next five years?
It's pushing things like broadband wireless, WiMax...It's broadband wireless capability, that's the connectivity part. It's mobility with more compute power and lower energy consumption to facilitate battery life and all that good stuff. And it's better graphics. That's kind of Larrabee and that whole push.

AMD beat you to several CPU innovations, such as on-die memory controllers and a focus on performance per watt, but you've since caught up and passed them on each one. How do you plan to stay ahead?
The basic way you stay ahead is that you have to set yourself with aggressive expectations. There's nothing in life that comes free. You're successful when you set your expectations high enough to beat the competition. And I think the best thing that we have going for us is...the Moore's Law deal.

As long as we basically don't lose sight of that, and continue to push all of our roadmaps, all of our product plans and such to follow along Gordon's law, then we have the opportunity to stay ahead. That doubling every 18 months or so is the sort of expectation level you have to set for yourself to be successful.
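That "doubling every 18 months or so" compounds quickly. As a minimal sketch (our own illustration, not Intel's figures), here's what that expectation level implies over a planning horizon:

```python
def moore_projection(start_count: int, years: float, doubling_months: float = 18.0) -> int:
    """Project a capability metric (e.g. transistor count) forward,
    assuming it doubles once every `doubling_months` months."""
    doublings = (years * 12.0) / doubling_months
    return round(start_count * 2 ** doublings)

# One doubling period (18 months) exactly doubles the starting figure.
print(moore_projection(1_000_000, 1.5))  # 2000000

# Over five years at an 18-month cadence that's 2^(60/18), roughly a 10x gain,
# which is the kind of aggressive target Barrett says you have to set.
print(moore_projection(1_000_000, 5.0))
```

The numbers here are placeholders; the point is how an 18-month doubling cadence turns into order-of-magnitude expectations within a single five-year roadmap.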

Would you consider that the guiding philosophy, the banner on the wall?
That's the roadmap! That is the roadmap we have. If you dissect it a bit, you tend to find that the older you get, the more conservative you typically get, and you kinda start to worry about Moore's Law not happening. But if you bring in the bright young talent and say, 'Hey, bright young talent, we old guys made Moore's Law happen for 40 years, don't screw it up,' they're smart enough to figure it out.