Why Obama and McCain Don't Get Tech

If you have any involvement in high technology, the one cringe-worthy moment in this season's presidential debates was this:

"If we create a new energy economy, we can create 5 million new jobs, easily, here in the United States. It can be an engine that drives us into the future the same way the computer was the engine for economic growth over the last couple of decades.

"And we can do it, but we're going to have to make an investment. The same way the computer was originally invented by a bunch of government scientists who were trying to figure out, for defense purposes, how to communicate, we've got to understand that this is a national security issue, as well."

That was from Sen. Barack Obama at last week's presidential debate, describing the role that government can play in driving American business via a more planned economy. But even if one agrees with the first paragraph – and I'm not sure I do, as those new jobs will likely come at the cost of at least as many traditional energy industry jobs, whereas the information processing industry was basically brand new – there are all sorts of things wrong with that second paragraph.

If Sen. John McCain had made that statement, I probably wouldn't have been surprised. Even though we're now past the original canard that the senator didn't even know how to use e-mail – and I hope the people involved in that attack are suitably ashamed for having insulted a disabled war veteran – one still assumes that a 72-year-old man is probably not up on his electronics industry trivia … though that doesn't let him off the hook, as I'll soon explain.

However, Obama is a child of the digital age. He was born after the invention not only of the transistor, but the computer chip. He was ten years old when the microprocessor was invented, and was still in high school when Apple introduced the Apple II personal computer. He, at least, is supposed to know this stuff already.

Now, it's pretty obvious what was wrong in that second paragraph: What he meant to say was not "computer" but "the Internet" – and I'm sure his handlers would dismiss that as a slip of the tongue. But as Portfolio.com has already noted, "It's not exactly a major gaffe, but it's a mistake that no one who was decently familiar with the technology industry would be likely to make."

That's right. What it sounds like is a guy who memorized this statement, then screwed it up when he finally had to deliver it. And that's pretty dispiriting, because it means that neither candidate – no matter how sophisticated his online campaign apparatus, no matter how many tech advisers he has on his staff, and no matter how often he references eBay or Google – really has a clue about America's largest manufacturing sector, its greatest source of new job creation, or the dynamo of our economic growth.

How is either man supposed to lead us into the second decade of the 21st century, when he doesn't understand the single most defining cultural and economic force of the last 30 years?

The usual reply is that electronics is pretty complicated stuff, and you can't expect an elected official to understand the nuances of semiconductor architecture, Uniform Resource Locator syntax, and Boolean algebra. True enough. But it has also been more than 60 years since the invention of the computer and the transistor, and almost 40 years since the creation of the ARPANET, the precursor of the Internet, so is it too much to ask for these folks to at least have a rudimentary understanding of these technologies?

Worse yet, one can make a very strong case that Moore's Law of semiconductors and Metcalfe's Law of networks have done a better job of explaining the course of the last half-century than any other metric – census data, demographics, life expectancies, purchasing power, church affiliations, etc. – beloved by sociologists and futurists.
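To see why those two laws carry so much explanatory weight, here is a back-of-the-envelope sketch in Python. The starting figures (the Intel 4004's roughly 2,300 transistors, a two-year doubling period) are my illustrative assumptions, not numbers from the column:

```python
# Illustrative sketch of the two laws named above; the constants
# (2,300 transistors, two-year doubling) are assumptions for demonstration.

def moores_law(transistors_start: int, years: int, doubling_years: float = 2.0) -> int:
    """Moore's Law: transistor counts double roughly every couple of years."""
    return int(transistors_start * 2 ** (years / doubling_years))

def metcalfes_law(users: int) -> int:
    """Metcalfe's Law: a network's value grows with the square of its users."""
    return users * users

# Starting from ~2,300 transistors in 1971 and doubling every two years,
# forty years of compounding predicts chips with billions of transistors.
print(moores_law(2300, 40))

# Doubling a network's users quadruples its value under Metcalfe's Law.
print(metcalfes_law(10) // metcalfes_law(5))
```

Exponential and quadratic growth curves like these dwarf the roughly linear trends in census data or purchasing power, which is why the two laws predict so much of the period.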

Yet, despite the fact that Moore's Law, to take the most extreme example, was first formulated in 1965, and probably 30 million Americans who work in tech know it by heart, I'll wager that no more than a handful of our 535 Representatives and Senators know what it is, much less can explain its implications. Given the central role that electronics and high tech play in the economic health of the nation they represent, and the crucial part they play in sparking cultural changes – music downloads, the Web, digital television – that ripple across society and lead to the revision of existing law, shouldn't they know this stuff?

As for tech being complicated: sure it is. But to obtain a basic understanding of how it all works (sand to silicon to systems to software to networks, the on/off switch to silicon gates, hardware to software to firmware to applications), and the larger forces (like Metcalfe's Law) takes, oh, about … a half hour.

I know this because I've done it, with schoolchildren, and I'm not that great a teacher. You might think that somewhere in the course of their often decades-long careers that these legislators might have found room in their busy schedules of calling on contributors and attending embassy parties to devote 30 minutes to educating themselves on such an important topic.

But even if they did set aside the time, there's no guarantee they'd actually listen. Exactly once in my 30-year career as a journalist was I asked to explain the digital world to a legislator. More than a decade ago, T.J. Rodgers of Cypress Semiconductor (and these days, SunPower) asked me to come in and give a well-known U.S. senator a quick tutorial on tech. I spent several days paring my presentation down to just 20 minutes.

Fifteen minutes into my presentation, the senator was already looking at his watch. At the 17-minute mark, having learned nothing, the senator excused himself to do what he had really come to Silicon Valley for: hit up T.J. for money.

All of this may seem like just another amusing example of how out-of-touch our elected officials are. But it has dangerous implications. Over the last two decades we have seen Congress (and the administration), out of ignorance, reaction or just plain grandstanding, do almost everything it can to undermine high technology, and especially entrepreneurship, from crushing regulations (which is why there are no tech IPOs anymore) and anti-trust harassment to bizarre accounting requirements on stock options to criminalizing success to falling for the latest scientific scare.

Obama's answer was a classic example of this. If he's talking about the invention of the computer, then you'd be hard-pressed to name any government scientists, at least in the U.S., who were involved in the process. The ENIAC team was a group of academics at Penn; the Mark I was a similar group at Harvard. The same could be said of Alan Turing in England and Konrad Zuse in Germany. Most of them took government money, for sure, but the impetus for that support was World War II – and I doubt either candidate is willing to commit to going to war just to drive technological innovation.

On the other hand, you can make the case the Internet was started by a quasi-government agency, ARPA (later DARPA), as a way for the DoD and contractors (especially universities) to quickly and safely communicate. But it was the World Wide Web, developed by folks at CERN, and released to the general public in 1993, that set off the Internet revolution.

As it happens, I'm old enough and happened to be in the right place as a teenager to have used both government-designed mainframe computers and the ARPANET … and trust me, if these two technologies had been left in the hands of government scientists, we would all still be using electric typewriters and rotary phones, and doing our research down at the local library. I'll even go so far as to suggest that Silicon Valley was really born when HP stopped accepting specialized government contracts and chip companies like Intel took off in pursuit of the consumer and commercial markets.

Government is best at large-scale basic research – especially when it farms it out to private enterprise – and, sometimes, at kick-starting a new commercial industry. But after that, it is usually just an impediment at best and a crusher of innovation at worst. Thus, to use the example of the computer, or even the Internet, as a justification for greater government interference in the marketplace is not only wrong, but dangerous as well.

Meanwhile, once again we have two presidential candidates (the only exception I can think of in the last 40 years was Al Gore) who don't seem to have a clue about any of this. Given that one of them is going to be establishing this country's economic priorities for the next four years, shouldn't they learn – and quickly?

So, I'll make this offer to both campaigns: Give me (or anybody else who writes about tech) just 30 minutes with your candidate, any time and anywhere, and I'll make sure, for the good of the country, that they never make this kind of mistake again.

TAD'S TAB: Continuing last week's visual theme, consider the art of Larry Roibal. Every few days, he posts portrait drawings on the Web. What's interesting about them is that each drawing is done on a newspaper clipping about the subject of the portrait. His choices are diverse and interesting – and always timely. Check them out at http://www.roibal.net/blog/.

This is the opinion of the columnist and in no way reflects the opinion of ABC News.

Michael S. Malone is one of the nation's best-known technology writers. He has covered Silicon Valley and high-tech for more than 25 years, beginning with the San Jose Mercury News as the nation's first daily high-tech reporter. His articles and editorials have appeared in such publications as The Wall Street Journal, the Economist and Fortune, and for two years he was a columnist for The New York Times. He was editor of Forbes ASAP, the world's largest-circulation business-tech magazine, at the height of the dot-com boom. Malone is the author or co-author of a dozen books, notably the best-selling "Virtual Corporation." Malone has also hosted three public television interview series, and most recently co-produced the celebrated PBS miniseries on social entrepreneurs, "The New Heroes." He has been the ABCNews.com "Silicon Insider" columnist since 2000.