Is Moore's Law Dead? Does It Matter?

Does US national security depend on the continuance of Moore's Law, or is innovation that addresses the realities of the technology business opportunity more important to progress and wealth creation?

Most of the time when I hear an end-of-the-world story, I just roll my eyes and move on. However, I heard a message at DAC this year that still has me thinking.

People have been talking about the end of Moore's Law for some time, but those discussions became a lot more urgent and heated at DAC in June. Many reasons have been postulated for why Moore's Law might end, including some insurmountable physical limitation -- perhaps the power problem that already prevents the whole chip from being powered up at the same time. More recently the matter of cost has been raised: designing a chip at the next node will become so expensive that nobody will be able to afford it. The concern is that, with fewer design starts using the latest technologies and lower chip volumes, manufacturers would then not invest in wafer fabs for the next technology.

I am not sure I fully get behind any of these arguments, but if we do stop making these advances, what really happens? Is there no room for innovation if monolithically integrated devices cannot get more complex? I am sure that some companies will be affected by this "crisis," since their commercial lead rests on being ahead of the design and fabrication curve rather than on having the best design. Such an end may well transform our industry, but then we cannot expect the ride we have been on for 50 years to continue without some kind of change.

Robert Colwell of DARPA said at DAC that the end of Moore's Law would be a threat to US national security. The assertion is that if the US does not stay ahead of the rest of the world in computing power and associated technologies, then the rest of the world will become as capable as the US: able to do things without the US government finding out, and able to find out what the US is planning to do.

Similar assertions can be, and are, made about weapons, of course. My first reaction is a political one: why can we not spend more time getting along with people, so that this is just not an issue we need to care about? OK, so I am idealistic, and I understand that some people may not think this is realistic or pragmatic.

Does innovation die when we cannot create more complex devices? I hope not. I hope that we would find ways to use our knowledge and capabilities more effectively, exploring different architectures where today we simply accept the existing ones because that is easier and faster. What about biological computing, or computers that operate more like the brain, rather than just accepting that binary arithmetic is the way to go?

So, what happens if the whole world has equal access to technology? Does stability depend on one country having a bigger stick than everyone else?

Performance progress has traditionally been led by advances in semiconductor process, but if we are approaching a plateau (and my optimistic view is that progress will become scientifically and financially challenging rather than grinding to a halt), then we still have algorithms and architectures in which to innovate. The challenges to advancing the semiconductor process also challenge incumbent RTL-based design, since larger system-level design perspectives will be required. Yes, that horse has been beaten for a long while, but there is growth (starting from an admittedly small base) as more and more people "get it" and turn to ESL.

For the others to catch up to the US in technology, the US would have to stand essentially still for an extended period of time, and you can be fairly sure that will not happen. In fact, it is more likely the others will try to stay a few steps behind the US leading edge, which they fear is fraught with unknown risks they cannot learn about fast enough.

I'd be more worried about something all of mankind gets stuck on due to fundamental physics -- limits related to quantum effects or entropy, say.

Bert: well, I respectfully disagree with your conclusion. Moore's Law leads to faster chips, but the military tends to use the most advanced chips -- which are more expensive even under Moore's Law. That favors the richer country, which economists say will be China in the not-too-distant future. If Moore's Law ends, and chips stop getting faster at that rate, that advantage is narrowed or -- in the extreme -- even eliminated.

Of course, the wealthier country could still benefit from other advanced technologies that would speed up processing outside the chip. And all major countries already have enough firepower to blow up the planet several times over, so this is probably academic.

On the civil liberties front, the demise of Moore's Law could limit the mass processing of big data to the point where there is no instant analysis of individuals based on "total information awareness" programs. That would cut into the expansion of intelligence programs like the NSA's.

I agree with your assessment of a quantum computer. However, technology designed to implement a quantum computer could easily transfer to classical computers. In particular, the silicon photonics that Intel, IBM, and several others are working on could facilitate a paradigm shift such as the one suggested by Kurzweil.

What if an optical chip-to-chip data bus were as fast as the on-chip data bus? Just as we today connect a component in one area of the die to a component in another area of the same die, a chip-speed optical bus would allow components to be connected across dies. This would be a paradigm shift. Instead of distributed computing using clusters of individual, independent computers, we could treat multiple dies as a single virtual die -- in other words, a very large SoC. In addition, this virtual SoC would be scalable to fit whatever power envelope was available. There are many possibilities, but this one seems quite feasible in the near future.

"So, what happens if the whole world has equal access to technology? Does stability depend on one country having a bigger stick than everyone else?"

The truth is precisely the opposite: peace takes place when there is an equilibrium of power, and that happens when there is equal access to technology, not when one country has a bigger stick than everyone else.

The American expert is right to seek a US advantage in high technology, but that is not necessarily in the interest of world peace :-) Others have to seek the same advantage, and at some stage they will realize that their interests lie in collaborating and cooperating rather than constantly seeking an advantage over the others. It's a process, and we are nowhere near maturity....

Could not agree more. I say bring it on! Our over-reliance on transistor-level improvements over the past 30 years or so made us LAZY, but what's 30 years in the history of human progress? Nothing. We ought to look at other levels, including:

- Algorithmic: for many decades now, our way of thinking about problem solving has been biased towards von Neumann implementation platforms built from semiconductor chips. Let's look beyond that and devise new algorithms for a wider range of platforms and paradigms.

- Architectural: Hardware is not just about transistors; it's about computing and communication designs and architectures. I do not think we have explored the realm of possibilities here adequately; there is still a lot to be done.

- Physical: Binary electronics using semiconductors is one of many possibilities for computing, storage, and communication. Here again, we have only scratched the surface.

To solve our computing, storage and communications needs, we must train a new breed of scientists and engineers. Out with the modularization, fragmentation and specialization of training and teaching, and in with holistic education.

I believe that Moore's Law as *conventionally* stated will stop (transistors shrinking and the number of transistors doubling every N years).
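For concreteness, here is a back-of-the-envelope sketch of what that conventional statement implies; the one-billion starting count and the two-year doubling cadence are illustrative assumptions on my part, not figures from this discussion:

```python
# Back-of-the-envelope projection of the "doubling every N years" form of
# Moore's Law. The starting count and cadence are illustrative assumptions.

def transistor_count(start_count: float, years_elapsed: float,
                     doubling_period_years: float = 2.0) -> float:
    """Project a transistor count forward under a fixed doubling cadence."""
    return start_count * 2 ** (years_elapsed / doubling_period_years)

# Example: a hypothetical 1-billion-transistor chip, projected 10 years out.
if __name__ == "__main__":
    today = 1e9
    for years in (0, 2, 4, 6, 8, 10):
        print(f"year +{years:2d}: {transistor_count(today, years):.2e} transistors")
```

Under those assumptions the count grows 32x in a decade, which gives a sense of how much architectural or algorithmic innovation would have to deliver once scaling alone stops providing it.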

However, I feel that there will be innovations that will help continue improvements in performance, power consumption, and functionality (recall that these are the *end* objectives we are really interested in; scaling to smaller dimensions has just been the *means* of achieving that *end*).

And I feel that these innovations will enable using the same, or maybe slightly *longer*, channel lengths (say, 45 nm) than the nodes we are currently headed towards (longer channel lengths imply better yields), and yet deliver better performance and lower power. These innovations may take the form of using a different material than silicon, etc.

Before you can declare Moore's Law dead, consider what it really is: a prediction. Then there are laws. Take Ohm's law. It seems to be irrefutable. V=IR. It seems to be universal, even in space. Then there are "laws" that governments pass. They can be revoked, just as Moore's "law" can. Laws passed by governments are really more rules than laws. If it can be revoked, it's not a law in the first place.

Too many contradictions in these arguments, Tom. If Moore's Law continues to hold, it means that very soon everyone can afford these faster chips, deep pockets or no. If Moore's Law stops or slows down, the faster technology will become expensive, available only to those with deep pockets. That seems to be what's happening already.

The effect on "national security" has to be that the richer a country is, the better off it is militarily *without* Moore's Law!!

Besides, I think that the term "national security threat" is being thrown about too loosely, used to justify way too many questionable things lately.