Moore’s Law could stay on track with extreme UV progress

After years of delays, new technology could come online by 2015.

Long-awaited improvements in photolithography could pave the way for the continued shrinking and scaling of microprocessors into the second half of this decade and beyond.

Moore's Law, which says that transistor densities double every 18 to 24 months or so, is not some inevitable consequence of physics. Rather, it's an observation of the way the semiconductor industry has evolved: the investment and technological progress that companies like Intel have made result in an approximate doubling of transistor densities on a regular basis.
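
To get a feel for what that cadence implies, here's a quick back-of-the-envelope sketch in Python. The starting density is a round illustrative number, not a measured figure.

    # Project transistor density under a Moore's Law-style doubling cadence.
    # The starting density is illustrative, not a measured figure.
    base_year = 2012
    base_density = 10e6      # transistors per mm^2 (rough, round number)
    doubling_months = 24     # the slower end of the 18-24 month range

    for year in range(base_year, base_year + 11, 2):
        periods = (year - base_year) * 12 / doubling_months
        print(f"{year}: ~{base_density * 2 ** periods / 1e6:.0f}M transistors/mm^2")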

Rather than causing the doubling, physics is currently threatening to put an end to the progress we've seen over the last four decades. Microchips are made with a process called photolithography. The silicon wafer on which the chip is built is coated with a light-sensitive layer ("photoresist"). Light is then shone through a patterned mask onto the wafer, essentially burning away the photoresist in the exposed areas. This exposes some parts of the wafer, leaving others covered. The exposed parts are then etched away, and the remaining photoresist washed off.
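
The cycle is easier to picture with a toy model. The sketch below (a deliberately crude simulation with made-up 0/1 grids, not anything from the article) walks a tiny one-dimensional "wafer" through the expose, etch, and wash steps described above.

    # Toy model of one photolithography cycle on a tiny 1-D "wafer".
    # 1 = material present, 0 = removed; the mask blocks light where it is 1.
    mask = [1, 1, 0, 0, 1, 0, 1, 1]    # opaque regions of the mask
    resist = [1] * len(mask)           # wafer coated in photoresist
    wafer = [1] * len(mask)            # substrate layer underneath

    # Exposure: light passes where the mask is 0, burning away the resist there.
    resist = [r & m for r, m in zip(resist, mask)]

    # Etch: substrate is removed wherever the resist no longer protects it,
    # then the remaining resist is washed off.
    wafer = [w & r for w, r in zip(wafer, resist)]
    print(wafer)    # [1, 1, 0, 0, 1, 0, 1, 1] -- the mask pattern, transferred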

The limiting factor is the size of the pattern that can be created on the photoresist layer. Higher transistor densities require finer mask patterns and shorter light wavelengths. Here's the pressing issue: current photolithography uses ultraviolet light with a 193nm wavelength, but at some point in the near future, probably around the 10nm process, a switch to extreme UV (EUV) with a 13.5nm wavelength will be required.
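
The standard way to relate wavelength to feature size is the Rayleigh criterion, CD = k1 * lambda / NA, where NA is the numerical aperture of the optics and k1 is a process-dependent factor. The values below are typical ballpark figures, not numbers from the article, but they show why 193nm tooling runs out of room around the 10nm node.

    # Minimum printable feature size via the Rayleigh criterion:
    #   CD = k1 * wavelength / NA
    # k1 = 0.25 is roughly the single-exposure floor; NA values are typical.
    def min_feature_nm(wavelength_nm, na, k1=0.25):
        return k1 * wavelength_nm / na

    # 193nm immersion lithography (water immersion pushes NA to about 1.35):
    print(min_feature_nm(193, na=1.35))    # ~36nm per single exposure
    # 13.5nm EUV with mirror optics (NA around 0.33):
    print(min_feature_nm(13.5, na=0.33))   # ~10nm per single exposure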

In the late 1990s and early 2000s, there was confidence within the semiconductor industry that EUV equipment was coming soon. It has failed to materialize, however, due to the technical difficulties that EUV imposes. Optically, EUV is harder to work with. It precludes the use of lenses, as most optical materials strongly absorb EUV light. Instead, only mirrors can be used.

Further, it's hard to generate EUV light. Two methods are used: discharge production, which passes a large current through a metal plasma, and laser production, which heats droplets of tin with a high-intensity laser beam. Neither method is very energy-efficient.

But good news could be on the horizon. ASML, the world's largest supplier of photolithographic equipment, has said (via HotHardware) that it could have production-ready commercial equipment by 2015, suitable for producing chips with 10nm features.

ASML's prototype NXE3100. (Image credit: ASML)

However, the company still has work to do to meet that goal. Its current prototype machine, using lasers to produce the EUV, produces about 55W of light. Commercial hardware will need to produce about 250W of light. (As if to highlight the difficulties that EUV has faced, ASML once expected to have 80W machines by the end of 2011.) Intel says that it needs 1kW light sources to fully meet its lithographic requirements.
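
Source power matters because each wafer needs a fixed exposure dose, so wafer throughput scales roughly linearly with the light the source can deliver. Here's a rough sketch of what the quoted power levels imply; the 125 wafers/hour reference at 250W is an assumed figure, not one from the article.

    # Wafer throughput scales roughly linearly with source power at fixed dose.
    # The 125 wafers/hour reference at 250W is an assumption for illustration.
    wph_at_250w = 125.0
    for watts in (55, 80, 250, 1000):
        print(f"{watts:>4}W source: ~{wph_at_250w * watts / 250:.0f} wafers/hour")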

The problems with EUV photolithography could derail Moore's Law. Although there are workarounds that extend regular UV photolithographic techniques, such as using multiple exposures with different masks, these have challenges of their own (due to the tight alignment requirements of the different masks). The more masks and exposures used, the higher the chance of a misalignment ruining the wafer.

Current fabrication uses double patterning with two exposures. In the event that EUV hardware can't reach the power levels required, quadruple patterning is the next step, even with its attendant difficulties.
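
The alignment risk compounds in the way you'd expect: if each exposure after the first survives overlay alignment with probability p, a wafer's chance of coming through n exposures intact falls geometrically. A sketch with an invented per-exposure success rate:

    # Overlay yield falls geometrically with the number of exposures.
    # The 99% per-exposure alignment success rate is invented for illustration.
    p_align = 0.99
    for name, exposures in [("single", 1), ("double", 2), ("quadruple", 4)]:
        # The first exposure has nothing to misalign against.
        print(f"{name} patterning: ~{p_align ** (exposures - 1):.1%} survive overlay")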

However, the company still has work to do to meet that goal. Its current prototype machine, using lasers to produce the EUV, produces about 55W of light. Commercial hardware will need to produce about 250W of light.

"Still has work to do" seems like a bit of an understatement. I didn't catch how they're going to bridge that seemingly significant gap from the article.

I want to see the factory making these prototypes. That is a lot of physics and chemistry, mechanics, robotics, cleanroom air technology, and whatever other interesting high tech; it must be really cool to work on and invent for a machine like that!

The industry needs 250W... but Intel needs 1kW. That was a little confusing.

Different metrics for different kinds of work. Intel says 1kW is basically endgame. 250W gets you the requisite number of wafers per hour to make the whole thing start working from a basic production standpoint.

You need 250W just to get volumes up, but Intel says 1kW for total replacement.

It's good to see they can get it down, but Moore's law is about to end.

Lithography has experimentally gotten down to about 3 nanometers, sort of: it's a destruction and self-correcting reconstruction of graphene, so at least it's patterning. But while 3 nanometers is around 64 times smaller than what we have now, it's also only about two graphene atoms across.

So if Moore's Law stays on track, we're done by the mid-twenties, as physics won't allow us to get any smaller. So in order for us to get any faster about a decade from now, we'll need new materials. And not graphene or any other semiconductor, because the bottleneck for performance beyond a certain number of gigahertz increasingly becomes the speed of electrons, which simply aren't fast enough. You just sit there waiting for electrons to move into their positions before you can do anything.

Long story short, we'll need all-optical computers in a little over a decade or we'll be kinda stuck in terms of compute power.

Many long years ago I remember a similar problem. The 1000nm barrier was looming and the tech was just not up to crossing it. That seems a bit weird to kids today, but that is what happens when you throw enough money at a problem.

The real and uncrossable limit is approaching fast. When feature sizes go subatomic I will really be curious to see how they did it.

I find the microprocessor an amazing machine and am glad for the electronics that I have. I do wonder when people will feel that they have reached a peak in how much processing power they need for consumer electronic devices such as computers, Apple TVs, game consoles, etc. I mean, I really do not need a faster computer or faster phone. I think servers, cell equipment, and getting high-speed access to rural citizens could be improved. Then what?

It's kind of like my feelings on TVs. How clear can they get before it's not worth it? I don't need nor enjoy any TV in my living room bigger than 55in. I don't want 3D nor a hologram TV. TVs are going to peak for me in 5 years or so.

Don't even get me started on robot cars or crap like that. If I am driving less than 150mph and on a road, I can do it myself.

IMHO I would much rather put fantastic technology advancements toward space exploration instead of messing with feel-good consumer electronics.

And 640k of RAM is all anyone will ever need*

Seriously, though, faster processors with lower power consumption will eventually lead to things we haven't even considered yet, or things that we currently think are impossible. I agree that for what most people do with a computer these days (surf the web, create/edit documents, etc.) current hardware provides a quality experience. It's those things we don't even know we *want* to do yet that require continued innovation.

*Yes, I know he never actually said that

IMHO I would much rather put fantastic technology advancements toward space exploration instead of messing with feel-good consumer electronics.

Except that space exploration means better rockets, simulations of all kinds of complex things, developing new engine technologies, etc., all of which are very reliant on having fast computers available.

And let's face it, which will benefit most people more, a better computer, or knowing what the atmosphere is like on a planet that's a lifetime away? I think they're both awesome, but I know which of those pays my bills.

The best conversion efficiency that Cymer (the company making the production EUV light sources) has achieved is 3%, so a 1 kW EUV source would require a 33.33 kW CO2 laser, something that should equal around 400 kW of electrical power plus cooling.
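
Replaying that arithmetic: the 400 kW figure implies a laser wall-plug efficiency of roughly 8 percent, a value inferred from the comment's own numbers rather than stated in it.

    # Checking the comment's numbers. The wall-plug efficiency is inferred,
    # not given; the comment only states the 3% laser-to-EUV conversion figure.
    euv_out_kw = 1.0         # desired EUV output power
    conversion_eff = 0.03    # laser light -> EUV, per the comment
    wall_plug_eff = 0.085    # CO2 laser electrical efficiency (assumed)

    laser_kw = euv_out_kw / conversion_eff     # ~33.3 kW of laser light
    electric_kw = laser_kw / wall_plug_eff     # ~392 kW from the wall
    print(f"laser: {laser_kw:.1f} kW, electrical: {electric_kw:.0f} kW")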

I wonder to what extent Moore's Law has been a self-fulfilling prophecy. Originally, of course, it was just an observation. But since then it has set expectations. Have those expectations pushed people to develop faster (at higher cost) than they would have otherwise? (Or, conversely, I suppose, they might have lowered expectations below what's achievable, although progress seems like it has been pretty fast, so that seems less likely.) Double and quadruple masking seem like examples of outrageously difficult workarounds just to meet very high expectations.

It's those things we don't even know we *want* to do yet that require continued innovation.

I thought of that and you are probably right about innovation. I just kinda think that if we think small it will hurt big technology when it really counts.

Except that space exploration means better rockets, simulations of all kinds of complex things, developing new engine technologies, etc., all of which are very reliant on having fast computers available.

Maybe, but I don't think faster computers are the technology that is holding us back. You kinda circled around to my point, that at some point soon, small gadgets will not really make our lives better nor pay the bills.

I find the microprocessor an amazing machine and am glad for the electronics that I have. I do wonder when people will feel that they have reached a peak in how much processing power they need for consumer electronic devices such as computers, Apple TVs, game consoles, etc.

One example off the top of my head is AIs in video games, especially in RTS. Current consumer computing cannot afford to run an AI that is even "Turing-unsettling". When professional Starcraft players can practice their matchups versus AI, I'll start to hop on your bandwagon.

Wake me up when the feature width gets smaller than an atomic radius. /s

It's amazing how we always find a way. Unfortunately, there is a bottom somewhere, although it may be distant still. So, we'll always find a way... until we don't. And then we'll head in a different direction and call it a win.

Maybe, but I don't think faster computers are the technology that is holding us back. You kinda circled around to my point, that at some point soon, small gadgets will not really make our lives better nor pay the bills.

My smartphone is probably built on a 32nm process node, and it's sure made my life better. We just got some Haswell i7s at work, and they're nice and quick, but we do 3D graphics and I can tell you that while they pay the bills, even faster would let us do more work in less time. Bring it on!

So we haven't reached the point yet where process shrinks don't pay the bills for people.

One example off the top of my head is AIs in video games, especially in RTS. Current consumer computing cannot afford to run an AI that is even "Turing-unsettling". When professional Starcraft players can practice their matchups versus AI, I'll start to hop on your bandwagon.

Is that microprocessor speed or programming? AIs are just not as resourceful as some brains. Barring computers becoming self-aware and learning, I think AIs will always be a step behind some brains out there.

Maybe, but I don't think faster computers are the technology that is holding us back. You kinda circled around to my point, that at some point soon, small gadgets will not really make our lives better nor pay the bills.

Your biggest mistake is not realizing that advancements in your "small gadgets" can also be used to benefit those doing research into space exploration (designing/simulating better rockets, better guidance systems, smarter AI capable of handling more complicated tasks in real time, instead of waiting minutes or more for the round-trip time of a human to react), medicine, cleaner/better energy sources, virtually everything. (I'd challenge someone to come up with a field of science that would not benefit at all from more compute performance!)

Computers themselves aren't really holding us back (more never hurts), but that's the wrong way to think about it. Computers are a force multiplier. We wouldn't be able to do the sort of pure science we do today without our computers. And while you might think computers can do anything instantly (if all you do is play Candy Crush), anyone using their computer for compute will tell you that they'd love to have a much faster machine (preferably with infinitely more RAM).

While Intel's development of Broadwell won't cure cancer, Broadwell CPUs could very well help speed up simulations being done by people researching cures. And over the generations of CPUs (and everything else), they can construct bigger, more complicated (and useful!) simulations, potentially in less time, which will help avoid dead ends on things that are far harder to speed up (actual trials).

Saying that computers are fast enough as they are today is horribly shortsighted.