Hype-kill: Graphene is awesome, but a very long way from replacing silicon

This site may earn affiliate commissions from the links on this page. Terms of use.

Stories are hatched in different ways. Some spring from a journalist’s own imagination, some are passed along as tips. Some are based on press briefings or information released by a manufacturer.

Some stories happen because the author is annoyed — and this is one of them. Specifically, I’m annoyed at the way graphene announcements are so often treated as grounds for boundlessly optimistic predictions and declarations that critical problems have been solved. Graphene is an exciting substance, and the ongoing process of discovery is critical, but I respectfully disagree with my colleague Sebastian — there is no single “missing switch.” Just a whole lot of baby steps.

Yes, it's very pretty. Now leave it alone.

Right now, the research going into graphene has a great deal in common with the research done in the early days of the transistor. When the transistor was discovered, in 1947 at Bell Labs, the three scientists working on the problem knew they’d found something big, but refining the first transistor into a marketable product took years. What Bardeen, Brattain, and Shockley initially created, the point-contact transistor, is entirely different from what semiconductor manufacturers build today.

It took decades for researchers to hammer out the best ways of building transistors, and many different materials, manufacturing methods, and transistor structures were tried along the way. You’ve probably never heard of most of them, because CMOS, MOSFETs, and silicon won the day. It’s that same entrenchment that’s going to make them extremely difficult to dislodge.

Much of the research done on graphene to date focused on proving basic principles, like whether it could be used in a transistor. That’s important work, but it’s just one tiny step in an enormous process. The International Technology Roadmap for Semiconductors (ITRS) doesn’t talk about this sort of thing; their target audience is already aware of it. To understand the challenges facing any silicon replacement, you first need to grasp just how unique transistors are.

Here’s a replica of the first transistor (the original Bell Labs transistor is pictured above):

Companies like Intel, TSMC, and GlobalFoundries now pack several billion of those into an area smaller than your thumbnail, and we treat it as just another metric of overall yields. Familiarity has made us all a wee bit complacent. In reality, nothing — nothing else in human history has scaled like transistors. The closest human analogue was Donald Trump’s ego, and Intel pulled ahead in that race when it launched 22nm earlier this year.

Mass production of electronics shifted to silicon because it’s a balanced, inexpensive solution. The reason materials like silicon-germanium (SiGe), gallium arsenide (GaAs), and other III-V metals are now being considered again is because the advantages they’ve always offered over silicon may become more important to future scaling than the weaknesses that kept them on the sidelines for the past fifty years. Even so, those considerations are preliminary — innovations like strained silicon, pictured above, have extended the viability of silicon further.

The point is, the general capabilities and characteristics of these alternate materials are well known. Incorporating them won’t be simple or easy, but the gaps in current manufacturing knowledge and production capacity are just that — gaps. Graphene, in contrast, is a giant unknown. A silicon wafer goes through hundreds of steps between the time it’s sliced from an ingot and its final distribution as a microprocessor. During those steps, it’s etched, polished, bathed in chemicals, doped with impurities, heated, cooled, tested, and cut.

All those processes are handled by machines that’ve been tuned to degrees of precision that were literally impossible fifty years ago, and they kick out a lot of silicon. In 2012, estimated worldwide wafer-start capacity per month (in 200mm wafer-equivalents) was 13.62 million wafers. The current worldwide production of graphene wafers, also in 200mm equivalents, is only marginally above zero. Several universities have pioneered methods; Penn State announced it had successfully produced a 100mm graphene wafer in 2010.

That’s a critical step, but it’s just one step. The gap between simple graphene structures and the ability to build 50,000 Core i7-class processors a month with a defect density that’s competitive with modern silicon is as big as the gap between today’s Ivy Bridge and the original 8086.

If an Intel engineer walked into a boardroom meeting and said: “I’ve figured out how we can use graphene to drive another 20 years of classic Dennard scaling — but only if we spend $20 billion on new equipment, break ground on a $5-billion new facility, and send Jerry Sanders, ex-CEO of AMD, a video of Andy Grove dancing the Macarena in a wedding dress,” that deal would be signed in seconds. If Intel got exclusive patent rights, they’d probably toss in a video of Pat Gelsinger doing the conga in a thong. It’d be worth it.

Unfortunately, given what we know right now, here’s what that new facility would look like:

Even if we knew graphene could fully replace silicon, the tools to make that happen literally do not exist yet. Even if extant tools can be retrofitted, the facilities those tools reside in may themselves require modification to conform to new best practices. Semiconductor manufacturers can’t afford to stop and see if graphene can catch up with existing practices, which means the targets graphene needs to hit to qualify as a replacement technology are constantly changing.

If silicon is analogous to wood in terms of its flexibility, price, and utility, trying to build with graphene is more like using squid. That means it’s going to take a while to figure out, and an awful lot of people are dubious about whether the end result will be worth the effort.

We might see simple graphene research prototypes in 2-3 years, commercial prototypes in 7-10. Manufacturers tend to use simple structures like SRAM or NAND when testing early process nodes, so that’s where we’d look first. As with OLEDs, you can expect a long hype window, and there are still plenty of issues that might prove unsolvable. The point, however, is that we don’t know, and we’re not going to know for a while yet. Graphene could be the cornerstone of 21st century technology — or a niche product with intriguing characteristics but too many flaws for widespread use. That’s more or less exactly what people said about the transistor, and obviously that bet turned out pretty well.

But we’re not going to have graphene figured out next week, or even next year. It could easily be next decade. So in the meantime, could we have a bit of peace and quiet?

The excitement generated by graphene is not about its production yields. It’s the idea of a fundamental discovery of a unique property of carbon, one of the most common of materials. Its properties are so varied and useful that they have caused a great stir among scientists. It really does deserve the hype.
As for creating vast quantities of replacements for silicon, that is not the point. It does not need to…yet. It’s the potential that counts. If and when it makes sense to start building with graphene, there are enough brains and infrastructure worldwide to make it happen. It may come to nothing, it may be used in a few applications, or it may just change the world. Carry on, Sebastian.

some_guy_said

“It’s the idea of a fundamental discovery of a unique property of carbon, one of the most common of materials.”

Graphene is not that special, and potential means little. Fusion has loads of potential, and how long has that application been “20 years out”?

Graphene has no guarantee

Why aren’t you crowing about diamond? Diamond makes an even better semiconductor (Because it has something called a band gap…) – it is currently far more practical than graphene and yields similar benefits.

The point is that there is SO MUCH hype for something SO FAR OUT – it is not just ONE challenge that needs to be overcome, there are HUNDREDS of them. It is not an inevitable thing…

Your opinion does not invalidate my opinion. While graphene is neat, there’s no guarantee that it will indeed yield magical benefits.

Juzer Jangbarwala

I like this response. It is sad when non-technical people give such judgements and esoteric evaluations on such important points. Diamond cannot be made into single-layer sheets because its electron configuration is completely different from graphene’s. Replacing silicon is just a POSSIBILITY. Current applications, including composite plastics, low-cost transparent OLEDs, and 30,000-mile oil changes, are economically viable realities today. The world will have to produce a million tons of graphene just to satisfy a portion of this demand, and as with many other discoveries, including the transistor, this momentum will accelerate the development of graphene to replace silicon. I, for one, would pursue anything as environmentally friendly as graphene compared to the multiple toxic chemicals used and GHGs emitted by the semiconductor industry.

VirtualMark

I agree, but unfortunately the articles on this site seem to be made for stupid people lately. We should remember that the writers are reporters, not scientists, and are just looking for attention-grabbing headlines. The actual scientific community understands the importance of graphene, which is why a Nobel Prize was awarded.

This article just states the very obvious – there are engineering challenges that need to be overcome before we use graphene to its full potential.

I was actually busy in this thread for a different reason but thought I’d drop back in to remark that two years after I wrote this story, we remain a fundamentally long way off from commercializing graphene. Graphene chip production is *still* expected after 2020.

VirtualMark

Ha, yes I did wonder why you were commenting on a 2 year old article. Thanks for the update!

Jeff Jones

Whine whine whine. You are just jealous because the time traveler gave a glimpse of the future to Sebastian and not you.

Courtney Best

Blathering Blatherskite!

When Shockley came up with the transistor, there were probably only a few hundred people who could understand its value… There are thousands and thousands of engineers who already understand the value of graphene.

There are many times more people working on graphene than there were working on integrated circuits. There are many corporations filing patents on graphene technology.

Joel Hruska

Actually, no. It took the team at Bell a little while to understand what they’d built, but there was a *lot* of experimentation going on pretty quickly.

Also, let’s keep in mind that the scale and scope of the challenge have fundamentally changed. When Shockley built his first transistors, he built them by hand. Now you need a $15 million piece of equipment to etch channels barely wide enough for atoms to pass each other in the street.

If throwing brilliant people at something magically made discoveries happen, we’d be warping through space, living for millennia, and curing cancer like it was a mild case of strep throat.

As for your claim that patents are somehow related to meaningful scientific progress…I’ll let laughter speak for itself.

VirtualMark

Read some history books – you’d be surprised at what humanity can achieve. We didn’t always have the internet, for example.

Is it possible that at least some of the hype is from the investment community? Any public company working on graphite is likely being touted as a great possible investment by the many “insider knowledge” investment advice websites. I think some of those guys bought stock in some penny stocks with the word graphene in their prospectus, and are now trying to hype the world into buying the same stock so their shares increase in value… just a theory.

What the author of this article might be missing is that we no longer live in 1947.

Research and development occur at a much faster rate.

Our scientific knowledge and practical applications of it are at least 60 to 100 years ahead of anything we presently use.

Right now the industry uses outdated/inefficient materials and methods of production because it would be too costly from a $$$ point of view to upgrade (even though we have enough resources and technology to pull it off several times over on a global scale) – therefore, they have to conform to ‘cost efficiency’ and find ways to introduce new technology through those artificially created limitations.

The market also creates the least efficient technology at the start, and not the best of what a material is capable of that is in line with our latest scientific knowledge – because companies need to look out for profits, and as a result, it’s more profitable to integrate features and create revisions of existing technology for a decade or two, instead of pushing it all out at once from the start.

If we used superior synthetic materials that can be produced in abundance and state of the art methods of production for creation of tools/technology, developments would occur at far faster rates and technology would be light years ahead of anything we use right now (we have the ability to transform our society in less than a decade if there was a concentrated global effort – but social awareness needs to be brought up to speed, and historically, this was never the case because ignorance keeps people from asking questions and ingrains a notion that things are ‘impossible’).

However, even in the monetary system, prices of materials and newer (or should I say OLD) technology keep dropping faster, and their adoption to the market is faster… but at the same time, certain vested interests could keep those technologies from making an appearance for some time (but even this is diminishing due to overall awareness increase in the general population).

Still, with automation taking over large swaths of production (most of the production industry is already automated as is), it’s only a matter of time before it takes over completely (molecular manufacturing, robots and computers for specialized tasks will completely take over, and no one is irreplaceable).

Graphene was already applicable in various areas within 2 years of its initial discovery (maybe not in electronics per se, but to augment/replace existing materials, for thermal dissipation or other properties, most definitely).

As I said… the article doesn’t take into account that things are developing at a faster rate today, and this is only accelerating.

Anon ymous

hype-unkill: http://nanotechweb.org/cws/article/tech/50429
lol this was released almost at the same time as your article

Far from it; as a material, graphene was discovered in 2004. Even then its potential properties for electronics were well known. By 2006 (at most), they could have started using graphene in electronics in smaller quantities to improve existing chip capabilities such as efficiency, heat, etc.

The same goes for synthetic diamond chips. They were successfully mass-produced in 1996 and were cheap/relatively easy to make. They could have started using them in electronics (at least partly) in 1998 to augment existing electronic capabilities and learn from those — or just experiment on existing chips for several years, increase the quantities used, and by 2000, CPUs and GPUs could probably have featured a silicon/diamond cross-breed microchip that’s at least 10 times better/faster than existing ones, using less power than current electronics and probably needing no active cooling (graphene is about 3x better than diamond in all areas).

Making CPUs and GPUs from pure synthetic diamond would produce 40x more powerful computers that draw one-tenth the power.
But that’s not the point. The point is that the market uses silicon (which is an outdated material) simply because it’s ‘cheap’ from a monetary point of view (it has nothing to do with resources, though).

Today’s technology is essentially 60 to 100 years out of date – because new materials are never used as soon as they are applicable (even in minute quantities) because the profit motive and fictional costs have to be considered first.

Whoever claimed that capitalism drives technology was speaking nonsense. It was never like that in the first place.
Companies use what is ‘cost effective’ (cheapest for THEM in terms of investment so they can gain large profits in the long run).

Joel Hruska

To quote from your article: “Their nano-FET used a silicon substrate as the bottom gate, with a thin insulating layer of silicon dioxide between it and the stacked graphene layers. A transparent layer of aluminum oxide (sapphire) lay over the graphene bilayer; on top of that was the top gate, made of platinum.”

What that means is that the researchers built a highly specialized structure using techniques that have not been scaled and materials that are non-standard. Does that matter? Sure. It’s progress. But it’s not a solution.

The profits to be made from being first mover on a reliable transition to any new technology (carbon nanotube, graphene, diamond, or peanut butter) are far greater than the capital outlay. More to the point, there are major companies in these realms that absolutely *could* lay out tens of billions if they thought it would secure them another 20-30 years of scaling and dominance.

They haven’t made such investments because the technology isn’t proven. It’s not clear it’ll ever be proven. And it’s not just me saying so — feel free to consult the tables and charts of the ITRS roadmap.

Yoduh99

I get it: the author sees himself as a tech-guru journalist and believes that he should be the one who sets the bar for how “hyped” we peons should all get over new technology. We mere mortal readers are simply living in a world of fantasy while he tries valiantly to yell the truth down to us from his intellectual high chair. Really, we should be thanking him for crushing our dreams and putting us in our place. Now we can be as smug as him whenever graphene is brought up in conversation and go, “meh”.

Joel Hruska

Those of you who think I’m magnifying the difficulty of the situation are invited to consult the ITRS roadmaps:

This article explored the likelihood of graphene and nanotube adoption, as well as the challenges associated: http://www.extremetech.com/extreme/120353-the-future-of-cpu-scaling-exploring-options-on-the-cutting-edge/2

rsetbacken

Hi Joel, I was wondering where you got your number for 200mm-equivalent wafers for 2012 (13.62 million wafers). Kirk Hasserjian (AMAT-http://www.semiconwest.org/node/8491) put up a chart during the 450mm TechXSpot at Semicon that suggested it was around 8 billion square inches, which would be more than 10X your number. Any idea who is closer? Did I get my math wrong?

I can’t claim to have validated the IC Insights number, but I wonder if Kirk was talking about yearly production vs. monthly? It looks like your link didn’t translate cleanly; I can’t click on it.

Assuming that it’s not a year/month issue, I’d have to pull out a calculator and run the math. An 8-inch wafer has 50.24 square inches of area… which would still come out to about 684 million square inches of wafer/month.

I don’t think your math is wrong, but I’d have to know more about where Hasserjian’s number came from to know where the difference is. I was fairly impressed with 13.62 million wafers/month — another factor of ten seems like an awful lot of silicon, even for the modern world.
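[For readers who want to check this arithmetic themselves, here’s a quick sketch. It assumes a 200mm (8-inch) wafer with a 4-inch radius and uses the 13.62-million-wafers-per-month estimate cited above; multiplying out to a year lands right around the 8-billion-square-inch figure from the Semicon chart, which supports the month-versus-year explanation.]

```python
import math

# Estimated worldwide wafer starts per month, in 200mm-wafer equivalents (cited above).
wafers_per_month = 13.62e6

# Area of a single 200mm (8-inch) wafer: pi * r^2, with r = 4 inches.
wafer_area_sq_in = math.pi * 4 ** 2  # ~50.27 square inches

monthly_area = wafers_per_month * wafer_area_sq_in  # ~685 million sq in per month
yearly_area = monthly_area * 12                     # ~8.2 billion sq in per year

print(f"Monthly: {monthly_area / 1e6:.0f} million square inches")
print(f"Yearly:  {yearly_area / 1e9:.2f} billion square inches")
```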

rsetbacken

It is a month/year thing; I somehow missed your wafers/month units. Yes, that is a LOT of wafers, sort of hard to believe.

Kirk’s presentation is on the Semicon West website under the 450mm TechXSpot session.

Thanks!


Casecutter

Man this guy is a Techno-bummer. We’re all so happy with silicon… so we stop looking for the next thing. It’s un-American!

POET GaAs III-V technology is far ahead of any graphene developments. It is surprising how unknown this revolutionary technology is. POET is being used by the US Air Force and NASA, and is under third-party development by BAE Systems. It is curious why it has not been picked up by the big chip makers and covered by more analysts.

Don’t forget though that a ton of those baby steps are COMMON. e.g. they will probably need photolithography. Well, they have done the bulk of the work on that. And come to think of it, photolithography dominated the whole of what they were selling as advances for many years.


Guest

Judging from other articles in here, it appears many articles here are written from Intel’s perspective.

RB

I enjoy reading about the different applications of Graphene as the researchers discover new methods. Lengthy product development is inevitable, we’re not stupid. Hype can be a misleading thing but I’d rather we didn’t have “peace and quiet” on this subject, it’s interesting. How about you be quiet.

xvl260

Like the article said, graphene is not ready for prime time. But it’s important for semiconductor manufacturers to start the research now; even Intel has said they may have problems scaling silicon smaller than 14nm (in 2020). So we’ve got less than 10 years before we hit that wall. If any company can make a graphene CPU at reasonable prices, they will dominate the market.

Shawn Kearney

“Much of the research done on graphene to date focused on proving basic principles, like whether it could be used in a transistor.”

This seems like an odd statement when, a year earlier, a graphene transistor was produced by IBM, and a few months later it was made about 30% faster, reaching a cutoff frequency of 155 GHz and making it the world’s fastest. Clearly graphene can be used in a transistor. About half a year after this article was published, a programmable graphene transistor was experimentally produced. Each of these devices physically resembles what we think of as solid-state electronic devices; no bell jar involved.

Accelerating change is a real phenomenon that makes rational sense. We use the information of the past to make discoveries in the future. As more information becomes available, the easier and faster new technologies can be realized. You cannot assume that we’re essentially starting over with organic transistors, because we understand very well how transistors work and how they are manufactured already. We can apply this information to future technologies, such as graphene.

fatheroftwo82

I give it about 30 years before the newest graphene PC. Imo

Bastão

Graphene, like fullerene and nanotubes, is excellent for creating hype and feeding science itself without any useful application.
