Graphene is Next

May 3, 2010

By: Valkyrie Ice

Graphene. If you’ve never heard about it, don’t worry, a lot of people haven’t, because it’s really only been “discovered” relatively recently, and most of the truly interesting news about it has been in the last year. The amazing thing is that we’ve actually been using it for centuries, in the form of the common pencil. Graphene is a form of carbon, much like carbon nanotubes and other fullerenes, with one major difference. While fullerenes are 3D structures of carbon atoms, graphene is a flat sheet. It’s a 2D lattice of carbon with bonds as strong as diamond. It’s this sheetlike nature that makes it so useful in a pencil. As you write, individual planes of graphite are sheared off the end and deposited on the paper. Those individual planes are pure graphene.


By now, most of you are familiar with carbon nanotubes, a.k.a. CNTs, and their potential for computers. Graphene has equally amazing properties, including some that might make it far more readily usable than CNTs. First, like CNTs, graphene is capable of conducting electricity with much less resistance than copper. That alone makes it useful, but graphene has even more interesting properties. As New Scientist reports, bending graphene creates strains between the atoms that can create isolated pathways which then act as nanoribbons (in effect, wires) within the still-connected sheet. In other words, the morphology of graphene affects its electrical properties: change the flat sheet by bending parts of it, and you change how electricity flows through it.

But that isn’t all. The pattern of carbon bonds has effects as well. Graphene is a hexagonal grid of carbon, much like a roll of chicken wire. Remove one random atom from the pattern every so often, and graphene can exhibit magnetic behavior without needing the presence of magnetic metals. Adding hydrogen into the mix creates graphene’s non-conductive cousin, graphane. Taking precisely defined patterns of atoms out of the sheet can create well-defined circuits, creating wires that are almost superconducting.

All of these properties make graphene a very important material for the future of electronics. It has already been used to create field-effect transistors, the primary component of a computer processor. When you combine this with the other features above, you have a single material that could be used for the majority of the components in every electronic device we currently have… with one major difference: speed. Current silicon-based chips have a limited speed at which they can run at room temperature without overheating and malfunctioning. Go much over 3GHz without some major cooling and chips melt down. But replace those chips with graphene equivalents, without having made any other changes to the circuits, and you can raise that limit much higher. Potentially 100 to 1,000 times higher.

Let's think about that for a moment. That's 300 GHz to 3,000 GHz, or 3 terahertz (THz).

That's a jump of two or three orders of magnitude up the exponential curve, my friends, especially when you combine it with the advances in multi-core technology and parallel computing. We're talking about that smartphone in your pocket having a thousand times the computing power of your desktop PC, but using no more power than it does right now. The resistance of graphene at room temperature is so much lower than that of copper and silicon that even though it's running at 1,000 times the speed, it's not using any more current, or wasting any more energy as heat, than an identical silicon device, and that's without considering any other possible advances in the field of electronics design.

We’re talking about that smartphone in your pocket having a thousand times the computing power of your desktop PC, but using no more power.

That big a leap in processing speed will simplify a lot of extremely complex tasks that require extensive amounts of data. From SETI's search for extraterrestrial intelligence to the hunt for all the ways a protein can fold, scientists harness millions of processors in parallel to speed up research. A thousand-fold increase in computer speed could cut months to years off the time needed for their projects. The same goes for DNA sequencing, data mining, and a host of other areas.
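To put rough numbers on that claim, here is a deliberately naive sketch; it assumes the workload is purely compute-bound and scales perfectly with processor speed, which real research projects never quite do:

```python
# Naive sketch: how a hypothetical 1000x processor speedup would shrink
# a long-running, compute-bound job. Assumes perfect linear scaling,
# which real workloads (I/O, memory, coordination overhead) never achieve.

def sped_up_duration_days(days: float, speedup: float) -> float:
    """New duration, in days, of a job under a raw speedup factor."""
    return days / speedup

year_long_job = 365.0  # days of compute at today's speeds
print(sped_up_duration_days(year_long_job, 1000))  # 0.365 days, under 9 hours
```

Under those optimistic assumptions, a year-long protein-folding run collapses into a single afternoon.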

And science will not be the sole beneficiary. Most smartphones these days have the ability to use their cameras to create virtual overlays on the images that they see, a technique called Augmented Reality (AR). AR has advanced to the point that it's possible to create virtual characters in photos on your phone using nothing more than a 2D patterned target on the ground, or to create interactive "virtual assistants" in projected video that are capable of interacting with real-world objects. Ultrafast computers will be essential for ushering in the age of Virtual Reality.

A massive increase in computer speeds is likely to benefit other complex computing tasks as well, such as real-time speech translation. Right now, it is difficult to make these programs run quickly enough to be useful. A thousand-fold increase in computer speed could make brute-force approaches a practical solution, enabling computers to crunch through entire dictionaries in milliseconds. It could make possible the elusive conversational interface that so many people believe will be the next step in operating systems. That speed will also be useful in the next generation of robotics, quite possibly bringing us a step closer to the kind of robots seen in movies like I, Robot or Star Wars. Ultrafast computers would enable a major reduction in the size of the computers needed to run some of the most complex robots we currently have, bringing the day of Rosie the Robot maid that much closer.

Obviously, ultrafast computers are going to have a very far-reaching effect on the way we do things, as well as how we interact with each other and our world, so the real questions are how practical is it to make graphene chips, and how soon can they be made? The answer is probably going to surprise you. Graphene has already been proven to be usable in current chip manufacturing processes with only minimal retooling needed. In fact, IBM has already created working 30GHz test devices using graphene transistors. In other words, graphene could begin making its way into computers as early as 2012 to 2015, and almost certainly by 2020.

Graphene, that same single-atom-thick layer of carbon that is a part of every pencil mark, is going to make all of this possible. Not bad for the humble Number 2, huh?

[quote] Quantum dots are crystalline molecules from a few to many atoms in size that interact with light and magnetic fields in unique ways. The size of a dot determines its band gap – the amount of energy needed to close the circuit – and makes it tunable to a precise degree. The frequencies of light and energy released by activated dots make them particularly useful for chemical sensors, solar cells, medical imaging and nanoscale circuitry.

Singh and Penev calculated that removing islands of hydrogen from both sides of a graphane matrix leaves a well with all the properties of quantum dots, which may also be useful in creating arrays of dots for many applications.

Their work revealed several interesting characteristics. They found that when chunks of the hydrogen sublattice are removed, the area left behind is always hexagonal, with a sharp interface between the graphene and graphane. This is important, they said, because it means each dot is highly contained; calculations show very little leakage of charge into the graphane host material.[/quote]

Now, if you are a reader of science fiction, in particular that of Wil McCarthy, you will have read about a substance called WELLSTONE, a form of programmable matter (a concept also explored under the name claytronics).

Need I go on?

Now, there are far more applications that will be exploited much sooner than the possibilities of wellstone, such as quantum dot transistors, LEDs, and so on. So imagine a carbon display with pixels smaller than the rods and cones in your eyes, built into a contact lens.

Such an amazingly useful material, carbon. Graphane is graphene with a layer of hydrogen bonded to each side. If you look closely at the top picture in the article, I believe it actually shows GRAPHANE (i.e., the little blue balls are hydrogen atoms). This process creates hexagonal "wells" of conductive graphene (carbon with no hydrogen) isolated from other wells by nonconductive graphane (carbon with hydrogen). Combine this with graphene's other properties, and you can see where it could enable some amazing possibilities.

At Berkeley Lab's Advanced Light Source, scientists working with graphene have made the first observation of the energy bands of complex particles known as plasmarons. Understanding the relationships among three kinds of particles (charge carriers, plasmons, and plasmarons) may hasten the day when graphene can be used for "plasmonics" to build ultrafast computers, perhaps even room-temperature quantum computers, plus a wide range of other electronic, photonic, and plasmonic devices on the nanoscale.

“The interesting properties of graphene are all collective phenomena,” says Rotenberg, an ALS senior staff scientist responsible for the scientific program at ALS beamline 7, where the work was performed. “Graphene’s true electronic structure can’t be understood without understanding the many complex interactions of electrons with other particles.”

The electric charge carriers in graphene are negative electrons and positive holes, which in turn are affected by plasmons—density oscillations that move like sound waves through the “liquid” of all the electrons in the material. A plasmaron is a composite particle, a charge carrier coupled with a plasmon.

Plasmons have been considered as a means of transmitting information on computer chips, since plasmons can support much higher frequencies (into the 100 THz range, while conventional wires become very lossy in the tens of GHz). For plasmon-based electronics to be useful, an analog to the transistor, called a plasmonster, must be invented.

Graphene used conventionally could be 3 to 10 THz. Graphene plasmonic computers could be 300 to 1,000 THz. Graphene plasmonic “wires” could carry data at the same speeds across continents.

I doubt this will make the amount of difference everyone is hoping for. The main bottleneck for computing is the transfer of information from a storage device to the processor. If you wonder why your computer is so slow to start up, it's because your hard drive is struggling to keep up with your OS dumping a bunch of files into temporary memory.

I can keep going, as that is just stuff from the last nine months or so. But it should be pretty obvious that the future of spinning media is extremely limited. Solid-state memory devices will replace them within a few years. Memristors will eliminate the need for devices that boot, as even the loss of power won't erase the data actively being processed; the machine will resume precisely where it left off when power returns.

Additionally, the creation of an internet capable of running at the same speeds of data transmission is also well underway.

Thank you for the extensive list of supporting articles about concurrent developments in technology!

Solid-state memory, graphene-based systems, and many other new uses of old materials will fundamentally change the way electronics operate, and even how we conceive of the relationship between electrical power and our ability to run data-intensive processes. Beyond possible AR applications, it will be amazing to see supercomputers that require very little power and work at speeds orders of magnitude past where we are now.

What this suggests is that we will soon be able to collect data so ubiquitously that, for many applications, we will work only at the concept level rather than having much if any involvement with the data. By that I mean we could create a program, based in English, that uses our abstract terms and selections of what sort of result we think we need to almost instantly run through thousands of formulae and models and select the code or math objects that will produce those results. That may or may not make any sense, but as a psychology doctoral student my jargon isn't so good for some of the technical concepts in my head.

You are basically describing a Star Trek computer interface. You lay out the abstract, high-level orders ("Computer, I want a program that does A, B, and C") and the computer creates a program automatically to fulfill your request.

Such high level design software is probably several years away still, but yes, ultrafast computers can certainly make such an interface somewhat easier to make.

No matter how much faster the media gets, no matter how much faster the CPU gets, there is still a limit. The media can go a decent bit. The CPU…not so much. A 1GHz CPU will pretty much run about 10x faster than a 100MHz CPU. But a 30GHz CPU is _not_ going to run 10x faster than a 3GHz CPU. And a 300GHz CPU will probably not have any noticeable difference from a 30GHz one. At least not without a fundamental shift in how we build PCs.

You can't just crank up the clock speed forever and expect that everything will still work. 3GHz is about the limit for that. After that, the speed of light (or more realistically, the speed of electricity) starts to play a role. Past 3GHz, you probably won't be able to fetch data from RAM in one instruction. So you'll spend some time sitting there waiting for data. As you increase the speed, you spend even more time waiting. And there's a second barrier there, probably around 30GHz, where you won't be able to move data from one part of the CPU to another within one clock cycle. The future is not higher clock cycles. Higher clock cycles are already nearly useless. The future is massively parallel computing and quantum computing. And memory, even hard drives, will go the same way. No matter how fast your hard drive is, if it's connected to your CPU with 20cm of cable, it's going to take around a nanosecond for that data to get there. Which sounds fast… until you realize that's about 1GHz. Or 0.5GHz, really, since the data request has to travel the same distance.
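The arithmetic in that comment is easy to check. Here is a rough sketch, assuming signals propagate at about two-thirds of light speed, a typical velocity factor for copper interconnect (the 20 cm cable length is taken from the comment above):

```python
# Sketch of the latency argument: one-way signal time over 20 cm of
# cable at ~2/3 the speed of light, and the clock rate you could
# sustain if every cycle had to wait for a full request/response trip.

C = 299_792_458.0            # speed of light in vacuum, m/s
VELOCITY_FACTOR = 2.0 / 3.0  # rough propagation speed in copper

def one_way_delay_ns(length_m: float) -> float:
    return length_m / (C * VELOCITY_FACTOR) * 1e9

delay_ns = one_way_delay_ns(0.20)    # ~1.0 ns each way
round_trip_ns = 2 * delay_ns         # request out, data back
max_clock_ghz = 1.0 / round_trip_ns  # ~0.5 GHz, as the comment says
```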

You're absolutely right… if we were talking about silicon. Or even if we were talking about computers pushing electrons around in circuits that allow electrons to move in three dimensions.

But… we’re not.

We're dealing with an electron constrained to what is either a 2-dimensional "ribbon" or a 1-dimensional "wire." In both cases, the electrons travel at far higher speeds than they can in a 3-dimensional conductor, much closer to lightspeed than in current designs. They are also extremely limited in the amount of "crosstalk" that can occur between circuits because of the constraints on the electrons (i.e., much less quantum tunneling). Rather than dealing with electrons as a "river" flowing through "pipes," this is much closer to dealing with electrons as discrete particles. And that's just for a start. There is also research into using graphene in "plasmonic" circuits, which ignore particles altogether and deal with electrons as wave functions, which could achieve lightspeed.

This is a crucial difference, and it makes for a very LARGE difference in how compact a circuit can be.

Additionally, there are MAJOR changes in how computers will be constructed in the near future. Not only can graphene be used to make processors, it has been demonstrated to be able to form nearly every component used in electronics, from triodes, diodes, capacitors, and coils to magnetic devices. The motherboard, as well as the power supply, the HD, and even the memory, could all be packed onto a single chip, eliminating all that "cable."

I quite agree that parallel processors will play a massive role; not so much with quantum computers. They are currently at the stage of the old ENIAC and need a lot more development prior to widespread use, which makes them a somewhat further-down-the-road development than THz computers. I also think that memristors will play a very large role in the near future by eliminating the need for separate "memory devices." Who will need RAM or an HD when the processor IS both at the same time?

This article was written to show the LOWEST level of possibility, in an effort to explain the potential to someone who is not a computer technician or expert, and as such I chose to limit it solely to graphene's potential in electronics, and not delve into alternate computer architectures, manufacturing techniques, or changes in our concepts of "processor, memory, and storage."

Are there limits? Of course, but those limits are not those you listed, which hold true only for current silicon based architectures. TBH I think programming will be a bigger limit than the hardware itself.

See comment below. His point was that current silicon technologies are limited, yes, but we'll be moving away from them in the near future. One of the most limiting factors in current technology is the use of digital computing (strings of 1's and 0's) rather than the possible ANALOG computing which will be necessary to utilize faster technology. Storing only 1's and 0's is very inefficient. A single dot, in digital terms, is either, say, black or white. A single dot, in analog, can hold a plethora of data (analogous to a full spectrum of colors). This alone would enable us to increase computing speed MILLIONS of times, but would require programming equivalent to that found in the human brain, rather than our primitive digital programming. This is similar to the way a digital signal only approximates an analog signal, losing information that the analog signal carries through intact. (Blatant plug: down with digital television!)
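The information-density side of that claim can be put in numbers with a standard formula: a cell that can be read back as one of N distinguishable levels carries log2(N) bits instead of one. A quick sketch (the level counts are purely illustrative):

```python
import math

# A binary "dot" holds 1 bit; a multi-level ("analog-ish") dot that can
# be reliably read back as one of N distinguishable levels holds
# log2(N) bits.

def bits_per_cell(levels: int) -> float:
    return math.log2(levels)

print(bits_per_cell(2))    # 1.0 bit  -> black or white
print(bits_per_cell(256))  # 8.0 bits -> a "spectrum" of 256 shades
```

Worth noting: the capacity gain grows only logarithmically with the number of levels, so reliably distinguishing levels in the presence of noise, not the raw level count, is the practical limit.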

A high-performance top-gate graphene field-effect transistor (G-FET) is fabricated and used to construct a highly efficient frequency doubler. Taking advantage of the high gate efficiency and low parasitic capacitance of the top-gate device geometry, the gain of the graphene frequency doubler is increased about ten times compared to that of the back-gate G-FET based device. The frequency response of the doubler is also pushed from 10 kHz for a back-gate device to 200 kHz, at which most of the output power is concentrated at the doubled fundamental frequency of 400 kHz.

IBM recently showed that graphene transistor can operate up to 100 GHz, and the group at Peking University believes that the material may even still operate well in the THz regime.

* * * * * * * * * * * * * * * * * *

“Low Parasitic Capacitance” should answer the question posed about it earlier, but there’s another detail made in this paper that is very important.

“a graphene based frequency doubler can provide more than 90% converting efficiency, while the corresponding value is not larger than 30% for conventional frequency doubler”

A typical frequency doubler loses 70% of the input signal’s power to heat. Graphene only loses 10%.
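To make that comparison concrete, a trivial sketch (the 100 mW input power is just an illustrative assumption, not a figure from the paper):

```python
# Output power of a frequency doubler at the quoted conversion
# efficiencies; whatever is not converted is lost, mostly as heat.

def doubler_output_mw(input_mw: float, efficiency: float) -> float:
    return input_mw * efficiency

signal_mw = 100.0  # illustrative input power
conventional_mw = doubler_output_mw(signal_mw, 0.30)  # 30 mW out, 70 mW lost
graphene_mw = doubler_output_mw(signal_mw, 0.90)      # 90 mW out, 10 mW lost
```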

Frequency doublers are extremely useful in both lasers and radio applications.

Now I know I am growing old; I did not think moly-circs were possible outside of Weber's SF stories.

More generally, anyone who tries to slow, let alone prevent, the development of new, wealth-creating technology is cutting their own throat behind their back; getting really rich really fast is the only hope for the global economy.

“getting rich really fast?” Get real. The only wealth is in having food to eat, a roof over your head to protect you from the elements, and something to wear so that your skin is protected, and most importantly having your health. Many agrarian societies have been MUCH “wealthier” than technological societies simply because there is no debt, and everyone had enough of everything they NEEDED, and there was no overcrowding, because you can’t grow FOOD where PEOPLE are over-concentrated.
Another point: give everyone in the world a million dollars, and I guarantee you, the cost of everything will jump by several thousand times. A dollar loaf of bread will cost a thousand dollars.
Your life savings will buy you NOTHING.
That’s not wealth. That’s stupidity.
We are poorer now, in our overabundant society, than we were in the society of scarcity in the depression.

Just a scant two hundred or so years ago, somebody from the patent office worried that everything that could be invented or discovered had reached its limit and the patent office would have to be shut down.

This discovery and things coming out of this will undoubtedly shake the world yet again. This is a good thing. More and better supercomputers to find genes to kill cancer, kill bacteria and perhaps ease the aches and pains of old people.

Of course, more and better supercomputers also means we can get to porn and poker sites faster also.

I look forward to a time when someone can make some supercomputers using graphene chips, hook a million or so of them together, then turn Phil Jones and Michael Mann loose and watch how quickly they can produce fraudulent AGW results. The future is going to be very exciting. They will create the hockey stick from h*ll.

Very interesting article. It looks as though graphene chips may be the successor to silicon. It is definitely time for a change, as I doubt 'nanolithography' using silicon will be useful for much longer in the quest to make better, faster, smaller chips.

The estimate of 2012 to 2015 for this to appear in computers is probably wildly optimistic. There’s currently no way to manufacture large ICs out of this stuff at the scale required for commercial production, or at any scale for that matter.

Further, the capital investments and timeframes required for leading-edge fabs ensure that even if a production process and ready-to-go designs (not trivial either) existed right now, we still wouldn't be looking at this in consumer goods until after 2015. Factor in those two problems, neither of which is even within the ~3-5 year timeframe yet, and 2025-2030 seems more reasonable for these devices to be outperforming silicon devices by the margins described in the article.

Any time you have a repeating pattern, you can look at it the way you do crazy-eye 3D posters and get a 3D effect. In this case the highlighted hexagons jump off the page as the molecules move to the back.

“Physorg reports that Fujitsu Laboratories has developed a novel technology for forming graphene transistors directly on the entire surface of large-scale insulating substrates at low temperatures while employing chemical-vapor deposition (CVD) techniques which are in widespread use in semiconductor manufacturing.

Fujitsu Laboratories developed novel technology that allows for graphene to be formed on insulating film substrate via CVD at the low fabrication-temperature of 650°C, enabling graphene-based transistors to be directly formed on the entire surface of substrates. Although the test substrate employed was a 75-mm silicon substrate (wafer) with oxide film, the new technique is applicable to larger 300-mm wafers as well.

Fujitsu Laboratories also developed a process for forming transistors that use graphene as the channel material, as outlined in the picture. This process is independent of wafer size, so it can be applied to large-scale substrates.

1. First, an iron catalyst is formed into the desired channel shape, using a conventional photolithographic process.

2. Graphene is then formed on the iron layer via CVD.

3. Source and drain electrodes of titanium-gold film are formed at both ends of the graphene, thereby “fixing” the graphene.

4. Next, just the iron catalyst is removed using acid, leaving the graphene suspended between the source and drain electrodes, with the graphene “bridged” between the electrodes.

5. Using atomic-layer deposition (ALD), a method for forming thin films, a layer of hafnium dioxide (HfO2) is grown on top of the graphene to stabilize the graphene.

6. Finally, a gate electrode is formed on top of the graphene and through the HfO2, resulting in the formation of a graphene transistor.”

Wow, this is awesome. Now I just hope that the big companies involved in manufacturing and distributing silicon chips won’t roadblock the development of this new tech, and instead be proactive and adopt it as quickly as possible to banish the days of sluggish computing. I’m so glad I’ll be able to tell my kids about “the old days of slow computing, 100s of times slower than now” and they’ll wonder how we had the patience for such crude machines. I love technology!

The GHz wars weren’t all that long ago. The non-stop push to better faster lower power chips hasn’t slowed down at all. Moore’s law is the god of electronics.

IBM has already developed and tested 30GHz chips, and is trying to scale them to 100GHz speeds. Ultrahigh-speed data pipes are also in development that could actually keep up with THz chips. AMD and Intel are neck and neck in trying to make faster, smaller processors, with AMD also fighting Nvidia on the graphics-chip front for the fastest GPUs.

Suppression of this is the absolute LAST thing any electronics company wants. The first one to get a THz chip on the market would have a massive advantage over the others until they could catch up.

And that is just for the desktop market. It’s even worse among the cell phone market because there are more players in the game. And consoles are so competitive I expect to see bloodshed in their efforts to get the first THz game rig out.

No. The THz wars are going to make the GHz wars look like a tribal conflict compared to WW 2.

Once you start to pass the 3GHz barrier, capacitance becomes a real problem. Heating is not the only problem with high-frequency signals. At high frequencies, transmission lines themselves become capacitors and delay the rise and fall times of the binary signals.
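That rise-time effect can be sketched with the usual first-order RC model, where the 10%-90% rise time is about 2.2·R·C. The resistance and capacitance values below are purely illustrative, not measurements of any real line:

```python
# First-order RC model of a capacitively loaded line: the 10%-90%
# rise time is ~2.2 * R * C. Illustrative values only.

def rise_time_ps(r_ohms: float, c_farads: float) -> float:
    return 2.2 * r_ohms * c_farads * 1e12  # seconds -> picoseconds

rt_ps = rise_time_ps(50.0, 1e-12)  # 50-ohm line, 1 pF parasitic -> ~110 ps

# At 3 GHz the clock period is ~333 ps, so a ~110 ps edge already eats a
# third of the cycle; at 30 GHz (a ~33 ps period) the signal never settles.
```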

Memristors are simply a variation on the transistor, and graphene has already been demonstrated to be as easy to use for them as for more traditional transistors. Note that in my article I only discussed a 1-to-1 replacement of silicon with graphene. In my opinion, I doubt we're going to see merely 1-to-1 replacements. Why would we? With multi-core technology advanced to the point that a 100-core chip was recently released to the cell market, memristors speeding up processors by removing the lag between processor and memory, and the speed increases from switching to graphene, why would we need to reproduce our current chip technology in graphene when we could incorporate it as part of an ongoing advancement in computer technology?

Compared to what is really going to happen in my opinion once all factors are taken into account, this article is highly conservative, and limited to graphene alone.

Excellent article and thanks for the supporting discussion. I love to see real developments like this. Things are really moving.
I have enjoyed your comments in the discussions of other articles here and elsewhere so much that I have searched to find more. Thank you for being so positive and assertive! I look forward to reading much more from you.

And those laws are very well known by those making the chips. If the IEET believes speeds of 3 to 4 THz are possible, why should I assume that they are mistaken? It may be the case with SILICON, but since IBM has already made WORKING 30GHz chips, it would seem not to apply as much to graphene, now does it?

The electrical properties of silicon and copper are the primary causes of capacitance; graphene has very different electrical properties, as the various links in the article detail. For one thing, it is very nearly a room-temperature superconductor. It is also capable of acting as a conductor, a semiconductor, and an insulator. Its electrical properties are so conducive to electrical flow that researchers had problems making transistors with a high enough bandgap to work as logic gates; that problem has been overcome. A primary thing to remember is that in silicon and copper, electrons move in three dimensions. In graphene they are confined to either a 2D plane or a 1D string, depending on configuration. This allows electrons to behave as if they have far less mass than in copper or silicon, and to move much faster. Graphene also exhibits fractional quantum Hall effects.
