Single-atom transistor is 'perfect'

Feb 19, 2012

This is a single-atom transistor: a 3D-perspective scanning tunnelling microscope (STM) image of a hydrogenated silicon surface. Phosphorus incorporates selectively into the red-shaded regions, which were desorbed with an STM tip, to form electrical leads for a single phosphorus atom patterned precisely in the centre. Credit: ARC Centre for Quantum Computation and Communication, at UNSW.

In a remarkable feat of micro-engineering, UNSW physicists have created a working transistor consisting of a single atom placed precisely in a silicon crystal.

The tiny electronic device, described today in a paper published in the journal Nature Nanotechnology, uses as its active component an individual phosphorus atom patterned between atomic-scale electrodes and electrostatic control gates.

This unprecedented atomic accuracy may yield the elementary building block for a future quantum computer with unparalleled computational efficiency.

Until now, single-atom transistors have been realised only by chance: researchers either had to search through many devices or tune multi-atom devices to isolate one that works.

"But this device is perfect", says Professor Michelle Simmons, group leader and director of the ARC Centre for Quantum Computation and Communication at UNSW. "This is the first time anyone has shown control of a single atom in a substrate with this level of precise accuracy."


The microscopic device even has tiny visible markers etched onto its surface so researchers can connect metal contacts and apply a voltage, says research fellow and lead author Dr Martin Fuechsle from UNSW.

"Our group has proved that it is really possible to position one phosphorus atom in a silicon environment - exactly as we need it - with near-atomic precision, and at the same time register gates," he says.

The device is also remarkable, says Dr Fuechsle, because its electronic characteristics exactly match theoretical predictions undertaken with Professor Gerhard Klimeck's group at Purdue University in the US and Professor Hollenberg's group at the University of Melbourne, the joint authors on the paper.

Hydrogen atoms were removed selectively in precisely defined regions with the super-fine metal tip of the STM. A controlled chemical reaction then incorporated phosphorus atoms into the silicon surface.

Finally, the structure was encapsulated with a silicon layer, and the device was contacted electrically using an intricate system of alignment markers on the silicon chip to position the metal contacts. The electronic properties of the device were in excellent agreement with theoretical predictions for a single-phosphorus-atom transistor.

It is predicted that transistors will reach the single-atom level by about 2020 to keep pace with Moore's Law, which describes an ongoing trend in computer hardware that sees the number of chip components double every 18 months.
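The doubling arithmetic behind this projection is easy to sanity-check. Below is a minimal Python sketch; the 18-month period and the 2012-2020 window come from the article, while the function name and baseline are illustrative assumptions:

```python
# Sanity-check of the Moore's-Law arithmetic quoted above:
# component counts doubling every 18 months, projected from 2012 to 2020.

def moores_law_factor(years: float, doubling_period_months: float = 18.0) -> float:
    """Growth factor after `years` of doubling every `doubling_period_months`."""
    doublings = years * 12.0 / doubling_period_months
    return 2.0 ** doublings

if __name__ == "__main__":
    factor = moores_law_factor(2020 - 2012)  # 96 months -> about 5.3 doublings
    print(f"Projected growth factor, 2012-2020: {factor:.0f}x")  # about 40x
```

On this cadence, eight years compounds to roughly forty times as many components per chip, which is the scale of improvement that reaching the single-atom limit would cap.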

This advance delivers the technology to make that possible well ahead of schedule, and gives manufacturers valuable insights into how devices will behave once they reach the atomic limit, says Professor Simmons.


User comments: 24

It is predicted that transistors will reach the single-atom level by about 2020 to keep pace with Moore's Law, which describes an ongoing trend in computer hardware that sees the number of chip components double every 18 months.

Well, with or without one-molecule transistors, that will only make a plus-or-minus 18-month difference in the moment we really see Moore's Law stop. While that will truly be the proverbial Interesting Times(tm) for mankind, I hardly think it will be much fun. After half a century of continuous exponential progress, and the ensuing decent prices, we suddenly face an entirely new landscape.

Not to mention Microsoft, whose entire history is based on cavalierly wasting whatever horsepower people have scraped together the money for. Now they, all of a sudden, have to create the next Windows without it being much slower (in absolute terms), needing more memory and more CPU resources. Redmond will look like an ant nest after a big poke.

While that will truly be the proverbial Interesting Times(tm) for mankind, I hardly think it will be much fun. After half a century of continuous exponential progress, and the ensuing decent prices, we suddenly face an entirely new landscape.

why not? what about the landscape do you expect will make it not "fun", as you say?

Sorry gwrede. The "Interesting Times" can't be trademarked. It has been in the Chinese curse "may you live in interesting times" for thousands of years, if anyone bothers to look beyond English literature.

If this kind of stuff keeps up, there's going to be a whole army of us Electrical Engineers on the way back to night classes! A couple weeks ago it was the IBM team, now this! This news won't make my boss very happy, because our entire design group is probably going to descend on the Chief Engineer's office this week demanding tuition remuneration for the new classes we'll all want to take.

why not? what about the landscape do you expect will make it not "fun", as you say?

For the past 15 years or so, software companies, especially Microsoft and their OS, have tended to simply rely on hardware getting exponentially more powerful.

As a result they've bloated their software with countless inefficient applications and far from optimized code.

This is why your computer is like 50 times faster than in the 1990s, yet the operating system takes longer to load than ever before. A lot of other software is the same way.

The other thing is that once PCs and smartphones stop being twice as good every 18 to 24 months, people will only buy new ones when the old ones break down. This means revenues to hardware R&D and manufacturing firms will be greatly decreased, which may create problems in funding next-generation technologies such as photonics or spintronics.

The good news: in 2020, your smartphone will have as many processors and as much RAM as an Intel server does in 2012.

This is great work that will offer insight into devices of the future and the rules governing them, but it will not itself usher in a new era of devices; perhaps for niche applications, but not for the masses.

For any of us to see this work in our own personal devices requires this technology to be mass-produced, and if you're using an STM, that does not even approach mass production; it's a few orders of magnitude below even MBE.

Well, with or without one-molecule transistors, that will only make a plus-or-minus 18-month difference in the moment we really see Moore's Law stop.

Moore's Law does not have to stop at any point, at least not necessarily at the one-atom-transistor level. Moore's Law is an economic law, not a scientific theory. The industry has to fulfil the 18-months/double-performance rule, and its research is planned to achieve that mark. If they go a bit slower, their revenues will decrease; going much slower means they go bankrupt. In any case, making smaller transistors is only one of the possible ways to double the performance/cost of a device (although it is the most famous one). Reducing manufacturing costs is another way. Using 3D layers of transistors rather than 2D is another method in development. And today, the cost/performance ratio usually depends on other factors like battery, screen, and weight, which have huge headroom to improve; transistor size is not so critical.

You could then build nanosized computers that you take as a pill to augment your brain. The nanosized computers would travel to the brain and connect to the neuron network. They would then help the brain access information not stored inside its neurons (access the internet).

A key piece of information was left out of this article. The device operates at millikelvin temperatures (liquid-helium cooling). This makes it unlikely for civilian applications.

"I was rather hoping we'd go photonic" This is not practical due to the wavelength of photons that are non-ionizing. However, there was some interesting new work presented using the wave nature of an electron in graphene (It was in a recent issue of IEEE spectrum).

Hardware is by far not the only place improvements in computing will come from. Just because they achieve the smallest architecture doesn't mean its implementation will be the best. They have used the same building materials for centuries, yet I would not dare to say the best "house" that can be built from them has been built. This just means future improvements will have to come from other sources, such as software, materials, and architecture. A lot of today's computing ability has not come from hardware improvements. It has come from mathematics.

"Alot of todays computing ability has not come from hardware improvements. It has come from mathematics."

Try watching a full-HD movie on a 2-3-year-old PC: you can't. Most new PCs can cope with full-HD movies mainly because of better hardware. Once you change the hardware, the same software runs like hell.

One of the biggest factors besides bloated software hurting the performance of our computers is the CPU clock speed and the speed of the external DRAM.

If we can put enough memory on the CPU chip, running at CPU clock speeds, we can get rid of all the level 1 and 2 cache memory and all the logic involved to load and flush the caches, not to mention the branch prediction hardware.

The room taken on the die and the power consumed by all this circuitry is all useless overhead needed only because DRAM is so slow.

If the CPU was faster, using multicore CPUs wouldn't have any advantage. If all the space consumed by the multiple CPU cores and cache memory subsystems was replaced with single-transistor high-speed memory, our CPUs would be faster, lower-power, and less expensive.

"Alot of todays computing ability has not come from hardware improvements. It has come from mathematics."

Try watching a full-HD movie on a 2-3-year-old PC: you can't. Most new PCs can cope with full-HD movies mainly because of better hardware. Once you change the hardware, the same software runs like hell.

Crack's bad, mmkay.

Maybe if you had said 10 years it would be more believable.

About the other part, hardware IS the key component that is required to push the envelope further, but software is the main driver, patiently waiting.


Well I'm waiting for when they can send information back in time, so a chip can have the computations done at the time they are requested. Instantaneous computation. Yeah I know it involves technology that doesn't exist yet, but I do remember reading about a bloke trying to achieve time travel with a circulating light beam and passing an electron through it. Not sure what happened to him. Maybe he's time travelling!!! :-)
