
stupendou writes "Australian and American physicists have built a working transistor from a single phosphorus atom embedded in a silicon crystal. The group of physicists, based at the University of New South Wales and Purdue University, said they had laid the groundwork for a futuristic quantum computer that might one day function in a nanoscale world and would be orders of magnitude smaller and quicker than today's silicon-based machines."

That would have been well over 30 years ago, since 1500nm was reached in 1982 and 800nm in 1989.
The process size is virtually a straight line on a log10 scale. Going on the last 40 years, we'll be at 1nm by 2030. It's an order of magnitude every 10 to 15 years.
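As a back-of-the-envelope check on that trend, here's a minimal Python sketch. The 1982/1500nm and 1989/800nm points are from above; treating the 22nm "right now" node as 2012 is my own assumption, and the least-squares fit is just an illustration of the straight-line-on-log-scale claim:

```python
import math

# Rough data points from the thread: (year, process size in nm)
# 2012 -> 22 nm is an assumption about when "right now" was.
points = [(1982, 1500.0), (1989, 800.0), (2012, 22.0)]

# Ordinary least-squares fit of log10(size) versus year
xs = [year for year, _ in points]
ys = [math.log10(nm) for _, nm in points]
n = len(points)
xbar, ybar = sum(xs) / n, sum(ys) / n
slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
        sum((x - xbar) ** 2 for x in xs)
intercept = ybar - slope * xbar

def projected_nm(year):
    """Process size predicted by the straight-line (log scale) trend."""
    return 10 ** (slope * year + intercept)

years_per_decade = 1 / abs(slope)   # years per order of magnitude, ~16 here
print(projected_nm(2030))           # lands in the low single-digit nm range
```

With these three points the fit gives roughly one order of magnitude every 16 years and a 2030 projection of a nanometer or two, consistent with the eyeballed figures above.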

That was my point - back then creating working dies at 22 nm, which is as good as we can do right now really, would have been laughed at by some. "That's only 100 or so atoms! Good luck!"

The team doing this has demonstrated that they can be much more accurate with single atom placement than in the past, so I don't doubt we'll be building at the single atom scale in mass production eventually, and probably within my lifetime easily.

It should be interesting to see what happens in the next few years, since following the trend now leads to subatomic 200pm process sizes in 20 years or so. Apparently the current lithography technology has limits around 10nm.

No. The limit is a single atom. Not unless someone comes up with a way of making a transistor out of free quarks. We'd have to have some sort of breakthrough in physics to do that and that's not even on the horizon yet.

theoretical (we are not even here yet) -> empirical -> demo devices -> prototype devices -> production/commercial devices

It should be interesting to see what happens in the next few years, since following the trend now leads to subatomic 200pm process sizes in 20 years or so. Apparently the current lithography technology has limits around 10nm.

I'm not going to get behind the line on this one; I always seem to get lazy, and then lose out. Not this time, by gods! I'm placing my hold on "The Young Man's Illustrated Primer" at the library right now!

Ten years from now, who's to say we won't be able to mass produce them?

It is a pretty big jump from building a single demonstration / proof-of-concept device to connecting and integrating it into a design that works reliably at speed. IBM seems to be getting some interesting results with a single atom DRAM [eetimes.com], but that is still way closer to a laboratory curiosity than an option for shipping silicon.

But that is just the Fab side of things. To actually design and build chips with this sort of technology is almost certainly going to require some serious upgrades to EDA tools.

That's what I said... that we've been able to build these things for ten years. As the article explains, the big difference here is the precision of the placement of the atom, making the devices much more manufacturable (though not on a mass-scale, of course).

And yes, there are other steps involved in making actual devices. But we don't have to work in a single pipeline. As the process engineers get closer to making this sort of thing mass producible, the software engineers will be simultaneously upgrading the EDA tools, and the design engineers will be thinking of ways to use this new device. It'll go into high-price, low-yielding devices at first. Probably military tech, or cutting-edge instruments for physicists. Those pilot projects will be used to refine the design tools, tune the process, and maximize the yield.

It'll be quite some time before they reach consumer electronics, if they ever do, but I wouldn't toss them aside as non-manufacturable.

That's what I said... that we've been able to build these things for ten years.

Yes, and this is what you didn't say: today we are still pretty much at the level of demonstrating / playing / investigating.

Did you lose interest after getting to the end of what you wrote?

Did you read how they did it?

The scientists placed the single phosphorus atom using a device known as a scanning tunneling microscope. They used it to essentially scrape trenches and a small cavity on a surface of silicon covered with a layer of hydrogen atoms. Phosphine gas was then used to deposit a phosphorus atom at a precise location, which was then encased in further layers of silicon atoms.

We were making single atom transistors ten years ago, but it was hit or miss whether the atom would end up in the right place.

Today, we can place the atom with high precision, in silicon, so that the devices can be made reliably.

Ten years from now, who's to say we won't be able to mass produce them?

Wasn't aware of such progress. Do you have some citations I could examine? I'm aware we can "see" individual atoms using scanning tunneling microscopy, and even manipulate them a bit. Thank you very much.

Ten years from now, who's to say we won't be able to mass produce them?

A little-known fact about Moore's law. People usually don't know this, but Moore's law is actually an inverted bell curve, so a few years from now, circuits will actually start to grow bigger and bigger every year. In the future we will have computers as big as mountains to perform the simplest tasks. Unfortunately, the bottom of this bell curve occurs at the same time as the end of the Mayan calendar, so not too many people will be around to worry about it.

Massive nano-scale manufacturing is much closer to reality than you seem to assume. Look into it. No spoon on hand for me to spoonfeed right now, sorry.

Did you read how they did it?

The scientists placed the single phosphorus atom using a device known as a scanning tunneling microscope. They used it to essentially scrape trenches and a small cavity on a surface of silicon covered with a layer of hydrogen atoms. Phosphine gas was then used to deposit a phosphorus atom at a precise location, which was then encased in further layers of silicon atoms.

Does that seem like a scalable process to you? Here is what the article says:

MEMS devices [youtube.com], in contrast to nanoscale devices, are having a huge real-world impact today and have been for some time. Nanomaterials are having an impact. Nanodevices... it looks to me like lots of laboratory work, lots of interesting projects, some fantastic demonstrations, but not much being manufactured or shipping as product.

As for the single atom transistor - interesting demonstration that is necessary for the development of future devices, but not even close to being manufacturable on any real scale.

You sound exactly like every naysayer we've had at every stage of progress. Just because _your_ imagination can't grasp how single atom transistors can be mass-produced doesn't mean _nobody_ ever will. If the history of our species teaches us anything, it's that once we've conceived of something as possible, it becomes a matter of when, not if. Or maybe you'd rather be stuck with banging rocks together?

With transistors that small, how would you harden a microchip against radiation? Would the extra redundancy required not defeat the purpose? That is to say, is there an optimal compromise between transistor size and the resources consumed through redundancy allocation?

Size is not really the point here as far as I see it (the actual machine running this is actually quite a bit bigger than a classical computer; disclaimer: I have not seen their experiment, but I have seen other ones that go in that direction). The point is that if you want to do quantum computation, you need quantum objects to do your calculations with -- something an atom is, and a huge piece of silicon generally isn't.

Do you need to? Are normal computers radiation hardened? I realize there are situations that do call for it, like satellites, but those are always behind the curve technology-wise due to the extra requirements of a harsh environment, so there's no problem here: they could keep using more classic lithography.

Also you could just encase the chip/board/unit/whatever in something to resist radiation. I'm not saying that is a workable solution in all cases, but in many it would be just fine. Just shield the chip and call it good.

Exactly, I have no idea. I thought all modern CPUs relied on some form of ECC correction, certainly for the L1 and L2 cache at least. It's also why server memory uses ECC. IMHO, all computers and handheld units should employ error correction as well.

Google [arstechnica.com] performed a two-and-a-half-year study of this topic. Worth reading, as I'm sure it can be applied to CPUs if not transistor technology overall.
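For anyone curious how that kind of error correction works, here's a toy sketch of a Hamming(7,4) code, the classic single-error-correcting scheme that SECDED ECC memory builds on. This is an illustration of the general idea, not the exact code any particular CPU cache or DIMM uses:

```python
def hamming74_encode(data):
    """Encode 4 data bits into a 7-bit codeword (p1 p2 d1 p3 d2 d3 d4)."""
    d1, d2, d3, d4 = data
    p1 = d1 ^ d2 ^ d4   # parity over codeword positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4   # parity over codeword positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4   # parity over codeword positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(codeword):
    """Fix up to one flipped bit, then return the 4 data bits."""
    c = list(codeword)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3   # 1-based position of bad bit, 0 if none
    if syndrome:
        c[syndrome - 1] ^= 1          # flip the erroneous bit back
    return [c[2], c[4], c[5], c[6]]
```

Flip any single bit of `hamming74_encode([1, 0, 1, 1])` and `hamming74_correct` still recovers `[1, 0, 1, 1]`; real SECDED memory adds one more overall parity bit so that double errors are at least detected.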

Yes. They are hardened against the normal background radiation that is ubiquitous. That's why there's more-or-less a minimum amount of energy required to change a single storage bit; otherwise it gets flipped too easily by a stray alpha decay from the chip's packaging. We entered the era where packaging is made from low-radiation materials some time ago to help with this, but it only helps, since existence here on Earth is bathed in a certain level of radiation.

That isn't to say normal chips are hardened against abnormal levels of radiation, but they most certainly are designed with a given level of anticipated background.

Yes, but most useful circuits don't have just one transistor, do they? Modern processors have on the order of a billion transistors. By the time we can manufacture single-atom-transistor chips, they'll probably have well over 10 billion. Ionizing radiation will be a hurdle to overcome.

Normal computers aren't radiation hardened, but the point is that they store and process information based on more than just the quantum state of a single particle. It takes a great deal more unwanted energy to cause them to flip to the undesired state. This kind of thing would be many times more vulnerable to stray radiation, heat, or stray electromagnetic fields than the smallest conventional transistor.

But any practical computer is going to have to contain millions of these things. If you want to carry out a computation with such a machine, either you have to protect it with conditions that have a minimal chance of causing a computational error or you will have to engineer it with redundancy and error correction mechanisms that may in the end be bigger and less reliable than a classical solution.
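The redundancy-versus-reliability trade-off raised here can be made concrete with a toy model: triple modular redundancy with a majority voter. The per-device failure probabilities below are purely illustrative assumptions:

```python
def tmr_failure_prob(p):
    """Chance a triple-modular-redundancy majority voter gives the wrong
    answer, with three independent copies each failing with probability p:
    at least two of the three copies must fail."""
    return 3 * p**2 * (1 - p) + p**3

# With a tiny per-device upset probability, triplication wins enormously...
p_single = 1e-6
p_voted = tmr_failure_prob(p_single)   # on the order of 3e-12

# ...but it costs more than 3x the transistors, and for unreliable enough
# parts (p > 0.5) voting actually makes things worse, so there really is
# an optimum between device size and redundancy overhead.
```

This is exactly the shape of the compromise asked about above: shrinking the transistor raises p, while adding redundancy multiplies the transistor count.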

How can they say it's not really silicon based? I'm assuming that at least part of what gives the phosphorus its transistor-like abilities is the fact that it's embedded in the silicon in the first place? Or am I missing something?

Pretty much - that's how transistors work. The phosphorus has an extra electron (compared to the silicon) and the combination forms an extrinsic semiconductor, which you then use to make junctions and transistors and diodes etc.

Just having the phosphorus atom isolated doesn't do much for you, so I think the article is referring to "silicon based computers of today" without really thinking about it properly - silicon is already an intrinsic semiconductor, but you still need to dope it to make it useful for making computer chips.
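A rough sketch of what that extra electron buys you, using the standard textbook approximations (full donor ionization and the mass-action law; the concentrations are illustrative, not from the paper):

```python
# Illustrative numbers for phosphorus-doped silicon at room temperature
ni = 1.0e10   # intrinsic carrier concentration of Si, per cm^3 (approx.)
Nd = 1.0e17   # phosphorus donor concentration, per cm^3 (a typical doping)

# Each phosphorus atom donates its extra electron (full ionization):
n = Nd
# Mass-action law n * p = ni^2 gives the now-tiny hole concentration:
p = ni**2 / n
# Electrons outnumber holes by ~14 orders of magnitude: an n-type
# (extrinsic) semiconductor, ready for junctions, diodes, and transistors.
```

The same arithmetic run with Nd = 0 just gives n = p = ni, which is why undoped silicon on its own is not much use for building logic.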

The system is indeed silicon based. P impurities are placed with atomic control on top of a silicon surface in very controlled patterns. Wires conducting carriers to the single impurities are made with the same process. Gates modulating the transistor are also made with the same process. The physics of the device operation is a bit different from the room temperature operation of the typical impurities being referred to below. Here the single impurity acts like a quantum dot, like an artificial atom that

It's possible to envisage people complaining about their googleplex NAND drive hitting the "atom" size limit in a few years' time when Moore's law "rice board puzzles" up the spec. "The hologram shooter is just not realistic unless you calculate the position and momentum of every single molecule." Impressive...

(also, for the record, not that it matters, I'm a scientist. I am well aware that 22/7 is a rational number, as is any representation of Pi that I write down (short of the actual symbol itself, or the definition in words). I didn't think it was necessary to qualify that I knew that fact for a simple wordplay joke on the name of the company and a common shorthand representation of that number.)

I'm surprised you didn't take offence at my suggestion that they only sell a n

I hate being a nay-sayer, but the NYT article is making quite a spectacle about this whole thing. What the group has truly done is demonstrate a novel method for placing a single phosphorus atom within silicon and proceeded to measure the semiconducting properties of the resultant device with quite good precision. Because the doping is the result of a single atom, they can resolve more than just "on" and "off", and in fact can read three states from it, so it gets its quantum computing title.

As a materials scientist, I'm worried that they don't show any long-term data and all their results appear (from my not-so-thorough reading of the originating Nature Nanotechnology report) to be based on a single device. How repeatable is this result and how consistent are the signals across multiple devices? How far will the phosphorus atom diffuse over the lifetime of the device? Or even over the first few hours of its operation at room temperature? How closely can these devices be placed to each other on the silicon chip without getting cross-interference or depriving the dopant of its discrete quantum states? The dopants in a normal device aren't too terribly close to each other. And finally, how big must the surrounding structure be?

Don't get me wrong, this is excellent science and well deserving of its publication in such a prestigious journal, but the spectacle that the NYT is creating around this and the dreams of such a tiny device is a bit premature.

The devices built in this form have been tested against temperature cycling. They have in fact traveled across continents for testing and examination. The NY Times accurately reported my qualification that this cannot be mass produced (yet) and is limited to low temperatures. I see no hype in the NY Times story. I am one of the authors of the paper.

Congratulations. Whether it will be used for QC or not (not, I think, but I worked for some time on a competing kind of quantum bit), it's a cool thing, and for sure it is a big step forward which will IMHO influence many devices.

As somebody who has worked on QC, I personally find that every newspaper report mentioning QC carries a certain hype. Usually they make it sound like "this device will go into a working QC" instead of "this device enables us to examine physics never examined before".

They make a transistor from multiple atoms, all of them silicon but one, which is phosphorus. That is NOT a transistor made from a single atom (as the title suggests). Great advance, in any case, but misleading title.

Exactly what I was thinking. To say that the article is lacking in detail is an understatement. All transistors have 3 parts: base, emitter, collector (or source, gate, and drain with FETs, etc.). Clearly, this cannot be accomplished with ONE ATOM.

Also, transistors don't store anything (as in binary 1 for on (charged) and 0 for off (discharged)). That would be a capacitor. Transistors can be used to charge or discharge capacitors, but they do not store energy.

First, some background: most agree that Moore's law, which has held firm, will meet its demise in a matter of decades. This will likely signal the end of the silicon era. The basic problem is the limitation of the ultraviolet lithography process by which a hundred million or more transistors are etched onto increasingly smaller silicon wafers. But another problem is perhaps more daunting: when computing is reduced to smaller and smaller quantum scales (currently, the smallest features on the chip inside your computer are only dozens of atoms across), one runs into the Heisenberg Uncertainty Principle; it simply becomes impossible to tell exactly where an electron is, so there is leakage. In other words, using quantum computers, given contemporary materials and knowledge, 2+2 might eventually end up being 4, but there might need to be built-in recursion and tautological algorithms. Computation using atoms has already been done, as pointed out by another poster. I think it will be a while before we see them at Best Buy. Also, it still seems like silicon based technology

Despite difficulty in following the overall argument given, tunneling leakage already became a significant factor several process generations ago. That was the reason for moving to high-k dielectrics: increasing the dielectric constant of the gate insulator material allows the insulator to be thicker (thus lower incidence of tunneling across the gate) for a given capacitance.
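That high-k trade-off follows directly from the parallel-plate approximation of the gate capacitor. A quick sketch, with dimensions and dielectric constants that are only illustrative (not any specific foundry's numbers):

```python
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def gate_capacitance(k, thickness_m, area_m2):
    """Parallel-plate approximation: C = k * eps0 * A / t."""
    return k * EPS0 * area_m2 / thickness_m

area = (32e-9) ** 2                 # hypothetical 32 nm x 32 nm gate
k_sio2, k_highk = 3.9, 25.0         # SiO2 vs. a hafnium-based high-k oxide

c_target = gate_capacitance(k_sio2, 1.2e-9, area)   # 1.2 nm SiO2 reference

# Thickness the high-k insulator can have at the same gate capacitance:
t_highk = k_highk * EPS0 * area / c_target
# t_highk / 1.2 nm equals k_highk / k_sio2, i.e. a ~6.4x thicker insulator,
# and tunneling falls off exponentially with insulator thickness.
```

The ratio is all that matters here: raising k lets the physical thickness grow proportionally at constant capacitance, which is exactly the leakage fix described above.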

That will seek out and replace missing, damaged and/or defective areas of the brain (stroke/accident), or even gradually replace the entire brain so seamlessly that the individual is completely unaware until they can be moved into an artificial body.

Which will be the in, which the out, and which the gate? Electron in, proton out, and neutron the control? Neutron in, proton out, and electron the gate? Proton in, electron out, and neutron the gate? Will we be able to switch them around for different applications? E-P-N for algebraic computation, for example, N-P-E for reverse Polish, maybe P-E-N for secure applications (or word processing)? And if these trans-atom transistors are installed in quantum applications, will there be E=NP problems?

I don't think it's a MOSFET - I think it's just one of the older JFETs - in this case, the phosphorus seems to suggest that it's an n-channel JFET. I'd say the headline is misleading - only the gate is a single atom, but the rest of it - the silicon - is multiple atoms. But a transistor is not just the gate - it's the gate as well as the source & drain. If it was a single atom, it would indeed beg the questions above that you raised. The neutron would play the same role that silicon dioxide pla

Many dream of tiny nanobots that can swim the blood stream... but as the computers on board get faster and more powerful, there will come a day when one tiny little robot will say, to another tiny little robot, something like, "Do you really care if this clown gets eaten alive by cancer? I mean, what is it to us? We're smarter than he is anyway, shouldn't he be serving us rather than the other way around?"

Then the two tiny robots switch from hunting cancer cells to hunting Purkinje fibers.

As someone who's been routinely getting "-1, Overrated" on many of my posts for about a year, I must say: Do shut up already.

In the time it takes to downmod someone, a few people have seen the opposing post, and likely agreed, or at least posted something in response that's likely to generate more interest in the original. With the high volume of traffic Slashdot gets, even 20 accounts isn't enough to obliterate any opinion to a reasonable degree. One particularly controversial post of mine managed to get every single moderation, before ending up at "+4, Interesting". I had over a dozen "flamebait", "troll", and "overrated" mods.

Mod gaming is a known problem. Slashdot's system is still above average in my opinion, and has the benefit of enough wide participation (and light enough consequences) that it doesn't matter. Sure, it's disheartening to see one of my deeply-thought-out statements misunderstood, but it's Slashdot. It's not like anything said here has a high probability of drastically changing the world.