Posted
by
Soulskill
on Wednesday March 12, 2014 @01:55PM
from the marble-based-computation dept.

SpankiMonki writes "Joshua Turner, a physicist at the SLAC National Accelerator Laboratory, has proposed using the orbits of electrons around the nucleus of an atom as a new means to generate the binary states used in computing. Turner calls his idea orbital computing. Turner points to recent discoveries (including a new material that allows rapid switching of its electron states and new low-power terahertz laser technology) that could lead to the development of a computer with vastly improved performance over current technologies."

The catch is that to generate a tight enough pulse of sufficient intensity to do this, you need an accelerator two miles long. But if you manage that, you can switch electron states 10,000 times faster than transistor states can be switched.

Ray will be right eventually, but he is off on his time scales by a wide margin. For one thing, in his estimates he adheres to the transistor = neuron fallacy. He then builds on this fallacy to estimate a time when the number of transistors on a chip will equal the number of neurons in the human brain. We are already at hundreds of millions of transistors on our chips!! And the human brain only has about 20 billion neurons!! We aren't that far away!!! [HEAVY BREATHING]

For one thing, in his estimates he adheres to the transistor = neuron fallacy.

To be fair, a digitally-switching transistor is almost infinitely simpler than a neuron, but you could make the argument that a transistor configured in analog mode that summed several inputs and acted as a decision maker is much closer to a neuron. The trick is getting all of those transistors working together in some sort of "analog computer" fashion, as the brain's network reconfigures itself quite a bit, which is a lot harder to achieve at billion-scale on a die.
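The "analog summing" transistor-as-neuron idea above can be sketched in a few lines: a unit that sums weighted inputs and fires when the total crosses a threshold. The weights and threshold here are made-up illustrative values, not anything from the article.

```python
# Minimal sketch of a summing "decision maker" unit, perceptron-style.
# Weights and threshold are illustrative, not physical transistor parameters.

def analog_neuron(inputs, weights, threshold):
    """Return 1 if the weighted sum of inputs crosses the threshold, else 0."""
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Three inputs: two excitatory weights, one inhibitory.
print(analog_neuron([1.0, 0.5, 1.0], [0.6, 0.8, -0.4], threshold=0.5))  # fires -> 1
```

The hard part the comment points at is not this unit, it's rewiring billions of them on the fly.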

Who's got the money to pony up for some experimental fab runs for billions of transistors with a reconfigurable mesh network? This is basically an Intel i7 fab process we're talking about here, so think beeeeelions of dollars.

You don't need your own dedicated fab, you just need your own masks. Those will run you on the order of $100-150k per layer (and a modern CPU like the i7 has around 20 layers).
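The back-of-envelope arithmetic behind the "few million vs a few billion" point, using the per-layer figures quoted above:

```python
# Mask-set cost estimate from the figures above: $100k-150k per layer,
# roughly 20 layers for a modern CPU. Purely illustrative arithmetic.
layers = 20
low, high = 100_000 * layers, 150_000 * layers
print(f"mask set: ${low:,} to ${high:,}")  # mask set: $2,000,000 to $3,000,000
```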

Still not cheap, but a few million vs a few billion means the difference between "not gonna happen" and "bo


Using human neurons as a model for the future of computing might not be the utopia that we are all dreaming of....

You're making up numbers. We've had billions of transistors on chips for some time now. The Xbox One's main chip has five billion transistors. And that's just one chip. The Titan supercomputer has nearly 200 trillion transistors.

If the transistor doubling time remains about the same, you can equate any number of transistors you like to a neuron and Kurzweil's prediction still won't be off by much. Such is the nature of exponential curves. Sophisticated objections to his predictions don't involve transistor counts.
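The exponential-curve point can be made concrete: if you decide a neuron is "worth" k transistors instead of 1, the crossover with ~20 billion neurons shifts by only log2(k) doubling periods. The values of k below are arbitrary illustrations, not claims about real neuron complexity.

```python
import math

# Requiring k transistors per neuron delays the transistor/neuron crossover
# by log2(k) doubling periods -- a constant shift on an exponential curve.
def extra_doublings(k):
    return math.log2(k)

for k in (1, 100, 10_000):
    print(k, round(extra_doublings(k), 1))  # 1 -> 0.0, 100 -> 6.6, 10000 -> 13.3
```

Whether a dozen doubling periods counts as "not off by much" is, of course, exactly the argument.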

Nobody knows how much of a neuron you need to build a brain. If you actually have to simulate it, possibly at the quantum level, then no number of transistors may be sufficient. You can probably get around that problem by not using regular transistors though. Sufficient artificial neurons might actually be easier to build - noise and interference are probably not as harmful as they are in regular computing, and may actually be beneficial.

I admit you got me at first. I guess I was never a fan of people determined to turn science and technology into religions. Those topics are already cool enough as they are. Plus there are enough faith-based alternatives for that kind of thing if it feels like it's something you need in your life.

It's just a desire to have something to take the place of what the faithful crowd use some omnipotent god for. All over a tool that can do pointless drudgery work quickly and efficiently so that us humans can spend our time

Actually, it could prove to be radically different from current computers/computing. Almost all current computers are based on binary logic: your bit is either on or off. Electrons can actually have several orbital states, so it is possible that computing could be approached in a different manner. This assumes that logic could actually be performed with the orbital states and it's not just a bit store. All of this is quite a long way off though; per the article, you currently need a two-mile-long accelerator to change the orbital state of an electron this accurately.

Hmmm, I'm not so sure. Unless I'm missing something in the article the proposal does not offer anything new toward quantum computing. The advantages listed are the ability to switch electron states very quickly to improve RAM speeds and being able to read the spin of electrons - both without requiring excessive power to drive it.

I'm not sure how quantum computers compare to TMs. After some quick browsing it looks like they don't have the computational speed potential of the (only theoretical) non-deterministic Turing machine.

You did misread the article. They're not proposing it as a quantum computing solution, nor are they proposing to improve RAM speeds by using electron spin. They're proposing to use the electron orbital state to store information. Currently a charge (multiple electrons) is used to store one bit. This solution would allow one single electron to store one or more bits. This could be used to produce faster storage but it has other applications as well, such as faster switching logic. The end result woul

In your first reply you mentioned that computers are based on binary logic - on or off. I thought you were getting at quantum computing where you can have a combination of the two.

From the article - "One is the discovery of a material that allows electrons to switch states really quickly that could improve magnetic random access memory speeds by a factor of thousand." So, yeah, that's essentially what I said.

If the difference is that a single electron can store one or more bits then this is definitely equiva

I read it that even if the orbital states ain't the variable, the fact that there are 8 electrons in the outermost shell enables a byte to be stored per atom. On a computational level, instead of doing binary arithmetic, one would now be doing base-256 arithmetic, where there would be 256 states in all. Although given how well binary has worked in creating all the bases we have (Hex, Octal and others), the best would be to leave the computational base @ 2, but use the valence electrons to store an entire byte.

I read it that even if the orbital states ain't the variable, the fact that there are 8 electrons in the outermost shell enables a byte to be stored per atom.

Wouldn't that only allow storing three bits, not eight? You can't tell which of the eight electrons are in the outermost shell, just how many there are, so the possible values are 0-8, not 0-255. Nine unique states gives you three bits plus one state left over.
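The counting in the reply above can be checked directly: nine distinguishable occupancy counts (0 through 8 electrons) give floor(log2(9)) = 3 whole bits, with one state unused.

```python
import math

# Occupancy counts 0..8 in the outer shell = 9 distinguishable states.
states = 9
whole_bits = math.floor(math.log2(states))
print(whole_bits)       # 3
print(2 ** whole_bits)  # 8 states actually used; 1 of the 9 is left over
```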

Quantum computers already eschew binary thinking with the way that they manage their data, but they are still simply Turing Machines, albeit theoretically much faster Turing Machines. But given enough time and memory, a classical computer is capable of perfectly simulating a quantum computer, and at least based on the summary, it sounds like the same would be the case here.
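The "classical machine can simulate a quantum one" claim can be shown in miniature: track a single qubit as two complex amplitudes and apply gates as 2x2 matrix multiplies. This is a toy, deliberately inefficient; the catch is that the state vector doubles with every added qubit.

```python
import math

# One qubit = two amplitudes; a gate = a 2x2 matrix applied to them.
def apply_gate(gate, state):
    (a, b), (c, d) = gate
    s0, s1 = state
    return (a * s0 + b * s1, c * s0 + d * s1)

h = 1 / math.sqrt(2)
H = ((h, h), (h, -h))          # Hadamard gate

state = (1.0, 0.0)             # |0>
state = apply_gate(H, state)   # equal superposition
state = apply_gate(H, state)   # H is its own inverse: back to |0>
print(abs(state[0]) ** 2)      # probability of measuring 0, ~1.0
```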

This may be something neat, but unless it offers something more than a new way to represent bits, it won't mean that we can solve new s

It's currently possible to store more than one bit. It's done in MLC [wikipedia.org] flash. It's not worth doing with normal logic, but it's certainly possible. There's no reason to believe storing more than one bit per orbit would be worth doing here either, but it's so theoretical, it's hard to say much of anything predictive. Once you get beyond a single bit, then you have issues with sensitivity to certain thresholds. It's generally better to keep things simple. Simple is usually more reliable and faster.
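Both halves of that comment, multi-level storage and the threshold-sensitivity worry, can be sketched at once: quantize one analog cell level into four ranges (two bits). The threshold values are illustrative, not real flash voltages.

```python
# MLC-style multi-level sensing sketch: one analog voltage -> 2 bits.
# Thresholds are illustrative. Note how a value sitting near a threshold
# flips on a little noise -- the sensitivity issue mentioned above.

def read_cell(voltage, thresholds=(1.0, 2.0, 3.0)):
    """Map an analog voltage to a level 0..3 (two bits) via threshold compares."""
    return sum(voltage >= t for t in thresholds)

print(read_cell(0.4))   # 0
print(read_cell(2.5))   # 2
print(read_cell(1.99))  # 1 -- but 0.02V of noise would read as 2
```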

Actually, it could prove to be radically different from current computers/computing

Yes, but not in the way the GP was hoping (barring a major breakthrough in mathematics/theoretical computer science)

All computers (even quantum computers) are basically the same. They are all Turing Machines. Some are just much faster than others. This machine won't be radically different, regardless of what the hardware is.

Car analogy: If existing PCs are gasoline-driven cars, the GP was hoping for an airplane. What t

So we can switch states really fast, which is excellent, but how fast is our observation? If the observation needs to be made in order to switch to the next gate then we have our bottleneck. The article was sparse on details and didn't seem to answer this question.

In theory these systems could be great. What I worry about is if they will be stable enough.

Of course, this is using orbitals, which are generally more stable with regard to electrons and their speedy existence. I don't think they decay spontaneously, do they?

With all these ideas, it makes me wonder which one is going to come first: this, optical computing, quantum computing, superconductive computing, ternary computing, or others. I'd love to see ternary, personally
