Posted
by
samzenpus
on Wednesday October 22, 2008 @05:57PM
from the I-miss-the-original-three dept.

rennerik writes "Scientists at McGill University in Montreal say they've discovered a new state of matter that could help extend Moore's Law and allow for the fabrication of more tightly packed transistors, or a new kind of transistor altogether. The researchers call the new state of matter 'a quasi-three-dimensional electron crystal.' It was discovered using a device cooled to a temperature about 100 times colder than intergalactic space, following the application of the most powerful continuous magnetic field on Earth."

Moore's Law is about manufacturing on silicon. If it isn't silicon, then it isn't Moore's Law. Remember, kids: increasing processor speed is a by-product of Moore's Law; Moore's Law is about the cost of manufacturing.

100 times colder than 0 K? So, that's what, 0 K? Why not make it 1000 times colder?

(Yes I know space is slightly warmer than absolute zero, but it's still a really weird claim to make - we are only talking about a couple of degrees here)

Also, am I the only one who, upon hearing "discovered a new state of matter", doesn't immediately think "Sweet, we can extend Moore's Law!", but rather "Holy shit, a new state of matter?" Seems like a pretty big discovery on its own, even without being tied to chip manufacturing...

On a nerd side note: we all know dilithium in reality is a gas. But at the temperatures stated in the article, would it be able to form a solid? Likely it would NOT be a crystal, but it'd be fun to know.

The laws of thermodynamics state that we can't really achieve absolute zero [wikipedia.org]. As far as the far reaches of space go, they may be referring to the Boomerang Nebula, which is the coldest place we know of so far, outside of the laboratory. I wish the article had been more specific and quantitative. FYI, a really good program to watch if you get a chance is Absolute Zero [pbs.org].

Really, it's about cost. And the paper Moore's Law comes from is about economics, so no, changing it to be about processing speed is still incorrect. However, it does neatly deal with multi-cores.

The fab costs, at this time, for the next round of transistor doubling are pretty huge. When they have to toss 4 out of 5 wafers, the cost to the consumer may become prohibitive. No doubt large organizations will continue upgrading.

From what I've been reading, and from talking to people in the fab industry, we will reach a state where systems get much 'faster' only by using tools and techniques that are hugely expensive to operate, such as needing a super-cooled room and a large magnet.

So large organizations will have computing power that outstrips the home PC.

I would have said the 'Average Joe's' PC, but that guy has turned out to be a liar and a stooge.

Since there are already numerous posts invoking the applicability (or not) of Moore's Law, I thought I would start over. Although Gordon Moore certainly formulated his law based on silicon (original is here: http://www.intel.com/technology/mooreslaw/ [intel.com]), it can be applied clear back to 1890, with the Hollerith 'computer' that tabulated the 1890 census. When you graph it out, Moore's Law applies to electro-mechanical switches, then to relays, then to vacuum tubes, then to discrete transistors (like in a six-transistor radio of the '50s), then on to silicon. It's still the same exponential curve, in five separate stages, only the last one of which is silicon. Kurzweil discusses this in depth here: http://www.kurzweilai.net/articles/art0134.html?printable=1 [kurzweilai.net]. People who claim Moore's Law doesn't apply because this isn't traditional silicon acreage are missing the point, which is that not only is Moore's Law more encompassing than originally envisioned, it is not going away any time soon. The imminent death of Moore's Law, as always, has been greatly exaggerated.
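The exponential character is easy to see numerically. A minimal sketch, assuming the common textbook parameters (Intel 4004, ~2,300 transistors in 1971, doubling roughly every 2 years — my numbers, not from the post):

```python
# Moore's Law as a simple exponential: count(t) = count_0 * 2 ** (years / doubling_period)
def projected_count(start_count, start_year, year, doubling_years=2.0):
    """Project component count assuming a fixed doubling period."""
    return start_count * 2 ** ((year - start_year) / doubling_years)

# Projecting the 1971-era 4004 forward to 2008: 37 years is ~18.5 doublings,
# a factor of roughly 370,000x.
print(f"{projected_count(2300, 1971, 2008):.2e}")  # ~8.5e8 transistors
```

The curve doesn't care what the switch is made of, which is exactly the point the comment above is making.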

The thing that Intel is 'best' at is providing massive amounts of processing power at low price points, and increasing that power over time (they have also chipped away at watts consumed, but that has largely been a side effect of their performance-per-dollar obsession).

The number of people that have tasks that are best run on a single core and are not fast enough is getting smaller and smaller. That means that the benefits of making a single core faster are smaller and smaller (and thus the amount of money available for faster and faster cores is also smaller).

Combine those two concepts and you realize that Intel has pretty much competed itself into a situation where all it faces are decreasing margins. If fab costs really are going up, then it faces huge margin pressure (because fewer and fewer people are interested in paying more and more to upgrade machines that are already fast enough).

A perfect vacuum is theoretically impossible due to quantum mechanics (I cannot explain why, but it makes sense).

For any given particle, you can't know both its exact position and its exact velocity. Particles can never reach absolute zero, because then you would be able to determine their position: by definition they would have no energy at absolute zero, so you would know their velocity is exactly zero. By extension, if you know a particle's velocity exactly, you can never determine its position. And if you can't determine its position, you can't determine whether it is really outside a vacuum. You may be able to say it isn't in the middle of the volume that represents the vacuum, but at the boundary you can't say for sure whether the particle is inside the vacuum or outside. This is Heisenberg's uncertainty principle, and it's why an absolute-zero-temperature vacuum is impossible.
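To put a number on that hand-waving: the uncertainty relation Δx·Δp ≥ ħ/2 means the more tightly you pin down position, the larger the minimum spread in velocity. A quick sketch (the 1 nm confinement length is just an illustrative choice, not from the article):

```python
# Heisenberg uncertainty: dx * dp >= hbar / 2, hence dv >= hbar / (2 * m * dx)
HBAR = 1.054571817e-34      # reduced Planck constant, J*s
M_ELECTRON = 9.1093837e-31  # electron mass, kg

def min_velocity_spread(dx_m, mass_kg=M_ELECTRON):
    """Minimum velocity uncertainty for a particle confined to a region dx metres wide."""
    return HBAR / (2 * mass_kg * dx_m)

# An electron confined to 1 nm has a velocity spread of at least ~58 km/s,
# so it can never sit perfectly still -- i.e. it can never be at 0 K.
dv = min_velocity_spread(1e-9)
print(f"minimum velocity spread: {dv:.3g} m/s")
```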

Intergalactic space is not at 2.7 K. Especially in galaxy clusters, the temperature of the intergalactic medium is often millions of kelvins. Even in more remote places, far from galaxy clusters, it's still much warmer than 2.7 K. The 2.7 K figure is the temperature associated with the cosmic microwave background radiation, not the intergalactic medium.

I'll bet extra-galactic space (beyond the furthest galaxies, beyond the fastest moving elements from the big bang) is colder, possibly even a vacuum if you go far out enough. (Actually, I was just there the other day, it was nice because there was no pet hair or dust bunnies.)
Isn't it kind of mind-blowing that according to big bang theory everything is contained within nothing?

Which is in part a fallacy: because of the low density of particles, heat is very, very difficult to get rid of. Yes, decompression will act as a high-speed coolant, but only for a fraction of a second, and only for sealed pressurized containers. A large solar panel getting hit by the sun would melt in hours from all the heat building up in the silicon; this is why satellites have special cooling systems designed to operate in the vacuum of space. There was a special instrument designed to measure the temperature of the gas that surrounds the Earth at the altitudes satellites travel at, and the temperatures are very high, because the heat has nowhere to go. A vacuum is a near-perfect insulator, as is evidenced by the thermos.

Yes, heat does radiate, but the rate depends on temperature, and radiation is only so efficient; it was a major breakthrough getting solar panels to work in space and not melt down.
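Radiative cooling is governed by the Stefan–Boltzmann law: power radiated scales as T⁴. A back-of-the-envelope sketch, assuming an idealized blackbody sphere in sunlight at Earth's distance (my simplification, not from the thread):

```python
# Equilibrium temperature of a blackbody sphere in sunlight:
#   absorbed power = F * pi * r^2               (cross-section facing the Sun)
#   radiated power = sigma * T^4 * 4 * pi * r^2 (entire surface radiates)
# Setting them equal: T = (F / (4 * sigma)) ** 0.25  (radius cancels out)
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)
F_1AU = 1361.0          # solar constant at Earth's distance, W / m^2

t_eq = (F_1AU / (4 * SIGMA)) ** 0.25
print(f"equilibrium temperature at 1 AU: {t_eq:.0f} K")  # ~278 K
```

Real spacecraft sit hotter or colder than this depending on coatings and geometry, which is exactly why thermal design in vacuum is hard: radiation is the only exit for heat.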

Of course, my entire argument only applies near stars. Anywhere in deep space, away from stars, space is very, very cold, because the amount of heat from stars falls off with the square of the distance (an inverse-square law, not an exponential one).
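The inverse-square falloff is easy to see numerically: flux = L / (4πd²). A quick sketch comparing the Sun's flux at Earth's orbit and out at Neptune's (the distances and luminosity are standard reference values, not from the thread):

```python
import math

L_SUN = 3.828e26  # solar luminosity, W
AU = 1.496e11     # astronomical unit, m

def flux(distance_m, luminosity_w=L_SUN):
    """Radiant flux at a given distance from a star: inverse-square law."""
    return luminosity_w / (4 * math.pi * distance_m ** 2)

print(f"at Earth (1 AU):    {flux(AU):7.1f} W/m^2")      # ~1361 W/m^2
print(f"at Neptune (30 AU): {flux(30 * AU):7.2f} W/m^2")  # 900x weaker, ~1.5 W/m^2
```

Going 30x farther out cuts the flux by 30² = 900x, which is why the outer solar system, let alone interstellar space, is so cold.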

What implications does this have for the Big Bang? I assume that before the Big Bang, space was colder, thus opening the door for the creation of this type of matter before the universe heated up. Does this have implications beyond computing?