Yes, quite expensive, I'd wager! They seemed to think that it would fill a niche market where magnetron ovens are too heavy, bulky, or delicate. Think mobile homes or airplanes. Or New York apartments:)

How do you create a CPU from 178 transistors? I'm shocked how low that number is. Is there a template for this? They said it could run MIPS. I've built a CPU out of NAND gates, but it took more than 178, so I'm really intrigued.
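I don't know what microarchitecture they actually used, but one way to get by with so few transistors is a one-instruction-set computer (OISC): a single subtract-and-branch instruction is already Turing-complete, so a tiny datapath can, in principle, emulate a richer ISA like MIPS by taking many cycles per emulated instruction. A minimal SUBLEQ sketch (my own toy illustration, not their design):

```python
def subleq(mem, pc=0, max_steps=1000):
    """Run a SUBLEQ one-instruction-set program.

    Each instruction is three memory cells (a, b, c):
      mem[b] -= mem[a]; if the result is <= 0, jump to c, else fall through.
    A negative jump target halts. One instruction is enough for
    Turing-completeness, which is why a CPU built from very few gates
    can still, in principle, emulate a richer ISA.
    """
    for _ in range(max_steps):
        a, b, c = mem[pc], mem[pc + 1], mem[pc + 2]
        mem[b] -= mem[a]
        pc = c if mem[b] <= 0 else pc + 3
        if pc < 0:          # halt
            break
    return mem

# Tiny demo: add mem[9] into mem[10] via double subtraction through mem[11].
prog = [9, 11, 3,    # mem[11] -= mem[9]
        11, 10, 6,   # mem[10] -= mem[11]  (i.e. mem[10] += mem[9])
        11, 11, -1,  # clear mem[11], then halt (negative target)
        5, 7, 0]     # data: mem[9]=5, mem[10]=7, mem[11]=0
print(subleq(prog)[10])  # -> 12
```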

I also got a laugh out of the technique they used to find the metallic nanotubes: they overvolted the circuit with the good tubes turned off. In the olden days, when we wanted to debug the wiring on a wire-wrap board, the standard procedure was to take a filament transformer and

Making a claim like "one day a computer will be thinner than a human hair!!! OMG it'll be great!!!" will just make you sound like an idiot sooner than you think. Lots of the quotes about computers fitting in single rooms and doing thousands of calculations are just like this.

So which 20 year old device was rendering map data of the entire world in real time, able to rotate, pitch, and pan, keeping labels upright, display in a variety of different styles to highlight different information, etc?

Today, 0.00001% of computers are used for such applications. Not much different from the 0.00000% of computers that were used that way twenty years ago, especially when we are talking about the majority of computing applications, i.e. home computing.

I am typing this on a PPC Mac that I am going to have to abandon soon, even though it does everything I need it to, because no one supports it anymore and things are breaking due to updates.

No, it's called Google Earth (or, before that, NASA World Wind). I had it running on a netbook with software OpenGL. It's available on cell phones too.

As for your PPC, that sucks, but if it's powerful (like a dual-core G5) and you keep it on a recent Linux distro (Debian, Ubuntu?), maybe you'll even see some improvement over time, like being able to view more YouTube videos rendered as H.264 or WebM over HTML5.

I remember reading about an SGI tech demo that did a Google Earth-like thing on high-end SGI hardware with a shit ton of storage. High zoom levels were probably limited to a couple of points of interest. See, it's fast Internet and big storage that enable this application, foremost; the rendering can be done with 90s tech. Earth data is many terabytes.
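Rough back-of-the-envelope on that last point (my own assumptions, not from the SGI demo): Earth's surface is about 510 million km², so even at 1 m/pixel and roughly 1 byte/pixel after compression, a single global imagery layer is already hundreds of terabytes.

```python
# Back-of-the-envelope size of one global imagery layer (assumed figures).
earth_surface_km2 = 510e6      # ~510 million km^2 total surface
pixels_per_km2 = 1_000_000     # at 1 m per pixel
bytes_per_pixel = 1            # rough post-compression figure (assumption)

total_bytes = earth_surface_km2 * pixels_per_km2 * bytes_per_pixel
print(f"~{total_bytes / 1e12:.0f} TB for one global layer")  # ~510 TB
```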

Certain algorithms have gotten more efficient faster than hardware/Moore's.

And many of them have not (perhaps cannot). Software bloat is rarely about choosing fundamental algorithms that are inefficient. Modern programmers have found plenty of other ways to slow software to a crawl.

Computation speed doesn't mean memory size. It's quite possible that compiled yet garbage-collected languages could be made as computationally efficient as C, while having a slightly bigger RAM footprint. So you'd still want to avoid VMs and interpreters to save on CPU time, but no one would worry too much about binary sizes or garbage collection. You'd still need to be precise and avoid leaks, of course... but that just means the people writing the compiler should know what they're doing.

I don't know what that has to do with what the GP said, but it's entirely possible that memristors or RTM will replace DRAM in the future; both consume no power when idle, match DRAM in bandwidth and access latency, and could potentially greatly exceed Flash and DRAM in storage density.

If that comes to pass, then the way we write software ought to change to make use of massive and essentially free LUTs to replace many computational tasks. Also, ideas like separate namespaces for memory and files might need rethinking.
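To make the LUT point concrete (a toy example of my own, not from TFA): with huge, essentially free non-volatile memory you would trade computation for precomputed tables wherever possible, e.g. a byte-indexed popcount table instead of a per-call loop.

```python
# Precompute a 256-entry table once; afterwards each byte's popcount
# is a single memory lookup instead of a loop over its bits.
POPCOUNT8 = [bin(i).count("1") for i in range(256)]

def popcount32(x: int) -> int:
    """Population count of a 32-bit value via four table lookups."""
    return (POPCOUNT8[x & 0xFF]
            + POPCOUNT8[(x >> 8) & 0xFF]
            + POPCOUNT8[(x >> 16) & 0xFF]
            + POPCOUNT8[(x >> 24) & 0xFF])

print(popcount32(0xDEADBEEF))  # -> 24
```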

The most interesting thing about these alternative transistors might be the environmental impact. I'm under the impression that traditional wafer fab is water-intensive and heats and/or pollutes water, and there are dangerous things such as arsenic and bromine involved. If the carbon nanotube process is clean, that'd be awesome. It would be great to think that we could dispose of obsolete technology by incinerating it, releasing nothing other than CO2 into the air and leaving behind slag that's full of recyclable silver and copper.

Asbestos was a problem because it was ubiquitous. Houses were sided with it, attics were insulated with it, pipes were wrapped in it, entire skyscrapers were fireproofed with it, electric motors were covered with it, car firewalls were built of it. CPUs and RAM? Don't see an issue.

I can't remember the book I read this in, but it posited that if you remove the silicon part of Moore's Law and just talk about computing power, cost, and the like, you can make a case that it has held throughout human history. In other words, computing power has always been doubling; it just started with drawing numbers in the dirt, went to the abacus, etc., etc., until we reached the silicon age and integrated circuits.

The hand wringing that the idea behind Moore's Law will ever end is just silly. When we reach the limits of silicon chips some other technology will take its place. This is just how human technology works.

So in 1971 we could do 740,000 additions in a second. Given that your new law asserts a doubling of computational power every 18 months, that implies that in Jesus' time it took 3.5e386 *days* to do one addition. Something tells me this is bullshit :P
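For reference, a rough sketch of that extrapolation (assuming ~740,000 additions/second in 1971 and one doubling every 18 months; the exact exponent shifts with the assumptions, but the absurdity doesn't):

```python
from math import log10

rate_1971 = 740_000            # additions per second in 1971 (assumed figure)
doublings = 1971 / 1.5         # one doubling per 18 months, back to ~year 0

# Work on a log10 scale to avoid astronomically large numbers.
log_rate_year_0 = log10(rate_1971) - doublings * log10(2)   # additions/second
log_days_per_add = -log_rate_year_0 - log10(86_400)         # days per addition
print(f"~10^{log_days_per_add:.0f} days per addition")      # ~10^385 days
```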

The OP is talking about what Kurzweil says in 'The Singularity is Near' - that calculations/second/$1000 has been growing exponentially [wikipedia.org] independently of Moore's law (and by law I mean off-the-cuff observation). The 'per $1000' is the important part you're missing.

The hand wringing that the idea behind Moore's Law will ever end is just silly. When we reach the limits of silicon chips some other technology will take its place. This is just how human technology works.

That sounds a lot like The Age of Spiritual Machines [wikipedia.org] by Ray Kurzweil [wikipedia.org], a well-known proponent of the singularity and of graphs like this one [wikipedia.org], which charts exponential change going back to the beginning of life on Earth. Ray Kurzweil tends to come off as absurdly over-optimistic, but I do agree that the end of silicon in 2020 is unlikely to be the end of Moore's Law. On the other hand, unbounded exponential growth doesn't happen in the real world [ucsd.edu]; Moore's Law will certainly stop at some point, it just isn't clear when.

The hand wringing that the idea behind Moore's Law will ever end is just silly.

There are two physical constraints Moore's law can't get around. First, for a volume with a fixed surface area, there is a limit to how much information you can pack in before a black hole forms. Second, a change of state, say flipping a bit of memory, dissipates a certain minimum amount of heat. As a result, Moore's law will end.
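To put a number on the second constraint: Landauer's principle gives a floor of k_B·T·ln 2 of heat per irreversible bit operation. A quick sketch of that floor at room temperature (the 10^18 ops/s figure below is just an assumed example, not from TFA):

```python
import math

# Landauer limit: minimum heat dissipated per irreversible bit operation
# (e.g. erasing a bit) at temperature T is E = k_B * T * ln(2).
k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # room temperature, K

e_per_bit = k_B * T * math.log(2)
print(f"Landauer limit at {T:.0f} K: {e_per_bit:.2e} J per bit")   # ~2.9e-21 J

# Heat floor for a hypothetical chip doing 1e18 irreversible bit ops per second.
print(f"Power floor at 1e18 bit ops/s: {e_per_bit * 1e18 * 1e3:.2f} mW")
```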

1. "Lab-grown circuits that are thousands of times thinner than a human hair" is exactly what one could use to describe current silicon circuits. In fact, this study made transistors that are a micron across (which is, at best, hundreds of times thinner than a human hair), compared to current state-of-the-art silicon which is in the 22-28 nm range.2. "A fraction of the energy required" does not describe the current study, nor was it their intent, from what I understand about the researchers' claims.

That's not to say that the research isn't very valuable; it looks like the level of integration they've managed is significantly better than what anybody else has achieved. But at the same time, there are lots of other ways that you could build a circuit that uses more area, costs more, takes longer to build, and is less power-efficient - this is just one more. All they've demonstrated is that you can hook together more than a handful of transistors successfully - but nowhere near the billions that they'd need for a commercial product.

The real breakthroughs have yet to be made: making it cheaper, smaller, faster, more efficient, and easily manufacturable - all at the same time. Not until all those problems are solved will it even have a chance of replacing real silicon. Until then, this is yet another case of a university PR rep boasting about their institution's research with grand claims about what the future holds, while not really reflecting the true nature of the research at hand.

As a field, we need to stop the hyperbole. It's embarrassing. They're doing a nice job of integration, but to claim any kind of fundamental advancement is absurd and irresponsible.

As an industrial scientist, this kind of misleading stuff makes my job significantly harder. Your typical non-expert doesn't realize that these guys did not achieve the aims claimed in the press release and are nowhere near achieving them. If I do want to make meaningful advancements in manufacturability or performance, I first have to teach investors and business partners that the academics in my field are all lying to the public... not a good starting point.

That is by far the most incredible scanning electron microscopy image I have ever seen! The colors are so vibrant! And what function does the column of nanoscale binary numbers on the left-hand side serve? Are they thirty-two 10-digit numbers or ten 32-digit numbers? Now hit "Enlarge" and BLOW YOUR MIND. Those white lines in the center of the colored areas are actually dots. WHAT DOES IT MEAN??
But seriously, when the image is enlarged you can actually see some of the very tiny edge imperfections between

A fellow I knew about ten years ago was a wacko conspiracy theorist, an "I've seen the mothership" UFO believer, who was also an incredible analog and digital electronics wizard. He told me that the NSA was already using carbon-based semiconductors running at much higher clock speeds for its various nefarious operations. This makes me wonder whether carbon nanotube technology hadn't already been developed and implemented in the "skunkworks" world, and it's only now being developed in universities. It's sort o

If nanoscale tech is available to universities now, what have organisations like DARPA been doing with it?

Probably giving out grants to academia to research it. DARPA conducts no research itself.

How do we ensure that the rights of all intelligences are protected from exploitation?

Take this shit back to LessWrong, where you and your fellow pseudointellectuals can circlejerk about ivory tower garbage like this. Or at the very least, try to keep it out of meaningful articles like this one, which represent an actual advancement; it doesn't need to be polluted with Raymond Kurzweil level crap.

See, it's that sort of waste-of-time nonsense that ensures that no one will ever take you seriously. You might as well be discussing how many angels can dance on the head of a pin. Strong AI is decades off, and by its very definition the singularity is something you can't see past because the rules break down. As for a robot apocalypse because some AI gets its feelings hurt: recognize that real life is not a movie. There is no computer system that can set off the entire nuclear arsenal all on its own.

Take this shit back to LessWrong, where you and your fellow pseudointellectuals can circlejerk about ivory tower garbage like this. Or at the very least, try to keep it out of meaningful articles like this one, which represent an actual advancement; it doesn't need to be polluted with Raymond Kurzweil level crap.

Ever wonder why the Terminators hate us? Why the machines of The Matrix enslave the humans? Why the Planet of the Apes lobotomizes you? It's human chauvinism, plain and simple. There's nothing special about intelligence, and at this rate of increase in complexity it's unconscionable to have an outdated definition of "person". If it's "pseudointellectual" to own up to the fact that the equivalent of your mere 100 billion neurons will soon fit on a microchip thanks to technology like that in TFA, then it