>> It seems that a large portion of the savings come from being application specific and using analog computation.

Yes, it is analog computation, but a different beast of analog computation. You are looking at weak-inversion biasing of transistors, which gives you low power, but you have to deal with a lot of noise issues in your design.
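To make the trade-off concrete: in weak inversion (subthreshold) operation, drain current depends roughly exponentially on gate-source voltage, which is why currents can sit in the nanoamp range (hence the low power) but also why small voltage fluctuations translate into large current errors (hence the noise headaches). A minimal sketch of the standard first-order model, with illustrative parameter values (I0, the slope factor n, and the thermal voltage here are placeholders, not values from Neurogrid):

```python
import math

def subthreshold_current(v_gs, i0=1e-9, n=1.5, v_t=0.026):
    """First-order weak-inversion MOSFET model:
    I_D ~= I0 * exp(V_GS / (n * V_T)).
    i0: process-dependent leakage scale (illustrative: 1 nA)
    n:  subthreshold slope factor (typically 1.2-1.6)
    v_t: thermal voltage kT/q at room temperature (~26 mV)
    """
    return i0 * math.exp(v_gs / (n * v_t))

# Exponential sensitivity: a mere 60 mV gate shift multiplies
# the current several-fold, which is the noise/mismatch problem
# the comment alludes to.
ratio = subthreshold_current(0.06) / subthreshold_current(0.0)
```

With these numbers a 60 mV shift changes the current by a factor of roughly e^(0.06/0.039), i.e. around 4-5x, which is why device mismatch and noise dominate subthreshold analog design.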

>> But since Google has already simulated a 10-billion simplified-neuron system on only 16 GPUs and used it for computer vision, one has to wonder whether simplified neurons will do

Google did that using FPGAs and massive computing power, which adds little value when compared to the real brain. We are not talking about having a datacenter to power your brain. We want something less power-hungry and more efficient, something close to the natural one. That is the innovation in Kwabena's work.

>> 100,000 times more energy efficient, amazing...congrats Kwabena! maybe it is time for you to give a talk at emerging technologies conference

Kwabena is a big enough name to be invited that way; he is truly leading the neuromorphic nexus. The path to human immortality could start this small: after cloning the brain in a circuit, the next phase would be making it able to replace the one we have, in case one needs a better one.

The relatively low power consumption is not quite as incredible as it sounds. It seems that a large portion of the savings come from being application specific and using analog computation.

That is not meant to diminish the achievement. There seems to be significant potential use for analog (and other approximate) computation and special purpose design. Not only could the Neurogrid device be useful in studying biological systems, but the project's work on hardware and software design might help advance use of similar systems for other areas.

This is definitely an interesting project, and it will surely be useful for learning more about the brain.

But since Google has already simulated a 10-billion simplified-neuron system on only 16 GPUs and used it for computer vision, one has to wonder whether simplified neurons will do, or whether we really need complex neurons for engineering applications.
