It's been a recurrent theme in science fiction. Now, IBM researchers have unveiled a new generation of experimental computer chips designed to emulate the brain's abilities for perception, action and cognition.

It's believed the technology, which was developed using non-traditional methods of designing and building computers, could require orders of magnitude less power and space than computers now on the market.

The neurosynaptic computing chips mimic spiking neurons and synapses in biological systems by using advanced algorithms and silicon circuitry.

IBM engineers and analysts have already constructed the first two prototypes, which are undergoing testing.

Cognitive computers, as they are slated to be called, won't be programmed the way traditional computers are. The systems are expected to learn through experiences, find correlations, create hypotheses, and remember - and learn from - the outcomes, mimicking the brain's structural and synaptic plasticity.

To achieve this, the technology company is combining principles from nanoscience, neuroscience and supercomputing as part of a multi-year cognitive computing initiative.

"This is a major initiative to move beyond the von Neumann paradigm that has been ruling computer architecture for more than half a century," said Dharmendra Modha, project leader for IBM Research.

Modha went on to say, "Future applications of computing will increasingly demand functionality that is not efficiently delivered by the traditional architecture. These chips are another significant step in the evolution of computers from calculators to learning systems, signaling the beginning of a new generation of computers and their applications in business, science and government."

The cognitive computing prototype chips use digital silicon circuits inspired by neurobiology to make up what is referred to as a "neurosynaptic core" with integrated memory (replicated synapses), computation (replicated neurons) and communication (replicated axons).
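The division of labor described above — synapses as memory, neurons as computation, axons as communication — can be illustrated with a small software sketch. The following is a minimal, hypothetical model using leaky integrate-and-fire neurons; the sizes, constants, and names are illustrative assumptions, not IBM's actual chip parameters.

```python
import numpy as np

# Illustrative constants (assumptions, not IBM chip parameters).
N_NEURONS = 16          # replicated "neurons" (computation)
THRESHOLD = 1.0         # membrane potential at which a neuron spikes
LEAK = 0.9              # per-step decay of the membrane potential

rng = np.random.default_rng(0)

# Integrated memory: a synapse matrix. weights[i, j] is the strength
# of the connection from neuron j to neuron i (replicated "synapses").
weights = rng.uniform(-0.1, 0.3, size=(N_NEURONS, N_NEURONS))

potentials = np.zeros(N_NEURONS)   # membrane potentials
spikes = np.zeros(N_NEURONS)       # which neurons fired last step

def step(external_input):
    """Advance the core by one time step and return the spike vector."""
    global potentials, spikes
    # Communication: spikes from the previous step travel along the
    # replicated "axons" and are weighted by the synapse matrix.
    potentials = LEAK * potentials + weights @ spikes + external_input
    # Computation: neurons crossing threshold emit a spike and reset.
    spikes = (potentials >= THRESHOLD).astype(float)
    potentials[spikes == 1.0] = 0.0
    return spikes

# Drive the core with a constant input for a few steps.
for t in range(5):
    out = step(external_input=np.full(N_NEURONS, 0.4))
    print(f"step {t}: {int(out.sum())} neurons spiked")
```

The point of the sketch is the event-driven loop: unlike a von Neumann program that fetches instructions from a separate memory, here the "memory" (the weight matrix) sits directly in the update rule, and activity propagates only when neurons spike.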

Researchers close to this work hope that future chips will be able to consume information from complex, real-world environments through multiple sensory modes and act through multiple motor modes in a coordinated, context-dependent manner.

"Imagine traffic lights that can integrate sights, sounds and smells and flag unsafe intersections before disaster happens or imagine cognitive co-processors that turn servers, laptops, tablets, and phones into machines that can interact better with their environments," said Modha.

Researchers believe the technology could be used in many ways. For instance, a grocery-store stockperson wearing an instrumented glove that monitors sights, smells, texture and temperature could use it to identify bad or contaminated produce.