IBM announces neurosynaptic software system

IBM's new software ecosystem, designed to support its TrueNorth neurosynaptic processing system, promises a new class of computing machine.

IBM has announced the development of a new software ecosystem which, it claims, will lead to computer systems that operate in a way similar to that of the human brain.

The announcement builds on the company's previous work in neurosynaptic computing - computer architectures that draw their inspiration from nature, modelling their design and operation on the brain - including the TrueNorth neurosynaptic processor unveiled in 2011, and promises to break the mould of the von Neumann architecture.

Named for computer scientist John von Neumann following the publication of his paper First Draft of a Report on the EDVAC, the von Neumann - or Princeton - architecture was for years the model for computing systems, describing a stored-program design in which RAM, storage and input/output devices are all controlled by a central processing unit over a shared bus. Although all but replaced by more flexible designs - such as the Harvard architecture, which uses separate instruction and data buses to eliminate bottlenecks - its central tenets can be found in almost all modern computer systems, from microcontrollers through to your desktop or laptop.

Where most modern architectures build on the von Neumann and Harvard models, however, IBM is planning a different path: a brain-inspired model for a new generation of highly-interconnected, asynchronous, massively parallel and large-scale computing systems based around the concept of cognitive computing.

'Architectures and programs are closely intertwined and a new architecture necessitates a new programming paradigm,' claimed Dharmendra Modha, principal investigator at IBM Research, at the unveiling. 'We are working to create a FORTRAN for synaptic computing chips. While complementing today's computers, this will bring forth a fundamentally new technological capability in terms of programming and applying emerging learning systems.'

The company claims that systems programmed using its new synaptic software ecosystem and running on chips similar to the one announced in 2011 could have significant benefits over traditional systems in a number of fields. Its overall goal, the company has explained, is to develop a chip system boasting ten billion 'neurons' and one hundred trillion 'synapses' - roughly one-ninth the neuron count of the average human brain, which contains around 86 billion neurons - while drawing one kilowatt of power and taking up an overall volume of under two litres.

Such a system, says IBM, would be far better suited to capturing and analysing sensor data than a traditional computer system, able to work with the ever-shifting analogue data that a typical binary-based computer is ill-equipped to handle. IBM even claims it could create electronic 'eyes' which emulate the visual cortex, giving visually impaired people low-power, lightweight neurosynaptic glasses that could sift through terabytes of sensor data, crunching it down into useful snippets: the number of people in a room, the distance to an upcoming obstacle, the number of cars at a crossing, and full navigation capabilities.

Other concept designs developed by the team at IBM Research include a spherical, autonomous robot dubbed Tumbleweed for search and rescue missions in disaster zones, a home thermometer with embedded camera and extra sensors capable of alerting doctors to potential health issues, jellyfish-inspired marine sensor buoys, and - oddly - the Conversation Flower, which opens its petals when the conversation becomes vibrant and animated while also working to create an automated transcript of the discussion.

The technology is, naturally, still some way from realising such utopian visions - but with the software infrastructure in place, it's closer than ever before. The company is to formally present its findings, which represent Phase 3 of its DARPA-funded project, at the International Joint Conference on Neural Networks in Dallas this week - a teaser video for which has been released, and is reproduced below.


7 Comments

Sounds like one of Thinking Machines' 'Connection Machine' supercomputers compressed down to a chip. I wonder if their chip has any significant advantage over simulating an SLNN using RAM and a modern fast sequential processor iterating over the net (other than the operations happening truly in parallel, which is irrelevant unless the entire system is asynchronous).

Originally Posted by edzieba: Sounds like one of Thinking Machines' 'Connection Machine' supercomputers compressed down to a chip. I wonder if their chip has any significant advantage over simulating an SLNN using RAM and a modern fast sequential processor iterating over the net (other than the operations happening truly in parallel, which is irrelevant unless the entire system is asynchronous).

I think they're using a LOT more very simple cores/cells than the Connection Machine. Still, the connections become the biggest part of the chip as the number of cells goes up, particularly with a lot of connections between cells. Guessing they would be using some sort of multiplexed buses, with local, on-chip and between-chip sets.
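The sequential simulation the earlier comment mentions can be sketched in a few lines: hold the connection weights in RAM and iterate over the whole net once per tick. This is a hypothetical illustration, not IBM's method - the network size, weights and threshold are made up for the example.

```python
import numpy as np

# Hypothetical sketch of simulating a simple spiking net sequentially:
# the connection matrix sits in RAM and a single processor sweeps over
# the whole net once per tick. All sizes and values are illustrative.
rng = np.random.default_rng(0)
n = 256                              # number of cells
w = rng.normal(0.0, 0.1, (n, n))     # dense connection matrix in RAM
state = rng.random(n) < 0.1          # which cells fired on the last tick

def tick(state, w, threshold=0.5):
    """One synchronous update: each cell sums its inputs, then fires."""
    drive = w @ state                # sequential sweep over the net
    return drive > threshold

for _ in range(10):
    state = tick(state, w)
print(int(state.sum()), "cells firing after 10 ticks")
```

The synchronous sweep is exactly the point the commenter raises: it reproduces the net's behaviour tick-for-tick, but gives up the truly parallel, asynchronous updates a dedicated chip could offer.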

Originally Posted by Gradius: They keep saying it "mimics the brain" and stuff like that, but in reality nature did it in a way that will never be possible to replicate!

Nonsense! Brute force method: molecular simulation of the volume of neural tissue you want. More elegant solution: parametric neurons with diffuse neurotransmitter simulation. Most SLNNs for the last few decades are just parametric models.

That is, unless you subscribe to Dualism and think the brain has some sort of magic intangible goop that lets you think.