With a new architecture that draws inspiration from the human brain, IBM has unveiled what it is calling a breakthrough 'ecosystem' designed for programming silicon chips. Big Blue is betting its SyNAPSE technology could pave the way for a new generation of intelligent sensor networks that mimic the brain’s perception, cognition, and action abilities.

To be sure, IBM’s new programming model is a dramatic departure from traditional software, and in many ways it is a very big deal.

The SyNAPSE name is short for Systems of Neuromorphic Adaptive Plastic Scalable Electronics, and IBM reports that the project to develop it has thus far received cumulative funding of approximately $53 million.

A Whole New Approach

Indeed, Big Blue says it is breaking the mold of sequential operation that underlies virtually all of today's computers, which are based on the 'von Neumann' architecture.

The von Neumann paradigm is named for computer pioneer and mathematician John von Neumann, and refers to a design architecture in which chips store an instruction set and a data set. In typical von Neumann style processing, the chip fetches the instructions to work with the data.
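The fetch-and-execute cycle described above can be sketched in a few lines. This is a toy illustration of the von Neumann style, not anything from IBM; the miniature instruction set is invented for the example:

```python
# Toy von Neumann machine: instructions and data share one memory,
# and the processor repeatedly fetches the next instruction to act on the data.
memory = [
    ("LOAD", 5),    # load the value at address 5 into the accumulator
    ("ADD", 6),     # add the value at address 6 to it
    ("STORE", 7),   # write the accumulator back to address 7
    ("HALT", None),
    None,           # unused
    2,              # data
    3,              # data
    0,              # result lands here
]

acc = 0             # accumulator register
pc = 0              # program counter

while True:
    op, addr = memory[pc]   # fetch the next instruction
    pc += 1
    if op == "LOAD":        # execute it, one step at a time
        acc = memory[addr]
    elif op == "ADD":
        acc += memory[addr]
    elif op == "STORE":
        memory[addr] = acc
    elif op == "HALT":
        break

print(memory[7])  # → 5
```

Everything happens one instruction at a time, in sequence; it is exactly this bottleneck that SyNAPSE's massively parallel, brain-like design is meant to sidestep.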

In basic terms, it's a far more powerful type of processing, one able to handle demanding tasks quickly. That additional speed and power are becoming increasingly important for processing big data and for working in real time with distributed systems and the Internet of Things.

Computer architectures and software programs are "closely intertwined and a new architecture necessitates a new programming paradigm,” explained Dharmendra S. Modha, Principal Investigator and Senior Manager at IBM Research. “We are working to create a FORTRAN [referring to the pioneering programming language that IBM first introduced in the late 1950s] for synaptic computing chips."

To facilitate its new programming ecosystem, IBM has developed a number of breakthroughs that it says will aid developers working from the design stage all the way through production, debugging, and deployment.

The first element of the new system is a powerful software simulator. IBM describes it as a "multi-threaded, massively parallel and highly scalable functional software simulator" that is designed to simulate a "cognitive computing architecture comprising a network of neurosynaptic cores." That's a mouthful indeed, but in the most basic terms, we're talking about a software simulator designed to operate like the human brain.

Big Blue also developed what it describes as a "highly parameterized spiking neuron model that forms a fundamental information processing unit of brain-like computation."

The techno-jargon in IBM's announcement runs pretty deep, but for those interested in the details, IBM explains that this "spiking neuron model" can support a range of "deterministic and stochastic neural computations, codes, and behaviors." For example, a network of these neurons can "sense, remember, and act upon a variety of spatio-temporal, multi-modal environmental stimuli." Yes... just like the human brain.
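To make the jargon concrete, here is a generic leaky integrate-and-fire neuron, the textbook example of a spiking neuron model. It illustrates the kind of unit IBM is describing, but it is not IBM's actual model, whose equations and parameters are not given in the announcement:

```python
# A generic leaky integrate-and-fire neuron: input current is accumulated
# over time, the stored potential "leaks" away each tick, and the neuron
# emits a spike (then resets) whenever the potential crosses a threshold.
def simulate(inputs, leak=0.9, threshold=1.0):
    """Return a 0/1 spike train, one entry per input tick."""
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = potential * leak + current  # leaky integration
        if potential >= threshold:
            spikes.append(1)   # fire
            potential = 0.0    # reset after spiking
        else:
            spikes.append(0)   # stay silent, keep accumulating
    return spikes

print(simulate([0.5, 0.5, 0.5, 0.0, 1.2]))  # → [0, 0, 1, 0, 1]
```

Note that the neuron only communicates through discrete spikes in time, which is what lets networks of such units respond to the "spatio-temporal" stimuli IBM mentions.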

IBM’s newly developed programming model also offers a high-level description of a “program” based on composable, reusable building blocks called “corelets.” Each corelet, IBM explains, represents a complete blueprint of a network of neurosynaptic cores that specifies a base-level function. Though IBM doesn't make the comparison, these corelets sound much like objects used in object-oriented programming.

The way IBM describes it, the "inner workings of a corelet are hidden so that only its external inputs and outputs are exposed to other programmers, who can concentrate on what the corelet does rather than how it does it. Corelets can be combined to produce new corelets that are larger, more complex, or have added functionality."
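The encapsulate-and-compose idea can be sketched in ordinary object-oriented terms. Everything below is hypothetical; IBM has not published the corelet API, so the class and method names here are invented purely to illustrate the concept of hidden internals and composable blocks:

```python
# A hypothetical sketch of the corelet idea: the internal network is hidden,
# only inputs and outputs are exposed, and small corelets can be combined
# into larger ones. Names are invented for illustration.
class Corelet:
    def __init__(self, name, fn):
        self._fn = fn          # the internal network: hidden from callers
        self.name = name       # only the interface is public

    def __call__(self, inputs):
        return self._fn(inputs)

    def compose(self, other):
        """Feed this corelet's outputs into another corelet, producing a
        new, larger corelet whose internals remain hidden."""
        return Corelet(f"{self.name}->{other.name}",
                       lambda inputs: other(self(inputs)))

# Two toy base-level corelets...
edge_detect = Corelet("edges",  lambda img: [abs(a - b) for a, b in zip(img, img[1:])])
threshold   = Corelet("binary", lambda sig: [1 if v > 2 else 0 for v in sig])

# ...combined into one larger corelet with added functionality.
pipeline = edge_detect.compose(threshold)
print(pipeline([0, 1, 5, 5, 0]))  # → [0, 1, 0, 1]
```

A programmer using `pipeline` sees only what it does (find strong edges), not how either building block does it, which is exactly the property IBM ascribes to corelets.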

Another part of the puzzle -- or in this case the 'software ecosystem' -- is a library, of sorts. IBM calls it a cognitive system store containing designs and implementations of consistent, parameterized, large-scale algorithms and applications that link massively parallel, multi-modal, spatio-temporal sensors and actuators together in real-time.

In addition, IBM has developed a teaching curriculum to help developers get up and running quickly to be able to leverage this powerful new technology. The curriculum covers the SyNAPSE architecture, neuron specification, chip simulator, programming language, application library, and prototype design models.

How Far from Reality?

According to IBM, its long-term goal is to build a chip system with 10 billion neurons and a hundred trillion synapses, while consuming merely one kilowatt of power and occupying less than two liters of volume. Systems built from these chips could bring the real-time analysis of various types of data closer to the point of collection.

IBM offered the human eye as an example: the eyes sift through more than a terabyte of data every day. Emulating the visual cortex, the company said, low-power, lightweight eyeglasses designed to help the visually impaired could be equipped with multiple visual and auditory sensors that capture and analyze this optical flow of data.

We asked Charles King, principal analyst at Pund-IT, to help us make sense of this highly technical announcement. He told us the momentum of this innovation is remarkable -- and if IBM can continue pushing forward at this pace, there will be some very significant commercial applications in the not-too-distant future.

“When you’ve got a chip architecture that’s a radical departure from the traditional von Neumann processor that virtually every computer in the world is using today, you are building it from the bottom up,” King said.

“It’s got to include the software and development tools that go with it. To their credit, IBM understands the gravity of that point and has been working very hard on matching the kind of achievement they’ve been making in hardware with the software tools that will make this go eventually.”