Open-Source NEST Software Powers Neural Simulation on K Supercomputer

Researchers have used a Japanese supercomputer to carry out what they describe as the largest general neuronal network simulation ever.

The simulation, carried out by the RIKEN HPCI Program for Computational Life Sciences, the Okinawa Institute of Science and Technology Graduate University (OIST) in Japan, and Forschungszentrum Jülich in Germany, was made possible by the development of novel data structures for the simulation software NEST.

The achievement matters for neuroscience because NEST is open-source software, freely available to every scientist in the world. The team used the K supercomputer to carry out the simulation.

The team, led by Markus Diesmann in collaboration with Abigail Morrison, both now with the Institute of Neuroscience and Medicine at Jülich, succeeded in simulating a network of 1.73 billion nerve cells connected by 10.4 trillion synapses. The program recruited 82,944 processors of the K computer, and the simulation took 40 minutes to reproduce a single second of neuronal network activity in real, biological time.
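To put those figures in perspective, a few ratios can be derived directly from the numbers quoted above; the following sketch computes them (the derived values are illustrative back-of-the-envelope estimates, not figures reported by the researchers):

```python
# Scale figures quoted in the article.
neurons = 1.73e9        # simulated nerve cells
synapses = 10.4e12      # simulated synaptic connections
processors = 82_944     # K computer processors recruited
wall_time_s = 40 * 60   # 40 minutes of compute time...
bio_time_s = 1          # ...for 1 second of biological activity

# Derived ratios (illustrative only).
synapses_per_neuron = synapses / neurons      # average connectivity per cell
slowdown = wall_time_s / bio_time_s           # factor slower than real time
neurons_per_processor = neurons / processors  # average cells per processor

print(f"{synapses_per_neuron:.0f} synapses per neuron")
print(f"{slowdown:.0f}x slower than biological real time")
print(f"{neurons_per_processor:.0f} neurons per processor")
```

These ratios show roughly 6,000 synapses per neuron and a simulation running about 2,400 times slower than biology, which is why the jump from petascale to exascale machines matters for whole-brain simulation.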

Although the simulated network is huge, it only represents one per cent of the neuronal network in the brain. The nerve cells were randomly connected and the simulation itself was not supposed to provide new insight into the brain – the purpose of the initiative was to test the limits of the simulation technology developed in the project and the capabilities of K. In the process, the researchers gathered experience that will guide them in the construction of novel simulation software.

The researchers say the achievement gives neuroscientists a glimpse of what will be possible in the future, with the emergence of exascale computers.

"If petascale computers like the K computer are capable of representing one per cent of the network of a human brain today, then we know that simulating the whole brain at the level of the individual nerve cell and its synapses will be possible with exascale computers, hopefully available within the next decade," said Diesmann.
