Scientists claim the brain has ten times more memory storage capacity than previously thought

Science aficionados will by now know about the ‘10 percent of the brain’ myth, the misconception that most of us common folk use only ten percent of our brains. This time around, however, a group of scientists at the Salk Institute for Biological Studies has concluded that our brain actually has ten times more memory capacity than previously thought. To reach this fascinating conclusion, the researchers built a 3D reconstruction of rat hippocampal tissue. The hippocampus is one of the crucial components of the brain, playing a role in consolidating information from short-term into long-term memory, as well as in spatial navigation.

In the aforementioned 3D reconstruction, the scientists found that synapses (structures that allow neurons to pass signals to other neurons) were duplicated in around ten percent of cases, with a single axon forming two synapses with the same dendrite. Now before we delve into what that suggests, one must understand that synapses are still complex brain structures with a ‘mysterious’ aura about them (at least as studied so far). From a neurological perspective, these structures are basically junctions where the chemical signals exchanged between neurons trigger our brain activity. Simply put, as the scientists made clear – “each neuron can have thousands of these synapses with thousands of other neurons.”

Intriguingly enough, dysfunctional synapses can cause various neurological diseases. Furthermore, synapses vary in size, with larger synapses being stronger, by virtue of their greater surface area and larger ‘stock’ of neurotransmitters. Until now, though, this variance in size had not been measured precisely, with synapses simply being classified as small, medium, or large. This time around, prompted by the duplication observed in the 3D reconstruction of the rat hippocampal tissue, the researchers decided to actually measure the differences between these duplicate synapses.

For such a complex endeavor, the scientists utilized both advanced microscopy and computational algorithms they had previously developed to render the rat brain. The “connectivity, shapes, volumes and surface area of the brain tissue” were then reconstructed down to a nanomolecular level. And this is where the surprising result emerged. As Tom Bartol, a Salk staff scientist, said –

We were amazed to find that the difference in the sizes of the pairs of synapses were very small, on average, only about eight percent different in size. No one thought it would be such a small difference. This was a curveball from nature.

Now in terms of conventional neurological science, the smallest and largest synapses can differ in size by a factor of 60, with most synapses tending to be small. By determining that sizes vary in increments of only about 8 percent within this 60-fold range, the researchers calculated that there could be as many as 26 distinct categories of synapse sizes – as opposed to just a few. Crunching the numbers, this means there are roughly ten times more discrete synapse sizes than previously theorized. And by computer analogy, 26 categories of synapses can account for about 4.7 ‘bits’ of information per synapse, far more than previous estimates that hinted at just 1 or 2 bits of memory storage capacity per synapse in the hippocampus.
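The jump from 26 size categories to 4.7 bits follows from basic information theory: a choice among N equally likely states carries log2(N) bits. A quick sketch of the arithmetic behind the article's figures:

```python
import math

# A memory element that can take one of N distinguishable states
# stores log2(N) bits of information.
num_categories = 26                  # distinct synapse sizes reported by the study
bits_per_synapse = math.log2(num_categories)
print(f"{bits_per_synapse:.2f} bits per synapse")   # prints "4.70 bits per synapse"

# By comparison, the older 1-2 bit estimates correspond to
# only 2-4 distinguishable synapse states.
old_estimate_states = 2 ** 2         # 2 bits -> 4 states
print(f"old estimate: {old_estimate_states} states")
```

Note that 26 states versus 2–4 states is where the roughly tenfold increase in distinguishable synapse sizes comes from.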

The calculations also point to an intriguing phenomenon: synapses appear to change their size, and thus their strength, depending on the neural transmissions they receive. Returning to the figures, around 1,500 transmission events can bring about a change in the smaller synapses (taking about 20 minutes), while only around 200 such ‘events’ can alter the larger synapses (taking about 1 to 2 minutes). As Bartol clarifies –

This means that every 2 or 20 minutes, your synapses are going up or down to the next size. The synapses are adjusting themselves according to the signals they receive.

Such an incredible range hints at the superb efficiency of our brain – which is said to generate only around 20 watts of continuous power, making most of us ‘dim bulbs’. But the discovery in question here could help computer scientists contrive ultra-precise yet low-energy computational machines. Such computers would employ what is known as deep learning and artificial neural networks, allowing for advanced levels of learning, assessment, and translation – all in mimicry of the complex workings of our own brain. As Terry Sejnowski, Salk professor and co-senior author of the paper, said –

This is a real bombshell in the field of neuroscience. We discovered the key to unlocking the design principle for how hippocampal neurons function with low energy but high computation power. Our new measurements of the brain’s memory capacity increase conservative estimates by a factor of 10 to at least a petabyte, in the same ballpark as the World Wide Web.
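The petabyte figure in the quote can be sanity-checked with a back-of-envelope calculation. The synapse count below is an assumption, not from the article: the human brain is commonly estimated to contain somewhere between 10^14 and 10^15 synapses.

```python
# Rough capacity estimate, assuming (hypothetically) 10**14 to 10**15
# synapses in a human brain, each storing ~4.7 bits as per the study.
bits_per_synapse = 4.7

for n_synapses in (1e14, 1e15):
    total_bits = n_synapses * bits_per_synapse
    total_petabytes = total_bits / 8 / 1e15   # 8 bits per byte, 1e15 bytes per PB
    print(f"{n_synapses:.0e} synapses -> {total_petabytes:.2f} PB")
# prints "1e+14 synapses -> 0.06 PB"
#        "1e+15 synapses -> 0.59 PB"
```

With the higher synapse count, the estimate lands at roughly half a petabyte, which is indeed “in the same ballpark” as the quoted figure.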