Our brains could become computer memory.

Researchers experiment with living neurons to try to build a simple biological computer.

Biological/organic computing has long been the stuff of science fiction. But as we have all come to see, what is science fiction one day often becomes science fact the next.

Researchers have made a small breakthrough in biological computing. Specifically, human and animal neurons have great potential in terms of computing speed. Neurons perform some very impressive tasks, such as processing visual sensory input and recognizing patterns, all in real time (or pretty darn quick). If some form of neurons could supplement a conventional computer system, the power of that system could be increased tens, if not hundreds or thousands, of times over. Other benefits are the small size, three-dimensional configuration, and parallel computing aspects that living neurons have.

A first of its kind, this recent breakthrough could be a small step towards biological computing. The scientific journal Physical Review E has just published a paper (read it here) that highlights some of the new possibilities of using neurons for biological computing.

Researchers involved in this experiment needed to be absolutely certain that living neurons would function in a consistent and predictable manner. By using a small petri dish studded with an array of electrodes, they were able to track the firing patterns of neuron samples when stimulated to fire a signal. And as many had found before, neurons that are grouped together will fire in a synchronized bursting event (SBE). When one nerve fires a signal, it triggers those around it, sending that signal through the network of nerves. This is a very natural reaction for organic nerves, and a very predictable one for any given type of nervous system.
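The cascade described above can be sketched as a toy simulation. This is only an illustration of the idea, not the model from the paper: assume each neuron is randomly wired to a few neighbors, and a firing cell triggers every cell it connects to (all names, counts, and parameters here are made up for the sketch).

```python
import random

def simulate_sbe(n_neurons=20, n_links=3, seed=42):
    """Toy model of a synchronized bursting event (SBE):
    one stimulated neuron fires, and the signal cascades
    outward through its neighbors."""
    rng = random.Random(seed)
    # Wire each neuron to a few random neighbors in the culture.
    links = {i: rng.sample([j for j in range(n_neurons) if j != i], n_links)
             for i in range(n_neurons)}
    fired = set()
    frontier = [0]          # the electrode stimulates neuron 0
    burst_order = []        # the order in which cells join the burst
    while frontier:
        nxt = []
        for cell in frontier:
            if cell in fired:
                continue    # a cell only fires once per burst
            fired.add(cell)
            burst_order.append(cell)
            nxt.extend(links[cell])   # firing triggers connected cells
        frontier = nxt
    return burst_order

order = simulate_sbe()
print(f"{len(order)} of 20 neurons joined the burst, starting at neuron {order[0]}")
```

Because the wiring is random, the burst spreads in a different order each run unless the seed is fixed; that randomness is exactly what made untrained cultures unpredictable for the researchers.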

Unfortunately, nerve cells do not generate these SBE firings with any consistent pattern when left on their own. Researchers needed to train independent nerve cells to act in a more predictable fashion. The first step was to try to invoke some type of memory in the nerve cells by treating them with calcium, which causes nerves and cells to react and/or fire. Their subsequent tests with a calcium solution did not yield any promising results, as the nerves would revert to their random state after the calcium reagent was removed.

The researchers theorized that inhibitory neurons were blocking the learning process in their neighboring nerve cells. So they introduced a chemical that suppressed the functions of the inhibitory nerve cells, allowing the other cells to learn. By repeating this step (introduction of the inhibitory-blocking chemical) every twenty seconds, they were able to teach the cells several different SBE firing patterns. And by dosing different collections of cells, they created different SBE groups without affecting any of the other cell clusters. Simply put, they could now layer different memories onto a single collection of cells. A biological quad-core processor, in a sense; multiple threads across the same system.
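The layering step amounts to storing one pattern per cluster while keeping the clusters from interfering with each other. A minimal sketch of that bookkeeping, assuming each repeated dose claims its own non-overlapping group of cells (the labels and cluster sizes here are invented for illustration):

```python
def layer_patterns(clusters):
    """Store one SBE firing pattern per cell cluster, refusing
    overlaps so that training one group never disturbs another."""
    memory = {}
    claimed = set()
    for label, cells in clusters.items():
        cells = frozenset(cells)
        if cells & claimed:
            raise ValueError(f"pattern {label!r} overlaps an existing cluster")
        memory[label] = cells
        claimed |= cells
    return memory

culture = layer_patterns({
    "A": range(0, 8),    # first dosing run
    "B": range(8, 16),   # second run, twenty seconds later
    "C": range(16, 24),  # third run
})
print(sorted(culture))   # three independent 'memories' on one culture
```

The disjointness check mirrors the experimental result: each dosed cluster holds its own pattern, and none of the other clusters is affected.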

It may not sound like much, but it is a first step. We won't have living computers anytime soon, as more elaborate functions must be taught to neurons and cells before they can perform even the most basic of tasks. And odd as it may seem, the majority of this research simply relied upon the natural connections that neurons made when placed in a culture. Just like a dog, these neurons want to learn (on a very basic level); all we have to do is teach them.

Yeah, how science could eventually download all of our thoughts and memories into a computer. He was telling this to some woman. That woman was really excited about the possibility of someday having all of her thoughts and memories downloaded into a computer. Ross thought this was weird and walked away (doofus, I remember her being pretty cute)

Hmmm, very interesting. I would give it more like 60 years, though. As far as cells surviving, lolz. I am not sure what a neuron needs, but I figure it would be very easy; perhaps a protein saturation liquid or a special fertilizer, for lack of a better word. As far as I know, though, downloading someone's thoughts into a computer is a bad idea, because then they have no thoughts. :-S What they are talking about is more along the lines of the computer reading your thoughts and making copies. Also, think of the memory amounts a computer like that could have! The human brain is only filled about 10% with memory and other functions like that...damn! We basically have the ability to remember several thousand terabytes of material! Because when you think about it, we have 12 years of school, then the rest of our life, remembering most everything we see. Thousands of terabytes of information (in computer terms).

I was just thinking about whether it would be possible to use brains (not human, but probably any animal brain) as storage. With all the memory we have (like remembering pictures, video, sound, etc.), you could have a ton of storage (10,000 terabytes?). This is pretty cool.

To get 10,000 terabytes you'd need human brains; no animals have the same abilities as we have, they have no higher thought processes, they have no true "memory". So in the end, if we use anything but a human brain (cloned, most likely, off of a dead person), we would only get maybe 1,000 to 3,000 terabytes of memory out of it. It is truly complicated. lolz.

Hmmmm...I wonder how this would work with a brain from someone who has a learning disability (Down Syndrome, Autism, etc.) and/or a brain injury such as a concussion, or could this possibly rectify some of these problems? Still, this sounds pretty complicated--I'll leave this one to the science experts (and that wouldn't be me I might add).

Hmmm. Those people could potentially have MORE memory. Disabilities like that do not impair the brain, per se. They simply hinder it from working at 100%. People like that have huge abilities to remember places, events, faces, people, even numbers and words. But in the end it all boils down to HOW you implement this technology. It may never come, it may come hundreds of years from now, or next year. Most likely not next year, however. It just depends...lots and lots of variables.

Also, you wouldn't have to worry about viruses, but let's say someone accidentally knocks over the computer: it would automatically die. But what if you could plug in yourself and you would be the computer? That would be something, but really painful.