
MrSeb writes "There are around 100 billion neurons in a human brain, forming up to 100 trillion synaptic interconnections. Neuroscientists believe that these synapses are the key to almost every one of your unique, identifiable features: Memories, mental disorders, and even your personality are encoded in the wiring of your brain. Understandably, neuroscientists really want to investigate these neurons and synapses to work out how they play such a vital role in our human makeup. Unfortunately, these 100 trillion connections are crammed into a two-pound bag of soggy flesh, making analysis rather hard. Starting small and working its way up, MIT today launched Eyewire, a crowdsourced 'game' that tasks users with wiring up the neurons in a mouse's retina. A future stage of the game will get users to find the synapses, too."

Infants actually have more neurons, ironically. For the most part, we start with all of our neurons and then some of them die (this pruning is thought to play a role in learning, where less useful neurons die and their connections are replaced by those of more useful neurons).

As a former cognitive science student, I'm always amazed at how quickly the complexity of the brain limits our ability to understand it. While it's not the same as the Genome project, it's awesome when projects like this show up that prompt us to get a better understanding of the brain.

My question: can untrained users really use the game to make valid discoveries? What prevents errors?

Also, it's a bummer that this is based on the eye, which has already had a ton of deep-dive research done.
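On the error question: one common safeguard (roughly what Eyewire relies on, as I understand it, though this sketch is my own simplification and the voxel IDs and quorum value are made up) is redundancy. Give the same cube of tissue to many players and keep only the voxels a majority agree on, so any one player's mistakes get voted out:

```python
from collections import Counter

def consensus(traces: list, quorum: float = 0.5) -> set:
    """Keep voxels marked by more than `quorum` of the players.

    `traces` is one set of voxel IDs per player; an error made by a
    single player is discarded as long as most players got it right.
    """
    votes = Counter(v for trace in traces for v in trace)
    needed = len(traces) * quorum
    return {v for v, n in votes.items() if n > needed}

# Three hypothetical players trace the same neuron; each makes a
# different mistake, but the majority vote recovers the true set.
players = [
    {1, 2, 3, 4, 99},   # spurious voxel 99
    {1, 2, 3, 4},       # clean trace
    {1, 2, 4, 5},       # missed voxel 3, added spurious 5
]
print(consensus(players))   # the majority-agreed voxels: 1, 2, 3, 4
```

Redundant tracing costs player-hours, but it turns a crowd of fallible amateurs into a surprisingly reliable instrument.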

I suspect that we will never really understand it... we might asymptotically approach an understanding of it, but I don't think we will ever get to a point where we fully do.

The reason is that what we can comprehend is limited by the capacity and complexity of our own brains. I apologize profusely in advance for the following oversimplification, but I imagine it would be like trying to pour an entire additional litre of water into a 1-litre cup that is already full.

Except that no one's brain has ever been "filled" up. And in any case, no one individual needs to fully understand it, just as no one individual knows every step in making a car from raw material to finished product. It's divided into multiple niches so that some individuals understand how to mine iron ore, make windshields, design new parts, assemble engines, etc. We as a species understand plenty of things no one individual understands.

Rather, I am stating that I imagine it will take no fewer than a comparable number of synaptic connections to understand what each group of synaptic connections in the brain does and how they actually work. By the pigeonhole principle, and especially owing to the fact that we already do know a lot of other stuff, there cannot possibly be enough room in our brains to understand how the brain actually works. We may asymptotically approach a full understanding, but I strongly suspect we will never actually have it.

I get what you're trying to say, but from what I understand that's not how the brain stores information. Take a particular connection with, say, 5,000 nerves: it can carry far more than 5,000 distinct signals, because the number of possible combinations grows factorially. 5,000! (5,000 factorial) is a finite number, but vast beyond imagining. This is why our brains don't "fill up" like in your water-bottle analogy. There is no reason why our brains couldn't understand what a brain does and how it works, your bad analogy notwithstanding.
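To get a feel for the scale involved, here's a quick back-of-the-envelope calculation (an illustration of how fast factorials grow, not a claim about actual neural coding):

```python
import math

# Number of nerves in the hypothetical connection from the comment above.
n = 5000

# Python's arbitrary-precision integers can compute 5000! exactly.
value = math.factorial(n)
digits = len(str(value))

# Cross-check the magnitude with the log-gamma function:
# log10(n!) = lgamma(n + 1) / ln(10), so digit count = floor(log10) + 1.
approx_digits = int(math.lgamma(n + 1) / math.log(10)) + 1

print(f"5000! has {digits} decimal digits")
print(f"lgamma estimate: {approx_digits} digits")
```

For comparison, the number of atoms in the observable universe is usually estimated at around 80 decimal digits; 5000! dwarfs it by orders of magnitude of orders of magnitude.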

I see what you're saying, but I don't know if I agree. The other systems of the body (limbic, digestive, etc.) are fairly well understood, yet we don't possess the processing power to deliberately (keyword) run them. I believe scientific analysis of many, many brains may one day yield just as good an understanding of the brain.

This could be even more true if you believe in the Singularity, which I personally don't, but it certainly warrants mention.

Belief in "the singularity" is not required. There is no single singularity. They have already happened many times in human history and there will be more in the future: from stone tools, to farming, all the way to the most recent one, the PC and the internet. All a singularity is, is a new technology that changes our lives in ways we cannot predict. No one could have possibly predicted the impact the PC would have on our lives. I assume that you are really talking about the artificial intelligence singularity.

Which makes it perfect. Try this new approach on the eye, because you already know what results you should get. If what comes out is completely wrong, you know the method has failed. If it mostly matches what is already known, then the method is validated, and it's time to try something a little more unfamiliar.
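That validation idea can be sketched concretely: run the method on tissue whose wiring is already known, then score the reconstruction against the ground truth. The voxel sets, the Jaccard overlap metric, and the 0.9 threshold below are illustrative assumptions, not Eyewire's actual pipeline:

```python
def jaccard(reconstruction: set, ground_truth: set) -> float:
    """Overlap between two voxel sets: |A ∩ B| / |A ∪ B|."""
    if not reconstruction and not ground_truth:
        return 1.0
    return len(reconstruction & ground_truth) / len(reconstruction | ground_truth)

def method_validated(reconstruction: set, ground_truth: set,
                     threshold: float = 0.9) -> bool:
    """The method passes if it mostly matches what is already known."""
    return jaccard(reconstruction, ground_truth) >= threshold

# A hypothetical neuron traced as a set of voxel IDs.
known_wiring = {1, 2, 3, 4, 5, 6, 7, 8, 9, 10}
crowd_result = {1, 2, 3, 4, 5, 6, 7, 8, 9, 11}   # missed 10, added spurious 11

print(jaccard(crowd_result, known_wiring))        # 9 shared / 11 total ≈ 0.82
print(method_validated(crowd_result, known_wiring))
```

A perfect match is never expected; the point is that an overlap score against known anatomy gives a pass/fail signal before anyone trusts the method on unfamiliar tissue.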

Good point. But I don't think we know enough to say, even if the results of this do match past data, that applying this method to something more unfamiliar will yield similar quality. I guess it may bring up some interesting questions that could then be put to scientific scrutiny.

You and one of your follow-up posters suggested that the eye was basically too simple to try this technique on. This is not correct. The eye contains the retina, which is actually a part of the brain. It's a sort of small computer in the eye that, for example, calculates motion direction. Understanding how this works is cutting edge research, to which this technique has already contributed: http://www.nature.com/nature/journal/v471/n7337/full/nature09818.html

On one hand, this is a totally cool use of crowdsourcing. On the other hand, this seems like precisely the kind of task at which a computer, given the "right" algorithm, could be orders of magnitude better than humans.