
An anonymous reader writes: In a paper published Thursday in Science, IBM describes its creation of a brain-like chip called TrueNorth. It has "4,096 processor cores, and it mimics one million human neurons and 256 million synapses, two of the fundamental biological building blocks that make up the human brain." What's the difference between TrueNorth and traditional processing units? Apparently, TrueNorth encodes data "as patterns of pulses." TrueNorth has already demonstrated 80% accuracy in image recognition while consuming far less power than traditional processing units. Don't look for brain-like chips on the open market any time soon, though: TrueNorth is part of a DARPA research effort that may or may not translate into significant changes in commercial chip architecture and function.
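For readers wondering what encoding data "as patterns of pulses" means in practice, here is a minimal sketch of a leaky integrate-and-fire neuron, the textbook model behind spiking architectures of this kind. Everything here (weight, leak, threshold) is an illustrative assumption, not TrueNorth's actual design.

```python
# Toy leaky integrate-and-fire neuron: information is carried in the
# timing and rate of pulses, not in continuous numeric values.
# All constants are illustrative assumptions, not TrueNorth parameters.

def lif_neuron(input_spikes, weight=0.4, leak=0.9, threshold=1.0):
    """Yield 1 each time the neuron fires, 0 otherwise."""
    potential = 0.0
    for spike in input_spikes:
        potential = potential * leak + weight * spike  # integrate + leak
        if potential >= threshold:                     # fire and reset
            potential = 0.0
            yield 1
        else:
            yield 0

# A denser input pulse train produces a denser output pulse train.
inputs = [1, 1, 1, 0, 1, 1, 0, 0, 1, 1, 1, 1]
print(list(lif_neuron(inputs)))
```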

A neoliberal as opposed to a neoconservative? What makes the conservative philosophy superior? Nothing. One of the dons of neoconservatism is GWB. And what did he accomplish? A twelve-year, $2 trillion waste of lives and resources that was, and continues to be, an utter failure of foreign policy. During his tenure he created the financial atmosphere of rhinos running loose in a china shop. He refused to install import levies even when he knew foreign countries were running slave labor camps and

It is getting hard to figure out where IBM is on chips. Arguably the four main chip families receiving investment are x86, ARM, the Z-Series processors, and the POWER series, two of which are IBM's. OTOH there is no roadmap for POWER beyond the current generation. I'd love to know: is IBM getting more serious about CPUs, or pulling back?

I agree with your initial statement, but that's pretty much how it has been for at least 15 years or so. POWER9 is on the roadmaps, and the next-generation zArch too. And they are sitting there like proxy boxes with nothing much spec'd, as it has been for almost all previous generations of their predecessors. What I'm concerned with is the lack of a public roadmap for what they are planning in the HPC and supercomputer space. We had the very public Blue Gene project that began in 2001 with four designs (C, L, P, and Q), but since the Blue Gene/Q came to life a couple of years ago, I have no idea what they are planning.
It'd be nice to have some clue here. Why not something from the OpenPOWER Foundation: a P8 host processor with an integrated GPU from nVidia, on-chip networking from Mellanox, and programmable accelerators from Altera? But I haven't seen anything in that direction.

I am just curious whether this chip could lead to the development of artificial brains for human use in the future.
Also, I don't understand why this chip won't be available on the open market.

The number of neurons in the brain varies dramatically from species to species. One estimate (published in 1988) puts the human brain at about 100 billion (10^11) neurons and 100 trillion (10^14) synapses.

100 billion / 1 million = 100,000 of these chips to reach the human neuron count.
100 trillion / 256 million = 390,625 of these chips to reach the human synapse count.

Assuming Moore's Law for these chips, with a doubling every 24 months to be conservative:
2 of these on a chip in 2016
4 of these on a chip in 2018
8 of these on a chip in 2020
16 of these on a chip in 2022
32 of these on a chip in 2024
64 of these on a chip in 2026
128 of these on a chip in 2028
256 of these on a chip in 2030
512 of these on a chip in 2032
1024 of these on a chip in 2034
2048 of these on a chip in 2036
4096 of these on a chip in 2038
8192 of these on a chip in 2040
16384 of these on a chip in 2042
32768 of these on a chip in 2044
65536 of these on a chip in 2046
131072 of these on a chip in 2048
262144 of these on a chip in 2050

So we could be seeing human-brain capabilities on a chip by mid-century, and quite possibly similar capabilities built as a supercomputer 10-20 years before that. Don't flame me for the wild assumptions I'm making here -- I know there are a lot; this is just intended as a back-of-the-envelope calculation. A quick script reproducing the numbers follows.
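Here is a minimal sketch reproducing those numbers. The chip targets come from the 1988 estimate above; the 24-month doubling cadence and the 2014 starting point are the same assumptions as in the list.

```python
# Back-of-the-envelope: TrueNorth-equivalents per die over time, assuming
# a Moore's-Law doubling every 24 months starting from 1 chip in 2014.
NEURON_TARGET = 100_000    # 100e9 brain neurons / 1e6 neurons per chip
SYNAPSE_TARGET = 390_625   # 100e12 brain synapses / 256e6 synapses per chip

count, year = 1, 2014
while count < SYNAPSE_TARGET:
    count *= 2
    year += 2
    print(f"{count:>7} of these on a chip in {year}")

# The neuron target is crossed in 2048 (131072 >= 100000);
# the synapse target in 2052 (524288 >= 390625).
```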

Although we really do use most of our brain over a prolonged period, only a very small amount of it is used at any one time for cognitive activities. 10% may be a bit of a low estimate, but not by much.

Of course we know that; that's how we found out what these areas do: if you damage a part of the brain, something vital stops working. The problem is, most of these areas are vital to people almost all of the time.

The math: the latest Intel processors use transistors that are 22nm across. The width of a hydrogen atom, ~1.1 angstroms, is about 0.11nm, or 110 picometers. Assuming the transistor size halves every two years (which, from the looks of it, is impossible), we get this: 22 -> 11 -> 5.5 -> 2.75 -> 1.4 -> 0.7 -> 0.34 -> 0.17 -> 0.09nm. That's about eight halvings, so the transistor reaches the width of a single hydrogen atom roughly 16 years out.
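The halving sequence as a short loop (the 22nm starting size and the 0.11nm floor are from the comment above; the 2014 start year is my assumption):

```python
# How many 2-year halvings until a 22nm transistor reaches the width
# of a hydrogen atom (~0.11nm)? Pure arithmetic, no device physics.
size_nm, year = 22.0, 2014
while size_nm > 0.11:
    size_nm /= 2
    year += 2
    print(f"{year}: {size_nm:.3f} nm")
# Eight halvings: atomic scale is reached around 2030.
```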

Here's hoping they actually keep working at it. You know what IBM are like! All those plans for Cell, all wasted. Then POWER went down the drain and one of their largest buyers (Apple) ditched them because POWER was lacking compared to x86, which is just holding everything back.

This is a genuinely interesting thing, possibly the best thing they have made in the longest time, in fact. I can't see any reason DARPA wouldn't also be very interested in it, if it works as well as they say it does. Already it is

It's really hard to say how many artificial neurons we would need to make a human-like intelligence, but it's certainly going to be fewer than the number of neurons in a human head. Computers already do a heck of a lot of tasks better than a human. Heck, using traditional computing methods with just a couple of these chips for image recognition and the like would already make a beast of a machine.

Computers can do logical operations better, yes. But computers can't do fuzzy math, real-time image recognition, or real-time audio recognition. Let me know when a computer can "see" with a pair of cameras, identify an object heading toward the CPU (not just the cameras), and adjust its motors to dodge the incoming object. Bugs can do that much, yet computers can't.

Your brain has over a dozen different types of neurons with different functions, and individual neurons themselves can have varying structures that perform more complex functions (like AND/OR/negation within the same cell, with different groupings of inputs).

Some brain neurons have thousands of inputs from nearly that many nearby nerve cells, and brains have overall layered patterns, often with broad regional interconnects serving different specific functions.
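A toy illustration of the "logic within a single cell" point: model each dendritic branch as a local gate over its own group of inputs, combined at the soma. The groupings and rules here are made up for illustration; real dendritic computation is far messier.

```python
# Toy "two-layer neuron": each dendritic branch computes a local function
# over its own input group; the soma combines them and can be vetoed by
# an inhibitory input (negation). All structure here is illustrative.

def branch_and(inputs):   # a branch behaving like AND over its group
    return all(inputs)

def branch_or(inputs):    # a branch behaving like OR over its group
    return any(inputs)

def neuron(group_a, group_b, inhibited):
    # Fire if either branch is active, unless the inhibitory input vetoes.
    return (branch_and(group_a) or branch_or(group_b)) and not inhibited

print(neuron([1, 1], [0, 0], inhibited=False))  # True: AND branch active
print(neuron([1, 1], [0, 0], inhibited=True))   # False: vetoed
```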

"IBM has already succeeded in building a 16-chip system with sixteen million programmable neurons and four billion programmable synapses. The next step is creating a system with one trillion synapses that requires only 4kW of energy. After that IBM has plans to build a synaptic chip system with ten billion neurons and one hundred trillion synapses that consumes only one kilowatt of power and occupies less than two liters of volume."

I think the IBM roadmap is more aggressive than Moore's law, and of course g
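Rough numbers on that comparison, taking the synapse figures from the quote above and the usual 2x-per-24-months reading of Moore's Law:

```python
import math

# Doublings from the demonstrated 16-chip system (4e9 synapses) to the
# stated goal of 1e14 synapses, and the time Moore's Law would allow.
current_synapses = 4e9      # 16-chip system from the quote
goal_synapses = 100e12      # "one hundred trillion synapses"

doublings = math.log2(goal_synapses / current_synapses)
print(f"{doublings:.1f} doublings needed")            # ~14.6
print(f"~{2 * doublings:.0f} years at Moore's pace")  # ~29 years
# If IBM intends to get there much sooner than ~2043, the roadmap is
# indeed more aggressive than Moore's Law.
```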

Each synapse contains dozens or hundreds of individual receptors that interact with the chemicals (neurotransmitters) being released to transmit the message. Certain types of receptors, called metabotropic, set off a cascade of enzymatic reactions inside the cell that represents further, highly complex, information processing. So when calculating the number of processing units in the brain, you have to go well beyond counting synapses. It's also worth noting that some of the interactions that take place can

The human brain needs all that parallelism because its switching rate is abysmal, something like 200Hz. We ought to be able to beat that by a factor of a million without setting anything on fire, so adjust your numbers accordingly.
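To make "adjust your numbers accordingly" concrete: if silicon neurons can be stepped a million times faster than biological ones, each hardware neuron could time-multiplex a million slow virtual neurons. A sketch under exactly those (very generous) assumptions:

```python
# If hardware neurons switch ~1e6x faster than biology (~200 Hz), each
# physical neuron can simulate many slow virtual neurons in sequence.
SPEEDUP = 1_000_000        # assumed silicon-over-biology advantage
CHIP_NEURONS = 1_000_000   # TrueNorth's neuron count
BRAIN_NEURONS = 100e9      # 1988 estimate cited earlier in the thread

virtual_per_chip = CHIP_NEURONS * SPEEDUP   # 1e12 virtual neurons
chips_needed = BRAIN_NEURONS / virtual_per_chip
print(f"{chips_needed:.1f} chips")          # 0.1: a tenth of one chip
# Under these assumptions, raw neuron count stops being the bottleneck.
```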

Walking, talking sexbots would absolutely change the world, possibly eliminate most crime, might solve the problem of overpopulation, and as theorized in the manga/anime Chobits, might force real women to have to compete for male attention.

Is this pulse encoding closer to Morse code or to a bar code? Obviously pulse coding could go beyond binary bit codes, but the hardware must be seriously different from what we have now. It could even be analog.

Every time one of these damn 'neural computers' comes out, people tend to equate the number of neurons and synapses and think 'hey, if we can get to the number of human neurons... Presto!!!!1'

Brains are waay more complicated than just neurons and synapses. Just taking the neurotransmitters into account makes the whole charade come crashing down. Then there is the glial network that, surprise surprise, does an enormous amount of complex work. There's even recent research suggesting that the branching patterns of the neurons perform complex computations. And there are chemical gradients in the brain that act as a sort of addressing system.