An anonymous reader writes "Carl Zimmer's latest foray into neuroscience examines why the brain can get jammed up by a simple math problem: 'Its trillions of connections let it carry out all sorts of sophisticated computations in very little time. You can scan a crowded lobby and pick out a familiar face in a fraction of a second, a task that pushes even today's best computers to their limit. Yet multiplying 357 by 289, a task that demands a puny amount of processing, leaves most of us struggling.' Some scientists think mental tasks can get stuck in bottlenecks because everything has to go through a certain neural network they call 'the router.'"

Like sleep? I can't count the number of times I've been stuck on programming logic, math word problems, etc. I'll stare at it until I can't make any more sense of it and go to bed. Then I wake up and within 30 seconds have the solution.

It doesn't work that much like an FPGA. I can give a rough estimate of the multiplication answer quite quickly. If I kept needing similar or better estimates, after lots of practice I'd get better at doing it, assuming appropriate feedback and training.

I haven't seen an FPGA adjust itself when you tell it "bad boy, that's not what I want".

As for picking out a familiar face so quickly: that's because there's a neuron or more in your brain that does the equivalent of yelling "Bingo!" every time you see or think of that face.

Nothing says that you can't have a separate route for getting approximate answers quickly in parallel with exact answers slowly. Indeed 'emotional' reasoning seems to be an example of the former: "That's not fair!" vs "That leaves me 6.8% out of pocket!"

To me, if a thought process isn't innate, then I must consciously traverse it, step-wise. I suppose processes that I haven't been rigorously taught may consist of a set of steps which are only somewhat dependent on order, and I stumble my way through fulfilling requirements as necessary. Either way, I must consciously make my way from one part of a sequence to another.

This involves calling up the next step, which can be fast if the steps have been rigorously learned, or slower if I must analyze the process.

Couldn't it just be that we do not really have direct access to the raw computational capacity of the brain? There are savants, and people who have trained themselves tremendously, who can do arithmetic like this using memory tricks and such. Wouldn't that be more like a hack to "reach down" and utilize the low-level capacity of the brain? The brain is nothing like a man-made computer, but doesn't the idea of "layers of abstraction" still apply? The brain can calculate 357 times 289, but it does not naturally "understand" what 357 or 289 is, or for that matter what the high-level instruction from "me" to "multiply" is.

I don't think it's processing power or inability at all. I think it's lack of working memory. We can all work out 357 multiplied by 289 easily with pencil and paper. Very easily. And we could do it in our heads just as well if we could casually remember all the intermediary stages: e.g. 9 times 7 is 63, 9 times 50 is 450, 9 times 300 is 2,700, sum all three numbers and remember the result, now begin with 80 times... etc. But it's not easy for most people to do that. The computation is easy. But we need more registers.
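The register shortage is easy to make concrete. Here's a rough Python sketch (purely illustrative) of the schoolbook method, counting how many partial products you'd have to hold in your head along the way:

```python
# A sketch of the schoolbook method: multiply one digit at a time and
# keep every partial product "in a register". The count of intermediates
# is a crude proxy for working-memory load.

def long_multiply(a, b):
    """Multiply a * b the way you would on paper, returning the result
    and the list of partial products held along the way."""
    partials = []
    for i, da in enumerate(str(a)[::-1]):      # digits of a, least significant first
        for j, db in enumerate(str(b)[::-1]):  # digits of b
            partials.append(int(da) * int(db) * 10 ** (i + j))
    return sum(partials), partials

result, partials = long_multiply(357, 289)
print(result)         # 103173
print(len(partials))  # 9 intermediate values, past the famous ~7-item limit
```

Nine intermediates to juggle before you even start summing, which is already past the seven-or-so items short-term memory comfortably holds.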

Indeed, and I suspect that they've learned to use the rules of math as well. I'm far better than most, and it often takes me less time to do it in my head than to pull out the calculator, even if it's right in front of me. The trick is to use the properties to make things simpler.

For instance, 12*17=204; it's also 12*12 + 12*5=204. Most people can do the latter without much trouble, but the former is much more difficult for folks. Mathematically the result is the same.
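Stated generally, that trick is just the distributive law: a*b = a*a + a*(b - a), trading an arbitrary product for a memorized square plus an easier multiplication. A tiny sketch:

```python
# The squares trick, generalized: lean on a memorized square (a*a) and a
# smaller multiplication. Distributive law: a*(a + (b - a)) = a*b.

def multiply_via_square(a, b):
    return a * a + a * (b - a)

print(multiply_via_square(12, 17))  # 204, same as 12*17
```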

I think you may actually be on to something here, because intellectually I know how to multiply those two numbers, but in practice, without paper, I'm going to drop digits at some stage and foul it up. I can do it perfectly on paper, and if I devote a large amount of my focus to hand waving and writing numbers in the air, I *may* be able to crank through it without paper, but there is still a decent chance of a computational error. It's like I need to output the results somewhere besides my short-term memory.

I'm pretty sure you've hit the nail on the head. The only problem - this isn't new and publishable like a "router in your brain". Miller's Magical Number Seven (Plus or Minus Two) [wikipedia.org] was published way back in 1956. It's easy to see how it applies to a multiple-step calculation like this.

It's mostly just evolution. Being able to recognize a friend (or enemy, for that matter) in a busy environment is much more valuable than being able to do three-digit multiplication from an evolutionary perspective. If we train people for it they're a lot better than the average person, but there's not much of a need for the ability to do it manually since there are calculators. If nature had selected on doing multiple-digit math calculations instead of escaping predators and forming communities to better survive, we'd all be much better at it.

Couldn't it just be that we do not really have direct access to the raw computational capacity of the brain?

Probably. You can scan a crowd because you have a hardware-level implementation for that; you can't multiply efficiently because that has to go through multiple levels of emulation, at least one of which has a severe lack of reliable memory.

We shouldn't forget that abstract thought is actually a very new evolutionary hack; we've only had a real culture for 10,000 years or so. Before that, it was cave paintings for 100,000 years. You can't expect a very experimental feature to be thoroughly optimized yet.

We also haven't been worried so much about exact numbers of things for much of that time, and matching faces against memories isn't that exact of an example.

You're likely to recognize someone who grew a mustache or cut their hair, or to ask someone familiar to you where they got a fresh scar rather than walking right past them.

You are also not likely to care exactly how many bushels of barley you raised until you start selling the grain for currency or protecting it from known thieves. So long as your granary doesn't run out before the next harvest, you have enough grain. Even when bartering or selling for currency, unless you do a lot of it you can estimate your reserves of unsold stock. Once you move to a mercantile economy rather than being your own producer of sustenance, though, knowing how much of something you have and what you can get in exchange becomes more important.

Building things takes a similar route to economics. If you're building small houses with a central hearth, the construction skills are much more important than anything numeric. Once you're building grand temples and fortifications, engineering kicks in.

Now for the car analogy. I'll hit both engineering and economics. Once you have the materials and power sources to make automobiles and airplanes, engineering and trial-and-error still play a role. If you build custom buggies or roadsters on the weekends, you can utilize hard engineering but you probably don't need to. If you're meeting specific crash safety, fuel economy, and profit margin goals for the design of a car model and its highly automated production process for a big mass-market car manufacturer, your numbers had better be right.

Do we have any idea how the brain goes about calculating stuff, and at what precision?

I would be surprised if it has a general-purpose math unit. It's more likely that there are some operations that have to be done, and it can do those very fast - e.g. sin(x)*y - but it can't suddenly switch to using another formula without major rewiring... and it might even be using table lookups instead of proper math.
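The table-lookup idea is easy to sketch in software (this is purely an illustration, not a claim about neurons): precompute sin at coarse steps, then answer queries by grabbing the nearest entry. Fast and cheap, but only approximately right.

```python
import math

# "Table lookups instead of proper math": a coarse precomputed sine table.
# The 256-entry resolution is an arbitrary choice for illustration.
STEPS = 256
TABLE = [math.sin(2 * math.pi * i / STEPS) for i in range(STEPS)]

def fast_sin(x):
    # Snap x to the nearest table entry instead of computing sin properly.
    i = round(x / (2 * math.pi) * STEPS) % STEPS
    return TABLE[i]

print(abs(fast_sin(1.0) - math.sin(1.0)) < 0.02)  # True: close, not exact
```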

I don't recall a proper citation, but I seem to remember that even identifying quantities at a glance goes something like "none, one, two, three, four, five or six, some, a dozen, a score, a few score, oh my that's a lot". The specific levels at which those change over can vary, of course. Some people probably would say "about ten" before they'd say "about a dozen", too.

One thing I've always liked about the Imperial measurement system, in fact, is that although the math is a little harder, the units and their sizes map onto everyday, human-scale things.

As a European, the Imperial system is pretty much Greek to me. For length we do indeed use those "unwieldy" meters/centimeters, but for the few measurements (like a person's height) where they don't quite fit, we use fractions of a meter. Before today I had never really thought about it, so I don't think it's much of a problem.

My exposure to imperial units is entirely through movies. I honestly can't say how far 5 feet is, or how heavy 5 stones are. I can guess that five feet is 1.2 meters and that five stones is half my own weight, but I wouldn't bet on it.

What system did you grow up using? I'm in the US. I can't imagine what a 20 degree C day feels like. I don't have a good idea how fast 42 kph is. But if you tell me you have 2 liters of something, I have no trouble envisioning it.

Try being a Canadian. We're caught between you guys and the rest of the world. So while my driver's licence has my height in metres and my weight in kilograms, I honestly can't think of anyone (myself included) who uses those units in real life. When the newscasts give reports on a person of interest, it's always in feet and pounds, because most people have no clue what a 1.75m, 80kg man looks like (but they can quite quickly imagine someone 5 foot 9, 176lbs). Yet small measurements of weight (for example, at any grocery store I've ever seen in Canada) are typically in grams or kilograms.

Speed and distance are usually given in kilometres (per hour), but older and/or rural folk still use miles because the entire township/range-road grid is still based on miles. So you have to know that driving 6 miles down the road is going to read as 10km on your odometer. But go to the drag strip and trap speeds are all given in mph.

Volume is usually in litres, but because the US is our largest trading partner, many industries still use gallons too (especially in bulk). When I worked for an oil distributor this was always something we had to watch out for, because our holding tanks were marked in litres, but everything we ordered from the US came in gallons. It was an important concept to understand when trying to calculate how many 20,000 gallon rail-cars of oil were needed to fill three 50,000 litre storage tanks.

Oh, and temperatures are mostly in Celsius, but a good portion of the population (especially older people) has something of a working knowledge of Fahrenheit. Typically, people know room temperature is about 72 (~23C) and that anything over 100 is "damn hot" (38C), usually from/for travel.

Interestingly, one of the places this all gets REALLY frustrating is cooking. While I just stated that temperature is usually in Celsius, almost everyone I know gives oven temperatures in Fahrenheit, which is funny because cooking always sounds really, really hot: a 300 degree oven sounds like a LOT, but in Celsius it's only 150, actually fairly cold to cook with. This is because so much of our media (cooking shows, books, magazines, etc.) is shared, yet so few of our small measurements are, so many recipes are given in units people don't always have a lot of experience with. I cannot count how many times I've been at the grocery store looking for an 8 fl-oz can of something and had to stand there scratching my head to rough it out in millilitres. Oh, and a quarter-pounder here is still a quarter-pounder; come to think of it, all the burger commercials I've ever seen have been in pounds. So much for small measurements in kilos/grams.

Anyways, the TL;DR version is that Canada has the most screwed-up measurement conventions of any country on the planet, hands down. The day the US switches to metric will be a very, very happy one for all Canadians. Not that I'm holding my breath. ;)

It depends whether you're talking about fuzzy math, as in Fermi problems, or more accurate stuff, because two different processes happen. The more accurate stuff is more taxing because it requires more attention, memory and care. You can round things off when you just need a ballpark.

11*56 = 560+56, or 616. But if you don't need exact precision, you can let 11 = 10 and add 60ish to the end product. It's not the correct answer, of course, but rounding like that stays close enough for a ballpark.
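The trade-off in that example can be written out, using the numbers from the comment:

```python
# Exact answer vs the comment's ballpark: round 11 down to 10, multiply,
# then pad with "60ish". Precision traded for fewer things to hold in your head.

exact = 11 * 56          # 616
estimate = 10 * 56 + 60  # 620: "let 11 = 10 and add 60ish"
error = abs(estimate - exact) / exact
print(exact, estimate, round(error * 100, 1))  # 616 620 0.6 (% error)
```

Under one percent off, for a fraction of the mental bookkeeping.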

1. "I think it is because the brain is at heart an analog instead of digital machine. Multiplying integer numbers however isn't a task well suited for analog machines."

The humble slide rule is a beautiful analog computer whose primary job is doing multiplication. A skilled user can do multiplication with one faster than he can use a digital calculator.
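The slide rule's principle is worth spelling out: it multiplies by adding lengths proportional to logarithms, since log(a) + log(b) = log(a*b). A sketch of that idea (a real slide rule would only give you about three significant figures, not the near-exact float result here):

```python
import math

# A slide rule adds two lengths proportional to log10 of the operands;
# reading the result off the scale is the inverse, 10**(...).
def slide_rule_multiply(a, b):
    return 10 ** (math.log10(a) + math.log10(b))

print(round(slide_rule_multiply(357, 289)))  # 103173, within float precision
```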

2. The brain isn't a digital computer, but it isn't really "analog" either. Individual neurons are either off (not firing) or on (firing), never something in between. But the *rate* at which they fire encodes information in a way that resembles neither analog calculating machines nor digital computers.

Comparing the human brain to *any* human technology, be it a digital computer or an analog calculator, is a massive category error.

I don't think the distinction is so much between analog and digital as between synchronous and asynchronous. The brain doesn't have a quartz crystal or a cesium atom telling it when a thought is over. It settles on a result, then sets a flag letting you know it's ready to read another input. In the meantime, some tasks take longer than others. Think of it as a CISC machine with no clock pulse and maybe some bus contention, rather than a tightly clocked synchronous RISC machine.

Also, it's pretty clear that certain parts of the brain tend to act as special coprocessors or at least NUMA general purpose processors. Your data is moving from one place to another with different locality.

Add to that the fact that to get precision you must let the data circuits settle before relying on them (the purpose of latches and a clock in most traditional computer processors) but that most of our lives are lead in approximations, and it's easy to see why we're poorly constructed to do precise calculations as quickly as approximations.

We can build computers to be much faster at rough approximations, with good accuracy but poor precision, than at precise answers, too. We usually don't, except for hard problems (NP-hard and the like), because having exact answers quickly is often the main advantage of using a computer.

Getting approximate answers even faster from the computer is only useful in certain situations. Oddly enough, many (but not all by any means) of these situations are things humans are already really good at on our own. The facial recognition used as an example in TFA is one. Maneuvering over rough ground, identifying close to optimal paths for the Traveling Salesman problem for a small number of inputs, or translating speech into text are all things most humans do pretty easily any time. They happen to be really difficult to do quickly with precision whether using a computer or not.

Luckily, we don't have to calculate the force of every footfall when we walk. Getting a close to optimal travelling route is much better than getting one of the worst options. For larger numbers of stops, a computer will do better faster than most humans on this problem, but that's because we know how to make the computer estimate, too. We tend to work with phonemes and with local context when working out the meaning of a sentence, and the best computer dictation and language translation systems (which are still lacking) do a lot of guessing and inferring based on context, too.
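The "close to optimal, fast" strategy described here can be sketched with the classic nearest-neighbour heuristic on a tiny, made-up Travelling Salesman instance (the points are arbitrary; this is how we "make the computer estimate, too"):

```python
import math
import itertools

# Five made-up cities for illustration.
points = [(0, 0), (2, 1), (5, 0), (3, 4), (1, 3)]

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def tour_length(order):
    # Total length of the closed tour visiting points in this order.
    return sum(dist(points[order[i]], points[order[(i + 1) % len(order)]])
               for i in range(len(order)))

def nearest_neighbour(start=0):
    # Greedy estimate: always walk to the closest unvisited city.
    unvisited = set(range(len(points))) - {start}
    order = [start]
    while unvisited:
        nxt = min(unvisited, key=lambda j: dist(points[order[-1]], points[j]))
        order.append(nxt)
        unvisited.remove(nxt)
    return order

# Brute force is only feasible because the instance is tiny.
best = min(itertools.permutations(range(len(points))), key=tour_length)
greedy = nearest_neighbour()
print(tour_length(greedy) <= 1.5 * tour_length(best))  # True: good enough, found fast
```

The greedy tour isn't optimal, but it lands within a modest factor of the best tour while doing a vanishingly small fraction of the work.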

We live in a sloppy world. We get mostly sloppy inputs and produce mostly sloppy outputs. Things work out fine most of the time that way, but we need precision for some of our own non-natural projects. Getting precise answers when you don't need them is wasteful of resources. It's no wonder that to survive we're very good at getting sloppy answers quickly. It's no use to wait and figure out which exact angle you need to run away from danger. Close to 180 degrees is pretty good.

Yep. India's Shakuntala Devi (known in those days as The Human Computer) as a girl used to challenge the mainframes of the 70s with such prodigious feats as multiplication of 2 massive numbers, and frequently pointed out correctly that the computer was wrong after assessing its answer.

As usual, nothing was made of this ability aside from its sideshow value, and no studies made of her brain capacity or computational methods.

Last I heard, she's reduced to making a living selling horoscopes and the like, if she's still at it at all.

The brain is not a digital computer in any useful sense. It has no clock, no real concept of "bits", either for data transmission or storage. Its elemental operations are best described in terms of message passing over a network, not in terms of math.

Yes, you can say that it can do tasks that only a powerful computer could perform, but that doesn't mean it's a powerful computer any more than a shark is a very powerful jet-ski. It's not a matter of "not having access" to "low level capability": at a low level, those capabilities simply aren't there in that form.

Yes, the underlying constructs are very different. But again, doesn't the concept of abstraction between high and low layers of processing still apply to any system that is made up of any form of discrete operations?

Neurobiology is a fascinating topic. Of course a brain is not digital. Neurons often have multiple connections (dendrites), emit more than one type of neurochemical signal, and often have more than one type of receptor. However, I can see the point that these neurochemicals are sent out in specific quanta and that a threshold needs to be exceeded to initiate a response. Thus, if we take not the neuron but the receptor type as the basic unit, we can see neurology in a digital aspect. I would still hesitate to push that analogy very far, though.

I'd say it's more that we just need practise to build up the connections to all the different areas of the brain. Much like when solving a mathematics problem, there are so many different ways of looking at it. Sometimes it helps to use algebra or to draw a picture or diagram; that makes use of the visual areas of the brain - around 30% in mammalian brains. For simple arithmetic we usually learn times-tables (1x1 = 1, 2x2 = 4, 3x2 = 6, ... 12x12 = 144). Doing this for certain larger numbers is easy too.

Couldn't it just be that we do not really have direct access to the raw computational capacity of the brain? There are savants and people who have trained themselves tremendously who can do arithmetic like this, using memory tricks and such. Wouldn't that be more like a hack to "reach down" to utilize the low-level capacity of the brain?

Which is to say, it's nothing specific to a brain, but a common occurrence in systems of any kind, including computers (quick, multiply 357 * 289 on your Wii... no, going to the browser and asking Google doesn't count).

People can learn methods to do fast math, and this guy [ted.com] (who specifically points out he's not mentally 'different') has books that teach you his methods.

The parent post has it exactly right. There's nothing new here, wild theories about "routers" and "traffic jams" aside. You can only keep 7 things in your head simultaneously, and multiplying 3-digit numbers takes more memory slots than that.

Which results in the weird memory tricks people have developed for doing large-number math in your head: breaking the problem into small chunks so you can work it seven numbers at a time. That's what the majority of them end up doing; they just get there by different paths.

They usually aren't memory tricks; what they do is break the problem down into weird rules that work for patterns.

For instance, there are different ways you can multiply 2 numbers under 20 together.

From memory: 13 x 19

Take the left-hand side and add it to the second digit of the right-hand side: 13 + 9 = 22
Add a zero behind the result: 220
Multiply the second digit from each set together: 3 x 9 = 27
Add that to the previous result: 220 + 27 = 247
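The steps above can be written out and checked. The trick works for any pair in 10..19 because (10+x)(10+y) = 10*(10 + x + y) + x*y:

```python
# The teens trick from the comment, step by step.
def teens_multiply(a, b):
    assert 10 <= a <= 19 and 10 <= b <= 19
    step1 = a + (b % 10)         # 13 + 9 = 22
    step2 = step1 * 10           # add a zero: 220
    step3 = (a % 10) * (b % 10)  # 3 x 9 = 27
    return step2 + step3         # 220 + 27 = 247

print(teens_multiply(13, 19))  # 247, same as 13 * 19
```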

Under the assumption that those recognition tasks are inherently memory-intensive, the brain has to have similar amounts of memory at its disposal.

I question the assumption you're making. The nervous system is not a computer in any useful sense: its elemental storage is not in bits, and its elemental operations are not bit logic. To compare its "specs" with a digital computer is to compare apples and oranges.

Example: pitch recognition. How does a computer recognize the pitch of a sound? An incoming audio signal has to be sampled and transformed before "pitch" even exists as a quantity; the brain does nothing remotely like that.
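For comparison, one common way a computer finds pitch (the DFT approach is my assumption, not necessarily what the commenter had in mind): sample the signal, transform it to the frequency domain, and pick the strongest bin. A pure-stdlib sketch on a synthetic 440 Hz tone:

```python
import math

# Synthesize one block of a pure 440 Hz tone at an 8192 Hz sample rate.
RATE, N = 8192, 1024
signal = [math.sin(2 * math.pi * 440 * n / RATE) for n in range(N)]

def bin_magnitude(k):
    # Magnitude of DFT bin k, computed naively (slow but dependency-free).
    re = sum(signal[n] * math.cos(2 * math.pi * k * n / N) for n in range(N))
    im = sum(signal[n] * math.sin(2 * math.pi * k * n / N) for n in range(N))
    return math.hypot(re, im)

# The loudest bin corresponds to the detected pitch.
peak = max(range(1, N // 2), key=bin_magnitude)
print(peak * RATE / N)  # 440.0 Hz
```

Each step (sampling, windowing into blocks, transforming) is an explicit numeric operation, which is exactly the point of the comparison.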

Actually, you're not entirely correct; the human brain is much more like old console hardware than a modern computer, because a lot of that stuff was done on consoles via registers. The programmer didn't have to do anything in particular other than write to or read from the appropriate register to have whatever it was done.

For instance, on the GBA, if you wanted to write to the screen you would select the correct register and give it the correct value, and the hardware would do the rest.

Ah! So the 7 items are pointers to some instance at some memory address in the storage part of the brain! Like a computer!

I'm not sure why everyone is getting so pedantic over the definition of "computer". Isn't it just a construct that, given a set of inputs, will give you some output? Maybe there's an expectation that the paths from input to output are reconfigurable. That kind of starts to sound like a brain. It processes raw data. It's programmable. Heck, even algorithms that work in the computer world often turn out to have loose analogues in how we think.

And the fact that our conscious short-term memory holds 7 "items", not bits -- the items can be digits, words, names, faces, or objects -- continues to show just how un-like a computer the brain really is.

You know actually, in computers, we call these "pointers" or "references";)

... the way math developed and was taught is not the only way to teach "math", this is one thing that I've learned as I've grown up. And I'm still doing much research in this area.

There are better ways to teach people how to do those computations, but it requires a conceptual understanding that there is not a "set" way of thinking about "numbers" (really our alphabet for communicating distinctions and differences), linking the way we naturally think with foreign languages developed by a narrow set of minds. See: Mayan numerals.

Notice how Mayan numerals represent themselves as geometric objects that are easily discerned at a glance, versus our highly compressed representational notation (1, 2, 3, 4). The Mayans knew that all numbers are made of distinct geometric distinctions, and hence they used simple uniform geometric objects as representation to communicate numbers "at a glance". Representation _matters_ to how we think about concepts, how we can use them, and how we map them between systems of thinking that only SEEM different on the surface.

You have to understand that numbers can be rethought as natural ratios of shape and size in the real world; when we measure things in the real world, we use arbitrary ratios of an object in regard to our own visual system.

For instance 357 by 289 can be broken down to

3.57 x 2.89

What you're trying to do is limit the number of elements by changing the ratio. You have to see "lots of things" as merely representations of smaller-scale things, and things get a lot easier once you understand this principle.

The whole way math is taught is really fucked up, made for a narrow range of particular minds that function and "get" how our mathematical system developed. If you begin studying the history of math, you realize that representation, and HOW YOU THINK about how we mathematize nature, matters a hell of a lot more than just throwing stuff other people figured out at kids in a symbolic format developed for a narrow subset of human minds.

Math is just a symbolic language to communicate our observation of distinctions and differences in regards to space, matter and time in the world.

Uh, when I was taught math in elementary school, concepts similar to the Mayan depiction were often used. The only difference I see is that this was all done in base 10 and not in a hybrid of base-5 embedded in base-20.

I'm not really sure what you're getting at. Sure, you can represent numbers as shapes and sizes, but I don't see how this really helps mental math except when it comes to order-of-magnitude calculations.

If I want to multiply 357x289, I can already tell you that the answer is somewhere around 90000. The challenge comes if I want to know the answer to more than 1-2 significant figures. I don't see how using something like the Mayan system or any other system is going to accomplish this.

In any case, I'm not even sure what the problem you're trying to solve is. The average person can do math well enough to get by in the real world. Sure, it would be nice to be able to walk down the aisle at the grocery store and figure out the per-unit prices in my head to 3 sig figs, but I don't see anything you're offering as accomplishing this. If I'm going to do a model simulation run I'm going to use a computer, and that requires almost zero mental effort around performing calculations - just a TON of creativity and analysis creating the model, etc.

If you're serious about understanding what I said... It's not something that can possibly be communicated easily without book length treatment and requisite reading of a lot of literature.

Without which, you won't get it because you won't be able to see the relationships because you don't have the requisite conceptual framework in your head to see how different areas link to one another.

But it has to do with how human languages and mathematics basically use a more arbitrary mapping between symbols and concepts than most people realize.

While math is a genuinely complex subject, I still think you should try to briefly elaborate on what you're talking about rather than just dumping links to books. I recently argued with someone about an economics subject. After making many unsubstantiated claims and accusing me of being "conditioned", his side eventually boiled down to "watch my ridiculously long video for my argument" (I scrolled through the video; it had some psychedelic stuff, movie outtakes, etc., but not anything I'd consider related to the actual argument).

"I still think you should try to briefly elaborate on what you're talking about rather than just dumping links to books"

If it were possible, I would already have done it; this is why I said it cannot be explained without BOOK-LENGTH treatment.

You are still under the illusion that reasoning operates in a UNIVERSAL MANNER. The whole point in reading those books is to see that reasoning is NOT UNIVERSAL, i.e. I can know things that are perfectly rational that you cannot understand, given the unique configuration of each person's mind.

"... the way math developed and was taught is not the only way to teach "math", this is one thing that I've learned as I've grown up. And I'm still doing much research in this area. There are better ways to teach people how to do those computations..."

My argument: Math is not these computations (arithmetic). Math is communicating patterns; it's abstraction, use of variables, theorems and proofs. So frankly, spending any time at all on improved numerical computation is a total waste in the modern era. We really ought to spend that time on the abstractions instead.

Thanks for sharing your insight. I have always been very frustrated by my experiences with math education. You seem to be one of the few people who understands that the representational notation isn't the math. The math is an abstract thing. It's "out there in the ether." I have a hunch that many people (myself included!) learned how to manipulate the representational notation without having any but the most vague concept of what the representational notation represents.

In the "I, Borg" episode of Star Trek: TNG, they took this idea a step further, trying to shut down the Collective by showing it an unsolvable geometric problem. And as with this kind of traffic jam, we can always refuse to go down that road, or even exit it if it takes too much time.

For fun, I'll record here how I surprised myself and managed to correctly multiply 357 by 289 in my head. I don't have a history of being able to do such things, and it did take me a couple of minutes, but I was pretty surprised to see bc confirm my answer.

I went with the route of breaking it down to 357 * 300 - 11 * 357. 357 * 300 is 35700 * 3. Even that is pretty hard, so I figured 35700 * 2 = 71400, then + 35700... 57 + 14 is 71, thus 1071, then add those two 0s back on to get 107100.
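The decomposition checks out step by step (the part bc confirmed):

```python
# The commenter's route: 357 * 289 = 357 * 300 - 11 * 357.
assert 357 * 300 == 35700 * 3 == 107100
assert 35700 * 2 + 35700 == 107100  # doubling then adding, as described
assert 11 * 357 == 3927             # the remaining subtraction term
print(107100 - 3927)                # 103173, which equals 357 * 289
```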

Have you heard of associative processing/memory? That's a CS concept with proofs of concept showing that we can construct machines like the human brain; but again, they could analyze a face in a fraction of a second, while multiplying is going to take some time...

Our memory gets things wrong all the time. We can scan a crowd and associate a face with someone who looks similar. When we multiply a number in our head, we're trying very hard to get the exact, correct answer. Perhaps the brain is just a lot better at fuzzy problems than those demanding strict correctness.

It might also be an input problem. Our number system is a very effective way to communicate math, but it may be a very foreign and sub-optimal way for the brain to process math. Maybe we need a new representation.

And recognizing a familiar face in a crowd might be a good survival skill for a species, so it knows when to flee. Whereas the math calculations are rarely fatal, so there's no driving force for the species to develop speed at this task. Now if everyone who was slow at the task were to be killed off, then those genetically able to do the math would be left to pass on those genes.

I am no expert, and this is just from a brief conversation I had many years ago with a neuroscience professor in an elective class, but I asked how it is that the brain does things in an instant that would likely take a powerful microcomputer most of a day, while simple multiplication is often quite difficult for me to do in my head.

The reason he gave is that the brain usually works in an imprecise manner. You have lots of different groups of neurons that your relatively plastic brain has wired up to do things like recognize certain patterns. If enough of those go high, other parts of your brain proceed as if there were certainty. That works well for evaluating how hard the steering wheel is pushing back and deciding how much more to stimulate muscles to contract. When you're doing something like math, though, there is only one very specific correct symbol. The parallelism of the voting system breaks down, and your brain has to check that all or almost all of those networks agree.

Here's an analogy to illustrate the category error people make when comparing the human brain to a computer:

"A Sony Walkman can record and play music in realtime, fast-forward and rewind, and store an hour's worth of music. These tasks require a 75 MHz processor and 100 megabytes of memory on an iPod Shuffle. Therefore, a Sony Walkman has a 75 MHz processor and 100 megabytes of memory."

Even better: A Sony Walkman can record and play music in realtime, fast-forward and rewind, and store an hour's worth of music. These tasks cannot be done by the human brain, therefore a Sony Walkman has more power than the human brain.

We had a family friend (he has passed now) who could go to a railroad crossing with a train going 60 miles per hour down the track and correctly add the 7-digit (or sometimes more) numbers on each train car as the train passed.

He said that he would not "add" the numbers but rather allow for them, arriving at a total more by letting the right answer come to him than by the conscious arithmetic manipulation the rest of us would have to do.

The whole thing was sort of spooky to behold... here we were writing down the numbers of each car, and he effortlessly knew the running total. It was as if he let the unconscious part of his brain observe each number and add it to the running total without interfering with the process mentally, and then his conscious mind would retrieve the answer from the unconscious at the end of the train, or after 20 cars had passed, or whatever terminating point he chose.

I have experienced this while creating algorithms for computer programming. Sometimes I don't need to think about it; I just sort of think about the relevant data and what I want to happen in general, and I can pluck what I need out of my mind.

You can scan a crowded lobby and pick out a familiar face in a fraction of a second, a task that pushes even today's best computers to their limit. Yet multiplying 357 by 289, a task that demands a puny amount of processing, leaves most of us struggling.

Isn't it a matter of how you define processing? For a computer it doesn't make any difference whether you process 5*5 or 123565*435456, because the internal representation of the numbers and the hardware paths are basically the same for both tasks. But our brain is a highly specialized device, specialized in tasks that helped us survive in the good old jungle.

So the task of multiplying 357 by 289 takes only a puny amount of processing for our computers, because our computers are specialized for such tasks, but for our brain it's the other way around.

What use was it in our evolutionary development to be able to multiply two three-digit numbers? That has been important to humans as a whole for, at most, two or three thousand years, and we could probably argue that it has really only been important for a few hundred.

On the other hand, how important has it been to be able to recognize faces in a crowd? Extremely so, at least since we started running around in groups. Gotta know who's friendly and who's not, right?

Asking why we can't do three-digit multiplication quickly even though our brains are complex is sort of like asking why a toaster can't tell you ratios of voltages even though it has resistors in it. It's the difference between what a machine does and how it works. Brains are fabulously complex, but one thing they weren't built for is three-digit multiplication. Does the brain "know" how to do multiplication really, really fast? Yes, of course; there are all kinds of things going on in the brain that involve multiplication. Does it know how to do it with numbers that come in through the ears and spew the answer out through your mouth? No, brains weren't built to do that. They were, however, built (so to speak) to do much more complicated (but different) things, like recognizing threats and understanding spoken language.

I don't know how good the router analogy will turn out to be, but it's not exactly breaking news that some things need attended, more-or-less serial processing, and that mental arithmetic is one of them. The things that don't need as much attention are things that are evolutionarily old and more or less built-in. Extremely overlearned tasks can fake it sometimes. Guys like Hal Pashler and Stan Dehaene are always making progress toward understanding how and why these things work, but the idea of processing bottlenecks in cognitive function is very old. The router analogy is probably a bad one, because it's unlikely that the brain's "router" lives in any very specific place. It's more likely a property of how the brain adapts to tasks it wasn't designed for.

My grandmother, while she was still alive, could do these kinds of tricks in her head in a few seconds. She could multiply 2-, 3-, 4- and 5-digit numbers, divide, and even take roots, all in her head. The day she finished high school, the war started, so instead of becoming a teacher she ended up making tank gun rounds; after the war she worked as a food store clerk, then an accountant, and eventually head accountant for a number of stores at the same time (this was the old USSR). Most of her life she was around numbers. In the stores, even into the 1980s, they didn't have calculators or electronic machines; they used abacuses. She would calculate everything in her head in seconds and announce the result; the buyers wouldn't believe her and would ask her to show them on the abacus, so she did. I cannot say I ever heard of her being wrong about a calculation.

I believe she remembered a lot of the calculations ahead of time, so she nearly knew the results (pre-cached them) and then worked out the small differences. I don't have that cache of numbers, but 2- and 3-digit numbers I can do fairly quickly.
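The "pre-cached results" idea sounds a lot like memoization in programming: look up a remembered nearby product, then patch up the small difference. A toy sketch (my own analogy; `cached_multiply` and the remembered entry are invented for illustration):

```python
# Toy sketch of "pre-cached results" as memoization.
cache = {(290, 357): 103530}  # a "remembered" nearby product

def cached_multiply(a, b):
    # Look for a remembered product sharing the second factor,
    # then adjust for the small difference in the first factor.
    for (ca, cb), known in cache.items():
        if cb == b:
            return known + (a - ca) * b
    return a * b  # no cached neighbor: compute from scratch

print(cached_multiply(289, 357))  # 103530 - 357 = 103173
```

The cache lookup is cheap and the correction is a one-digit multiply, which is presumably why a big mental stock of known products makes the whole thing feel instantaneous.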

289 and 357 to me is (3570 - 357) + (35700 - 3570 * 2) + 35700 * 2. So the only difficulty here is making sure I don't screw up the subtractions, and those are just a matter of paying attention.
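The decomposition checks out: the three terms are 357*9, 357*80, and 357*200, which together cover the 289. A quick sanity check (just the arithmetic above restated in Python):

```python
# The three terms above, annotated:
nine_times   = 3570 - 357        # 357 * 9   (the 9 in 289)
eighty_times = 35700 - 3570 * 2  # 357 * 80  (the 8 in 289)
twohun_times = 35700 * 2         # 357 * 200 (the 2 in 289)

total = nine_times + eighty_times + twohun_times
print(total, total == 289 * 357)  # 103173 True
```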

I don't know about that; it does make some sense, though. It allows us to use visual memory to do calculations (like when we play chess without the board). I actually can imagine an abacus; it's nearly the same as imagining hands and fingers, and it's easy to use that to do binary, by the way.

But in case of my grandmother, she remembered a LOT of numbers just like that, because you know, decades of experience all around numbers.

I think I saw the PBS special that covered what was mentioned. There is a school in Asia (Japan? China? India? I don't remember; it has been a while since I saw the special) where the students start at a young age using an abacus. They learn to do complex calculations quickly. Once they reach a high speed, the abacus is taken away and the students use an imaginary one. Stage 3? They begin limiting the finger twitching until the abacus exists only in the visuospatial sketchpad and "muscle memory". Although more challenging for an adult learner, with enough years even an adult could learn this method. The advantage of the abacus is manipulating larger numbers than some of the "finger" tricks allow; but essentially these schools reduce it to just that, minor finger twitches that trigger a mental image of an abacus.

Chunking to optimize usage of working memory is pretty impressive. Think about how we teach kids to decompose the problem of 289 * 357. We essentially tell them to break it into 4 problems: x = 289 * 7, y = 289 * 5 * 10, z = 289 * 3 * 100, and x + y + z. However, we then teach students to do the same with each of the 3 subproblems, each of which becomes 4 calculations (289 * 7 is a = 9 * 7, b = 8 * 7 * 10, c = 2 * 7 * 100, and so on). Thus we have 13 problems to solve, while the typical capacity of working memory is 5-9 items. By creating the mental abacus, the person conducting the calculation makes it fit inside the limits of working memory.
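The schoolbook decomposition described above, written out (this is just the poster's breakdown restated in Python):

```python
# Top level: one partial product per digit of 357.
x = 289 * 7        # ones digit
y = 289 * 5 * 10   # tens digit
z = 289 * 3 * 100  # hundreds digit
assert x + y + z == 289 * 357

# Each partial product decomposes the same way, e.g. 289 * 7:
a = 9 * 7          # ones
b = 8 * 7 * 10     # tens
c = 2 * 7 * 100    # hundreds
assert a + b + c == 289 * 7

print(x + y + z)  # 103173
```

Every variable here is one "item" competing for a working-memory slot, which is exactly why 13 of them overflow a 5-9 item capacity.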

I could not do the problem mentally. However, when I looked at it I said 289 * 357 is about 300 * 350, or 105,000, and since rounding 289 up by 11 outweighs rounding 357 down by 7 on two similarly sized numbers, I would expect my estimate to be slightly high. For most cases where mental calculation is needed, an error of around 2% isn't too bad.
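For what it's worth, the rounding estimate and its actual error (a quick check of the arithmetic above):

```python
exact    = 289 * 357             # 103173
estimate = 300 * 350             # 105000
error = (estimate - exact) / exact
print(estimate, f"{error:.1%}")  # 105000 1.8%
```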

Which gets the last three digits all in one place. If you can remember the three-digit answer to 3*46, there's not much else to remember except the term you're working on while producing either the top three or bottom three digits in isolation.

Look for a book called "FingerMath." It teaches how to use your fingers like an abacus. After you get used to it you can stop moving your fingers and just kind of "feel" the calculations. No, I never practiced it enough to get good at it. But it is a pretty good book.

Well, during the war there were plenty of women in the armed forces; you couldn't really tell them 'no'. But she wasn't in the army; she was working, like most women and children, at a factory, building weapons.

He said she was making tank gun rounds. That was pretty common during the war; the US had Rosie the Riveter as propaganda promoting that kind of role.

In the USSR they served in combat, too. It was accepted somewhat reluctantly, but quite a few volunteered, and initial losses gave a reason for giving it a try. They turned out to make really awesome snipers [wikipedia.org].

While it wasn't a feminist paradise, in the mid-20th century the USSR was in many ways far more open to women than the US was, a by-product of Soviet political ideology. That was part of the cultural "evil" it represented to conservative Americans.

While your grandmother may have had her own way of doing this, complex calculations can be done very quickly using the Trachtenberg system of mathematics [wikipedia.org].

I actually have the book and swore to myself that (while I didn't need those computational skills) my kids would be taught it... my first is on the way now so I guess it's time to dust it off (the book... not the child).
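As a taste of the system: its best-known rule, multiplying by 11, says each digit of the answer is the original digit plus its right-hand neighbor. A rough sketch of that rule as I understand it (the `times_eleven` function is my own; carries are handled explicitly):

```python
def times_eleven(n):
    """Trachtenberg's multiply-by-11 rule: each answer digit is the
    original digit plus its right-hand neighbor (with carries)."""
    digits = [int(d) for d in str(n)]
    padded = [0] + digits + [0]  # phantom zeros on both ends
    out = []
    carry = 0
    # Work right to left: digit + its right-hand neighbor.
    for i in range(len(padded) - 1, 0, -1):
        s = padded[i - 1] + padded[i] + carry
        out.append(s % 10)
        carry = s // 10
    if carry:  # a final carry spills into a new leading digit
        out.append(carry)
    return int("".join(str(d) for d in reversed(out)))

print(times_eleven(321))    # 3531
print(times_eleven(98765))  # 1086415
```

The appeal is that no step ever requires more than adding two digits and a carry, which is exactly the kind of load a human head can handle.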

This is how it works in my head. I can look at the problem and tell you almost instantly that the answer is about 100,000 (33,333 1/3 * 3), and in a few seconds tell you that the answer is probably closer to 105,000 (35,000 * 3). Are these answers 'close enough' to solve the real problem at hand?