Why technology might not make children stupid, after all

This is a very old argument, dating back (at least) to 370-ish BC, when Plato wrote the Phaedrus. Like the better-known Republic, Phaedrus is written as a conversation between the character of Socrates and other people. At one point, Socrates tells a legend of an Egyptian god who invents writing and tries to give the gift of the written word to a wise king. The king is ... less than enthused.

For this invention will produce forgetfulness in the minds of those who learn to use it, because they will not practice their memory. Their trust in writing, produced by external characters which are no part of themselves, will discourage the use of their own memory within them. You have invented an elixir not of memory, but of reminding; and you offer your pupils the appearance of wisdom, not true wisdom, for they will read many things without instruction and will therefore seem to know many things, when they are for the most part ignorant and hard to get along with, since they are not wise, but only appear wise.

Basically, all these damn books are going to make the kids dumb. This is my go-to story whenever somebody is fretting too much about how the Internet will totally make kids stupid. But journalist Annie Murphy Paul has found an even better argument against techno-fear. At her blog, she quotes an interview with Jay Giedd, a researcher at the National Institute of Mental Health:

Interviewer: So how well are our children handling multi-tasking in a digital age that changes, seemingly, by the hour? Early evidence suggests: pretty well. In fact, the human brain has a track record of successfully adapting to challenges it wasn’t initially designed to take on—such as reading.

Giedd: It’s sobering to realize most humans that have lived and died have never read. And so, we’ve been able to change what our brain does based on having the written word and having this environment. And so now the question is will we be able to change to keep up with the new flood of information coming from all kinds of sources. And up until now the human brain has done a great job of changing—adapting to these environments, but there are limitations to this capacity. And so it will be very interesting to see that these so-called digital natives… the children that have grown up never not knowing the multimedia devices… whether their brains will be able to adapt differently than older people.

You can read a larger excerpt of the interview on Paul's site, but the general gist is this: We might be missing the point when we worry about whether technology has gotten ahead of what our brains evolved to do. What our brains evolved to do is adapt. New technologies change the way we think—the shift from memorization to reading certainly did that. But that's not the same thing as making us stupid or stifling our capacity for creative thought. Instead, we take these tools and we find new ways to be creative. We take the tools and we use them to expand our knowledge of the world. It's what we did with books. Maybe we'll do the same thing with the Internet-rich, multi-tasking world we're building now.

“they are not wise, but only appear wise” Obviously there’s nothing wrong with collecting knowledge outside our collective brains, but there’s something to this. People think they’re well-informed and acting on ‘factual’ information when they, say, read something on Boing Boing or hear it on the news, and these things aren’t necessarily true. Most of us are regurgitators, who don’t conduct our own critical analysis of the information deluge we receive daily. The wise king was onto something…

Memorizing something because you were told it is no different from memorizing it from a book. The method of transmission isn’t the problem; it’s the source you get the data from.

The first great leap from writing was a sudden reduction in transmission errors in passing that knowledge along; the second great leap was making that knowledge transfer asynchronous and without temporal bounds.

None of that helps if the person you’re getting a knowledge transfer from is an idiot.

I was just talking about how books, wikis, etc. are all very well and good, but they’re useless if you blindly accept them as truth. Technology isn’t the problem (I’m the last guy to hate on technology) but teaching critical thinking, independent study, etc. is definitely a problem we have! Just think of how many computer classes teach “How to use Office” – in a year that knowledge is useless. I’ve got a shelf full of 90s programming books that are now essentially useless. We need to be teaching kids how to learn, that is, how to effectively leverage technology to figure shit out for themselves. We put too much focus on learning facts, we ought to be focusing on learning how to delve down and get as close to the real truth as possible, using every tool at our disposal. High school kids graduate with a bunch of dates and mathematical and chemical formulae memorized… but half of them can’t formulate an effective Google search. See what I’m sayin’ ? :)

It depends on the subject. Good science instruction should be about analysis from the very beginning. The final stages of good literacy education are all about analysis, but in the beginning children have to memorize basic words. I’m a foreign language teacher, and my subject requires ongoing intensive memorization in order to get to application, which is the mental operation at the heart of my subject.

I picked that photo because it was the first shot I found on Flickr where there was more than one kid, and you couldn’t see what they were looking at on the screen (I didn’t want to have the comments full of critiques of what the kids in the photo were using tech to do).

I probably should have considered the subtext here though, too. My apologies.

Minecraft is actually a great example of how kids can get up to good things online. I run a Minecraft server for my son and his friends. They have collectively built some amazing things in there, collaborating all the way.

My now 12-year-old has been doing the same thing for over a year. Then he shows me his creations, some of the elaborate obstacle courses. And because he plays on a local server run by a “friend” who leans towards bullying tendencies online, my son has to hide his creations, or at least the egress to them. Yet more problem solving. His use of the limited available resources is pretty remarkable to me, and I have a very active problem-oriented way of thinking.

Isn’t the issue that new technologies blind us to the advantages of the old ones? Someone with a truly prodigious memory has advantages over someone who does not, and can still learn to read and write. And if they do read, and they’ve got an extensive memory of classical literature, they’re far more likely to understand the allusions in the text they’re reading, and so have a richer understanding of the text and subtext. This isn’t even dependent on the author having a conscious understanding of the allusions in their own text – language organizes itself into tropes that we use freely, but those with a memory of many re-uses of a trope may know more about what the author is saying than the author who uses those tropes naively.

New technology is usually presented as a reason not to use the old one, and that’s the issue, as I see it.

I find that being someone who grew up using the internet in some capacity has caused me to be very good at remembering certain things and not so great at remembering others.

I can remember how to do something very well, or figure out how to do something challenging. But my memory for people, places, and things is completely horrible because the knowledge available on the internet has replaced this necessity. The words I no longer need because they’re just a quick google away.

This doesn’t mean that I’m unintelligent by any means, but it means that I seem unintelligent or forgetful in certain situations; put me in the right spot, though, and I’ll fly past most.

I sometimes wonder if taking a break from the internet would help this. It’s something to consider for ourselves and our children.

I feel that every technological environment comes with its tradeoffs because we as humans are always honing some set of skills while we ignore others. While I outsourced my need to remember particular events and details, I was able to focus that effort instead on building conceptual models which seem to make learning easier.

For instance, I never took notes in my AP physics class, but instead focused all my effort on making each lesson fit into my overall understanding of mechanics and electromagnetism. If I forgot an equation or technique, I knew I could find that information online. I was the only person who got 5s on both of the tests.

This particular way of learning is now proving to be useful for web programming, where the details are always changing but the ideas are more or less staying the same. I learned C, and now I mostly know how PHP and JavaScript work. I learned a little bit about how information is transferred, and now I understand the basics of securing data. If there is anything I have trouble seeing, I can usually experiment with it right there.

The tradeoffs may be asymmetric, however. I’ve certainly screwed myself out of stuff like statistics, where the whole point is that you can’t see the whole picture without processing the details, and I have a hell of a lot of trouble writing because the concepts that go into it are really loose. I sometimes wonder how much of an edge I actually have.

This is also a parable about the roots of knowledge, and the importance and the workings of dialog between people. Few books bring into play the full weight of dialog, where it is defined as oral communication and the only or the principal tool for interaction. For that matter, few media products do either, whether we are pointing to the internet or not. Please don’t misread an important parable on the need to refer constantly back to oral tradition as the gold standard for writing, or for any form of culture. It would be like misconstruing the history of the printing press, or for that matter, of WYSIWYG.

Although I am not lettered in the subject quite yet, I feel like there is certainly something to say for the fact that non-electronic children’s games (human or otherwise) tend to ape necessary survival skills, and that actual physical action is required to develop, strengthen, and maintain the neural networks required to perform these actions in a seemingly intuitive manner. I am without child, so I’d like to highlight that I’m speaking solely from a place of conjecture on this one–god knows what children get up to these days, but there is a vast cognitive difference between playing a first-person-shooter type of game, and say, actually learning how to shoot a bow and arrow or a firearm, or just good old fashioned hurling rocks at power lines (sue me, there was very little to do growing up in Ohio during the 80s).

Also, I think it is worth mentioning that while children learning to interact through an electronic medium does provide valuable skills w/r/t our currently technologically oriented culture, we should also keep in mind (as Ms. Koerth-Baker appears to be fond of pointing out) that not only are electricity and electronic devices far from ubiquitous in terms of human experience, but that even the infrastructure that carries the life-blood necessary for this culture to perpetuate itself exists on terrifyingly shaky ground.

I’m not opposed to technology–language and my toilet are two of my favorite things in the world–and I certainly don’t mean to come off as libertarian, but if you can’t support yourself, if you can’t survive in the woods, then how much do you really know?

We live in the near-city burbs, so Scouting has given my 12 y.o. son opportunities with bows, bb guns, hatchets, fire-making, cooking meals on said fire, etc. And his 5 years of soccer help with spatial relations, vectors, athletics, team play and sweat. His propensity to learn new vid games without using the tutorials is pretty amazing to watch, and I have yet to uncover any downsides to his development. And, I might add, I do enforce a 30 minutes per day rule for good old fashioned book reading for pleasure.

I’m 24, and while I’m not of the generation that is growing up with the established web, I have been using a computer since the age of 3 or 4, and using the internet since around age 8 (although it wasn’t really until middle school that I got really into the web or computers). Looking around at my peers, I feel like we have enormous advantages because we have grown up with so much technology available.

If I have a question, I Google it. This is second nature for my peers and me. I learned how to type by chatting with my friends. In college, when presented with an assignment I knew nothing about, I would turn to Wikipedia, and 5-10 minutes later, have enough of an understanding of the topic to go off and easily find peer-reviewed sources.

The road to knowledge is becoming shorter, with fewer stops along the way, and that can only be a good thing.

You kids these days are spoiled! Why, in my day the internet was made out of bones and rocks. You had to fight a Tyrannosaurus just to find out who wrote “The Birthday Party.” Except we called him the End Boss.

I feel like this obscures a different question, which is whether the ubiquitous technology is helping raise an ADD generation.

Just from my sample of urban, educated 30-somethings, I think many of us have slight issues with technology addiction or ADD — look at me, I’m writing in a BoingBoing comment instead of programming. And this is for people who did *not* grow up with internet at all. The closest people my generation got to “needing to check your email/texts every ten minutes” was probably playing with a tamagotchi at age 12.

Will it be worse for kids growing up with cellphones, email, and online forums (e.g., YouTube comments) that you need to return to every ten minutes to rebut the latest dunce who is debating you?

Certainly high schools already have problems with kids checking their cell phones under their desks every ten minutes.

So whether or not they are *actually* dumber/smarter doesn’t seem to be the issue, to me. My question is, will any reasonable percentage of these kids be able to sit down and tackle a serious, hard, weeks-long problem that requires focus and concentration?

Plato’s text makes more sense when considered within the context of the memory methods of classical rhetoric. These methods were not rote memorization. Rather, they were creative visualization techniques combined with a kind of practical, non-linear way to remember the components of an argument. See Frances Yates’s book The Art of Memory for a fascinating look at how these techniques were applied in the Renaissance.

I agree with you, Maggie, that it’s important not to have a knee-jerk reaction to technology. But we do need to look critically at the notion that every adaptation to new tech is necessarily good. And we can also learn from the past to better refine technology. What if, let’s say, PowerPoint had the fluidity of a classical memory palace, i.e. could change on the spot to adapt to the mood of an audience or a particularly pointed question? What if GPS technology in smartphones also taught us the subtle clues that indigenous people use to navigate (think of the ancient Polynesians’ ability to read water currents to navigate between far-flung islands without a compass or map)? What if technology simultaneously helped us internalize memory as well as store it externally?

Why are there references to our brains not being “evolved” enough to handle it? The interviewer states that: “In fact, the human brain has a track record of successfully adapting to challenges it wasn’t initially designed to take on—such as reading”. “In fact,” what was our brain’s “initial design”? Obviously, given that we’re reading this, it was in the ‘design’.

And what is technology ahead of? What’s the comparison? Number of words stored, the number of math calculations, lifting heavy things? Writing a novel, a song, a joke? Painting a picture, designing a bridge or a Gothic cathedral?
Sure, it can crank out a gazillion bits of information at us, but what our brain is uniquely designed to do is filter and pattern-match. If it’s useful, we pay attention; if it’s not, we ignore it. The more you are exposed to the noise out there (and true, that’s a lot, and the basic point of the article), the more your brain knows what to ignore.

Now, if you argue about the ‘dependency’ on technology, say GPS, cameras, kindles, etc. I fully agree, but that’s more a choice on our part, and likely a bad one. Personally, I think map reading should be a required class in high school :)

It’s more that the brain gets used to ignoring information and assumes that it’s just a Google search away. As a result, you start to get forgetful, and it becomes harder to remember things overall, even when you know you have to remember them.

And the brain hasn’t evolved much in the last couple of thousand years; I can only assume it was designed to see fast things coming at us, to communicate with others of our species, and for the complex abstract thought that helped us build tools. It’s really a huge question, better left to evolutionary biologists to answer.

Isn’t it important to consider how quickly new technologies become dominant? Moving from an oral to written culture took many generations. The digital revolution has happened in less than one generation.

Hmmm, the whole argument feels like hair-splitting to me, a syndrome that I particularly loathe. I mean, what is really the difference between “stupidity” and “unconsciously being taught to become bored faster?”

Nicholas Carr presents a pretty compelling argument in “The Shallows” that we’re not becoming dumber as a result of technology, but our brains are becoming less able to do deep thinking. Also, it’s not just the kids, it’s everyone who uses the internet and modern technology.
The culture of distraction is changing the way our brains work.

Agree or not, it’s a good read and it pretty much scares the hell out of me.

This is nothing more than another incarnation of, “These damn kids today have no discipline. We were never that dumb/shallow/lazy.”

Technology doesn’t make us stupid; it makes us adapted to different needs. Google, Wikipedia and the rest of the internet make perfect, encyclopedic recall less important than the ability to recall overarching principles and concepts that we can then do a detail search for.

Is that a different adaptation? Certainly. Does it make functioning in a different environment more difficult? Certainly, but all of life is like that. We can’t breathe water, either – nor survive long in the presence of significant carbon monoxide. That doesn’t, however, make our lungs “worse” than gills or labyrinth organs. Our adaptation comes with a cost. The fact that one changes on the scale of eons, and the other on the scale of generations does not change that simple reality.

Much as we might pretend otherwise, Calculus and The Gallic Wars in high school have always been the province of the wealthy and the gifted – never the norm. There have always been deep, insightful thinkers, and there have always been shallow, indifferent ones. Newton, Copernicus, Einstein, Bohr, Turing, Berners-Lee have ALWAYS been the exception, not the rule, and that will not change because of the technologies we use to drive our societies.

If we are getting dumber, it is not because of technology. It is because we are focused on stuffing kids’ heads full of facts instead of teaching them how to think.

Technology doesn’t make people dumber. But it does CHANGE people, and the argument that it is changing people for the better is flimsy. For every piece of progress you can thank technology for (e.g., the printing press and widely available books) you can find a big overall loss to humanity as well (people could memorize long, long passages of story, or sequences of music, allowing them to become much more intimate with the work than we ever get today.)

Since Boing Boing loves to use historical examples to prove a point, here’s one: we rushed into the Industrial Age thinking the mass production of machines was a brilliant step forward, and we are now realizing all the terrible consequences of such actions. Our current love affair with personal gadgets is doomed in the same way. Sometime in the future we will realize, too late, perhaps, that while we’ve adapted well to managing the over-bombardment of information at our fingertips, our minds are a lot less healthy and lack a necessary balance and perspective on what is important that previous generations enjoyed without even realizing it. There are a lot of already-clear disadvantages to letting your kids rely on computers as a daily way of life; I don’t need to wait for the research to prove me right. Being able to think critically and creatively does not depend on a computer, and, in fact, is hampered by the limited scope of the virtual world when compared to the immensely varied scope of the natural one. Good luck beating nature at its own game.