* “Do We Really Need Negative Book Reviews?” I tend to answer “Yes, with qualifications,” and indeed I write many fewer negative reviews than I once did. Then again I write many fewer reviews in general than I once did.

How could DuckDuckGo, a tiny, Philadelphia-based startup, go up against Google? One way, its founder wagered, was by respecting user privacy. Six years later, we’re living in the post-Snowden era, and the idea doesn’t seem so crazy.

“Despite being a denizen of the digital world, or maybe because he knew too well its isolating potential, Jobs was a strong believer in face-to-face meetings.” That’s from Walter Isaacson’s biography of Steve Jobs. It’s a strange way to begin a post about notebooks, but Jobs’ views on the power of a potentially anachronistic practice apply to other seemingly anachronistic practices. I’m a believer in notebooks, though I’m hardly a Luddite and use a computer too much.

The notebook has an immediate tactile advantage over the phone: it isn’t connected to the Internet. It’s intimate in a way computers aren’t. A notebook has never interrupted me with a screen that says, “Wuz up?” Notebooks are easy to use without thinking. I know where everything I’ve written on the go over the last eight years is: in the same stack. It’s easy to draw on paper. I don’t have to manage files, and I have yet to delete something important. The only way to “accidentally delete” something is to leave the notebook submerged in water.

A notebook is the written equivalent of a face-to-face meeting. It has no distractions, no pop-up icons, and no software upgrades. For a notebook, fewer features are better and fewer options are more. If you take a notebook out of your pocket to record an idea, you won’t see nude photos of your significant other. You’re going to see the page where you left off. Maybe you’ll see another idea that reminds you of the one you’re working on, and you’ll combine the two in a novel way. If you want to flip back to an earlier page, it’s easy.

The lack of editability is a feature, not a bug, and the notebook is an enigma of stopped time. Similar writing in a computer can function this way but doesn’t for me: the text is too open and too malleable. Which is wonderful in its own way, and that way opens many new possibilities. But those possibilities are different from the notebook’s. It’s become a cliche to argue that the technologies we use affect the thoughts we have and the way we express those thoughts, but despite being cliche the basic power of that observation remains. I have complete confidence that, unless I misplace them, I’ll still be able to read my notebooks in 20 years, regardless of changes in technology.

In Distrust That Particular Flavor, William Gibson says, “Once perfected, communication technologies rarely die out entirely; rather, they shrink to fit particular niches in the global info-structure.” The notebook’s niche is perfect. I don’t think it’s a coincidence that Moleskine racks have proliferated in stores at the same time everyone has acquired cell phones, laptops, and now tablets.

In The Shallows, Nicholas Carr says: “The intellectual ethic is the message that a medium or other tool transmits into the minds and culture of its users.” Cell phones subtly change our relationship with time. Notebooks subtly change our relationship with words and drawings. I’m not entirely sure how, and if I were struggling for tenure in industrial design or psychology I might start examining the relationship. For now, it’s enough to feel the relationship. Farhad Manjoo even cites someone who studies these things:

“The research shows that the type of content you produce is different whether you handwrite or type,” says Ken Hinckley, an interface expert at Microsoft Research who’s long studied pen-based electronic devices. “Typing tends to be for complete sentences and thoughts—you go deeper into each line of thought. Handwriting is for short phrases, for jotting ideas. It’s a different mode of thought for most people.” This makes intuitive sense: It’s why people like to brainstorm using whiteboards rather than Word documents.

I like to write in notebooks despite carrying around a smartphone. Some of this might be indicative of the technology I grew up with—would someone familiar with smartphone touchscreens from age seven have sufficiently dexterous fingers to be faster than they would be with paper?—but I think the obvious answer to “handwriting or computer?” is “both, depending.” As I write this sentence, I have a printout of a novel called ASKING ANNA in front of me, covered with blue pen, because editing on the printed page feels different to me than editing on the screen. I write long-form on computers, though. The plural of anecdote is not data. Still, I have to notice that using different mediums appears to improve the final work product (insert joke about low quality here).

There’s also a shallow yet compelling reason to like notebooks: a disproportionate number of writers, artists, scientists, and thinkers like using them too, and I suspect that even contemporary writers, artists, scientists, and thinkers realize that not being connected is sometimes useful, like quiet and solitude.

Westerners have long been keenly interested in horology, as David Landes, an economic historian, points out in Revolution in Time, his landmark study of the development of timekeeping technology. It wasn’t the advent of clocks that forced us to fret over the hours; our obsession with time was fully in force when monks first began to say their matins, keeping track of the hours out of strict religious obligation. By the 18th century, secular time had acquired the pressure of routine that would rule its modern mode. Tristram Shandy’s father, waiting interminably for the birth of his son, bemoans the “computations of time” that segment life into “minutes, hours, weeks, and months” and despairs “of clocks (I wish there were not a clock in the kingdom).” Shandy’s father fretted that, by their constant tolling of the hours, clocks would overshadow the personal, innate sense of time—ever flexible, ever dependent upon mood and sociability.

The revolution in electronic technology is wonderful in many ways, but its downsides—distraction, most obviously—are present too. The notebook combats them. Notebooks are an organizing or disorganizing principle: organizing because one keeps one’s thoughts in a single place, but disorganizing because one cannot rearrange, tag, and structure thoughts in a notebook as one can on a screen (DEVONthink Pro is impossible to replicate in the real world, and Scrivener can be replicated only with a great deal of friction).

Once you try a notebook, you may realize that you’re a notebook person. You might realize it without trying. If you’re obsessed with this sort of thing, see Michael Loper / Rands’ Sweet Decay, which is better at validating why a notebook is important than at evaluating the notebooks at hand. It was also written in 2008, before Rhodia updated its Webbie.

Like Rands, I’ve never had a sewn binding catastrophically fail. As a result, notebooks without sewn bindings are invisible to me. I find it telling that so many people are willing to write at length about their notebooks and use a nominally obsolete technology.

Once you decide that you like notebooks, you have to decide which one you want. I used to like Moleskines, until one broke and I began reading stories online about their highly variable quality.

So I’ve begun ranging further afield.

I’ve tested about a dozen notebooks. Most haven’t been worth writing about. But by now I’ve found the best reasonably available notebooks, and I can say this: you probably don’t actually want a Guildhall Pocket Notebook, which is number two. You want a Rhodia Webnotebook.

Like many notebooks, the Guildhall starts off with promise: the pages do lie flat more easily than alternatives. Lines are closely spaced, maximizing writable area, which is important in an expensive notebook that shouldn’t be replaced frequently.

I like the Guildhall, but it’s too flimsy and has a binding that appears unlikely to withstand daily carry. Mine is already bending, and I haven’t even hauled it around that much. The Rhodia is somewhat stiffer. Its pages don’t lie flat quite as easily. The lines should go to the end of each page. But its great paper quality and durability advantage make it better than the alternatives.

The Rhodia is not perfect. The A7 version, which I like better than the 3.5 x 5.5 American version, is only available in Europe and Australia, which entails high shipping costs. The Webbie’s lines should stretch to the bottom of the page and be spaced slightly closer together. The name is stupid; perhaps it sounds better in French. The notebook’s cover extends slightly over its paper instead of aligning perfectly. Steve Jobs would demand perfect alignment. To return to Isaacson’s biography:

The connection between the design of a product, its essence, and its manufacturing was illustrated for Jobs and Ive when they were traveling in France and went into a kitchen supply store. Ive picked up a knife he admired, but then put it down in disappointment. Jobs did the same. ‘We both noticed the tiny bit of glue between the handle and the blade,’ Ive recalled. They talked about how the knife’s good design had been ruined by the way it was being manufactured. ‘We don’t like to think of our knives as being glued together,’ Ive said. ‘Steve and I care about things like that, which ruin the purity and detract from the essence of something like a utensil, and we think alike about how products should be made to look pure and seamless.’

I wish the Rhodia were that good. But the Rhodia’s virtues are more important than its flaws: the paper quality is the highest I’ve seen, and none of the Rhodias I’ve bought have broken. If anyone knows of a notebook that combines the Rhodia’s durability with the qualities it lacks, by all means send me an e-mail.

EDIT: See also Keith Devlin’s The Death of Mathematics, which is about the allure of math by hand rather than by computer; though I don’t endorse what he says, in part because it reminds me so much of Socrates decrying the advent of written over oral culture, I find it stimulating.

Design is hard to do. Design is not art. But design has some of the requirements of art. The achievement of greatness in art or design requires passionate virtuosity. VIRTUOSITY means thorough mastery of craft. PASSION is required to focus human effort to a level that transcends the norm. Some guitarists have passion, especially young ones. Some have virtuosity, especially old ones. Some few have both at once, and during some mortal window of superb achievement, they are great guitarists.

That’s from Bruce Sterling’s Shaping Things, and I admire the distinction between design and art, which overlap to some extent but not totally; his point about “passionate virtuosity” is one I’ve seen elsewhere but is worth repeating, because so many seemingly different fields require the same thing. Certainly writing does, and one sees too many people with the passion or the virtuosity but not both.

Another sample:

I do write a great deal about technology. That became my theme as an artist. The human reaction to technological change—nothing interests me more. I want and need to know all about it. I want to plumb its every aspect. I even want to find new words for aspects of it that haven’t as yet been described.

I would guess that artists, especially of narrative arts, are going to have to pay steadily more attention to technology: it informs too many lives too much to ignore, and people have as many disparate responses “to technological change” as they do to love.

The book itself—Shaping Things—is interesting without being captivating. It needs more examples and case studies, and fewer grand pronouncements; it resembles a lot of literary theory in this way. If you get a physical copy, you’ll also find terrible design, with all kinds of doodads, weird fonts, random backgrounds, and so forth, all of which distract from readability in the name of being weird (those capitalizations in the blockquote above are in the text). It’s a kind of anti-Apple product.

The book’s design is distinctive, but distinctive is not automatically good, and as a mechanism for transferring ideas via text Shaping Things isn’t optimal because of those distractions. Nonetheless, the idea density is high, and I’m going to keep my copy, at least for the time being. Like Sterling, I’ve become steadily more interested in design and what design says about people and culture. I’m not sure how that’ll work into my fiction, but long-simmering ideas and interests tend to emerge in unpredictable ways. For example: I’ve thought about a novel in which a camera shows an emotionally stunted photographer—one along Conrad and Houellebecq lines, who thinks in the language of photography itself—what the photographer takes to be the future. Or is it? Photographers have a rich array of metaphors to draw on, and they have to be attuned to light, shapes, and the interplay of things and colors. Cameras themselves are technologies, and in the last 15 years they’ve become computers, with rapid advancements from year to year and all of the technolust that implies.

I don’t know where this idea might go, or if it will go at all, but I’ve been mulling it for a long time. A character like the one or ones I’m imagining would be reacting to technological change. I won’t say “nothing interests me more,” as Sterling does, but human reaction to technology is certainly up there, as I increasingly think it has to be, for people in virtually any field, if one wants any real shot at understanding what’s going on.

But to many education experts, something is not adding up — here and across the country. In a nutshell: schools are spending billions on technology, even as they cut budgets and lay off teachers, with little proof that this approach is improving basic learning.

This conundrum calls into question one of the most significant contemporary educational movements.

There is no silver bullet and no New Jesus for education. There never will be, but the search goes on, because it’s easier to search for the magic methodology that will solve seemingly intractable problems than it is to admit the thing Google, Facebook, and Microsoft have realized about software engineers: talent, motivation, and tenacity vary greatly among individuals, and you can’t merely take someone who lacks all three, put them through some kind of system or give them some kind of tool, and expect everyone to be equally good on the other side. That kind of thing works passably well if you’re building widgets on an assembly line, but it works terribly for any kind of creative or intellectual work.

Nonetheless, it’s much easier to search for a magic methodology to improve very old skills that are surprisingly resistant to methodology: reading, writing, and math. None of those fundamental skills has changed much in the last century. Yet we keep searching for a formula that doesn’t exist, because teaching and learning are inherently hard, like software engineering, math, writing, and any number of other complex but vital skills. Technologies—now we have special blackboards! They blink! They light up! They’re new!—might mask the essential difficulty of the task, but they can’t remove it, much as some programming IDEs try to hide some of the essential difficulty of coding. But fields that are essentially difficult can’t be mastered by sleight of hand or whizzy new gadgets.

They can only be mastered by people who are dedicated to the craft and to continuous self-improvement. The people who, because they believe they can, can. The ones whose tenacity is boundless and who aren’t willing to blame external circumstances.

You need a weird set of skills to teach effectively: you need to empathize with your students—to be able to see from their point of view—without becoming mired in their point of view. You need to master your field, but not in such a way that you lose the beginner’s mindset required to master the field in the first place. You need the stoic’s attitude of realizing you can’t control everything while still having the achiever’s mindset that you must strive to do the best you can, no matter what. You need to be willing to try new things and ideas while not leaving behind the old ones that work. You need to remember that not everyone is interested in the things you’re interested in, and you need to do whatever it takes to make subjects more interesting than they would be otherwise. You need to find that profitable zone of challenge for most students—something hard enough to make them struggle but not so hard that it’s impossible to accomplish. It’s reasonable to expect college freshmen to be able to read a story or article on their own, but it’s not reasonable to expect them to pick up and digest every nuance on their own. Some will. Most won’t. You need to be enthusiastic, because enthusiasm is as contagious as boredom, but your job isn’t to be a cheerleader, and enthusiasm can’t substitute for knowledge. You need, in other words, a bunch of paradoxical traits that balance each other.

You also need to realize that students need things broken down in steps, and need to learn by example and through discussion. Last week I taught Neal Stephenson’s 2005 New York Times opinion piece, “Turn On, Tune In, Veg Out.” Whenever I do, I let the class talk for a while at the beginning; when discussion dies, I ask students to do a simple activity: write the essay’s main point in a sentence or two. Then I come around and look at the sentences.

It should be simple, right? Read the piece, find the main point. But it’s not simple. It’s actually quite hard, and most people are bad readers (myself included). When I go around and look at sentences, lots of students get caught up on the distinction between geeking and vegging out. Others think the piece is primarily about Star Wars. Only a few—usually around five of fifty—get the essential elements of the main point.

Stephenson basically says, twice, that he’s using Star Wars as a metaphor: once in the third paragraph—“Twenty-eight years later, the vast corpus of ‘Star Wars’ movies, novels, games and merchandise still has much to say about geeks – and also about a society that loves them, hates them and depends upon them”—and once more in the last paragraph: “If the ‘Star Wars’ movies are remembered a century from now, it’ll be because they are such exact parables for this state of affairs” (emphasis added). But most students haven’t learned to think metaphorically, as writers do. Metaphor is one of those essential ways of thinking that people need to be effective writers. In On Writing, Stephen King says:

The use of simile and other figurative language is one of the chief delights of fiction—reading it and writing it, as well. When it’s on target, a simile delights us in much the same way meeting an old friend in a crowd of strangers does. By comparing two seemingly unrelated objects—a restaurant bar and a cave, a mirror and a mirage—we are sometimes able to see an old thing in a new and vivid way. Even if the result is mere clarity instead of beauty, I think writer and reader are participating together in a kind of miracle. Maybe that’s drawing it a little strong, but yeah—it’s what I believe.

In How Fiction Works, James Wood says:

“Metaphor is analogous to fiction, because it floats a rival reality. It is the entire imaginative process in one move. If I compare the slates on a roof to an armadillo’s back, or – as I did earlier – the bald patch on the top of my head to a crop circle (or on very bad days, to the kind of flattened ring of grass that a helicopter’s blades make when it lands in a field), I am asking you to do what Conrad said fiction should make you do – see. I am asking you to imagine another dimension, to picture likeness. Every metaphor or simile is a little explosion of fiction within the larger fiction of the novel or story.”

Again: that’s hard. And technology isn’t going to make it any easier to start thinking in metaphors, which is probably a precursor to writing in a way that uses metaphor deftly. Before you can do that, you probably need to recognize when other writers are using metaphor—and yet, though Stephenson says twice that he is, most students don’t pick up on it. This isn’t to blame them, by the way—a lot of my graduate seminars are still about what the writer actually says. Some of you are probably getting caught up in this discussion of metaphor and thinking that I’m really writing about how important it is for students to learn it, when this is only a subsidiary point supporting my main point about the place of technology in classrooms. Here’s Wood again on the subject of learning to read:

You only have to teach literature to realise that most young readers are poor noticers. I know from my own old books, wantonly annotated twenty years ago when I was a student, that I routinely underlined for approval details and images and metaphors that strike me now as commonplace, while serenely missing things which now seem wonderful. We grow, as readers, and twenty-year-olds are relative virgins.

If he wasn’t a good noticer at 20, what hope is there for the rest of us? And how is having a laptop going to help someone become a better noticer? Consider too one other thing to notice: in “Turn On, Tune In, Veg Out,” Stephenson isn’t using any complex or convoluted vocabulary. His sentence structure isn’t very complex; there aren’t lots of nasty nested clauses you have to mentally sort out to figure out what’s being talked about, as there often are in abstruse literary theory and philosophy. His piece isn’t hard to read. But it’s still evidently hard for many freshmen to understand. So I spend a lot of time working towards understanding, towards reading for detail, towards asking, “Where do you see that?” Technology isn’t going to help that process very much. It may even hurt it by offering a proliferating number of distractions: if you interrupt your reading of “Turn On, Tune In, Veg Out” four times for text messages and once for an e-mail, are you going to remember by the end how Stephenson said “Twenty-eight years later, the vast corpus of ‘Star Wars’ movies, novels, games and merchandise still has much to say about geeks – and also about a society that loves them, hates them and depends upon them”?

I’m teaching honors students, which is easier than teaching standard classes, which is in turn easier than teaching in tough inner-city schools. So I don’t face the same challenges as some of the teachers mentioned in the NYT article. But sometimes I think about Daniel Singal’s Atlantic article, “The Other Crisis in American Education: A college professor looks at the forgotten victims of our mediocre educational system–the potentially high achievers whose SAT scores have fallen, and who read less, understand less of what they read, and know less than the top students of a generation ago.” As the subtitle implies, he argues that the best students aren’t as challenged as they once were. I can’t tell if he’s right or if he’s hearkening back to a mythical golden age, but I do think about his work sometimes when I see what’s going on around me: other grad students and professors want to watch movies in class, or they aren’t even focused on imparting and enhancing basic reading and writing skills—the same ones pretty much everyone needs. Are the strongest students really getting something out of their classes? Is the technology really helping? If not, could it be part of what’s actually causing “our mediocre educational system?” I’m not saying it does, but I am saying it’s worth pondering.

Still, I think the strongest thinkers and learners—the ones who are now running Google and Facebook, the ones who are now partners in law firms and building their own businesses—are doing fine. Better than ever, maybe. Generation X was supposed to be the slacker generation, but its members built large blocks of the Internet—the same set of technologies you’re almost certainly using to read this. But I wonder if there’s not a growing bifurcation between the people who are doing really well and the ones who aren’t. In income terms, that’s certainly true, but I wonder if it’s happening in intellectual terms too. Stephenson thinks so: “Nothing is more seductive than to think that we, like the Jedi, could be masters of the most advanced technologies while living simple lives: to have a geek standard of living and spend our copious leisure time vegging out.” But the people who spend all that time vegging out aren’t going to create whatever the next iPod and Facebook will be. And they won’t reap those rewards, either. They’re the ones who might be “learning less,” as Singal has it. The people who make the next iPod and Facebook will be the ones who are focused on “geeking out” over important topics. The ones who will, I hope, have teachers—whether in honors classes or not—who are focused on the essential questions that imparting knowledge involves.

By the way, I’m not trying to beat up on college freshmen—if I were, I wouldn’t have the empathy necessary to be good. A lot of college seniors are little better than my freshmen, which I found out by working for Steven Klein at the Steven Klein LSAT Company. The LSAT is mostly a test of reading. If you can read effectively, you’ll do pretty well. But a lot of 22-to-24-year-old college graduates had a lot of trouble with reading comprehension because they couldn’t, or hadn’t been trained to, look at every word, evaluate it in relation to other words and to the context of the passage, and understand what it means. I think back to those experiences when I read books like Richard Arum and Josipa Roksa’s Academically Adrift: Limited Learning on College Campuses or articles like the one about cool whizzy tech stuff in classrooms. The whizzy tech stuff isn’t going to help readers when they’re facing the LSAT.

A second “by the way” is in order: I’m neither trying to denigrate technology nor be a Luddite—I say so as a guy typing on a fancy keyboard, sitting in an ergonomic chair, facing a 27″ iMac with a bunch of TextMate windows open. Computers make the mechanical process of writing easier, so that the hard stuff—the stuff that goes on in the mind—can dominate. Technology is great—in its place. The University of Arizona has computers and projectors and other neat stuff in many classrooms, and if that neat stuff is available, I use it.

But technology complements other skills; it doesn’t substitute for them. You can only use computers effectively to the extent you can read, write, and do simple math effectively—try programming without algebra. Or try to extract information from man pages without strong reading comprehension skills; hell, I like to imagine myself as at least moderately literate, and I find some of them tough. So this is not one of those tedious essays in which Old Man Withers shakes his cane and complains about the kids with those damn beeping gizmos, sending those darned pictures of each other’s privates around, and get off my damn lawn. Plus, I’m too young to shake my cane; I ran a modest but real number of miles yesterday. Even when I do have a cane someday, I hope 1) that it has a hidden sword, because that kind of thing is cool, and 2) that I haven’t ossified to the point where I’m not willing to learn new things.

But this is an essay that points out how basic skills and the means of imparting those basic skills haven’t changed so much, as Amanda Ripley’s Atlantic article, “What Makes a Great Teacher?” makes clear in its discussion of what great teachers do:

First, great teachers tended to set big goals for their students. They were also perpetually looking for ways to improve their effectiveness. [. . .] Superstar teachers had four other tendencies in common: they avidly recruited students and their families into the process; they maintained focus, ensuring that everything they did contributed to student learning; they planned exhaustively and purposefully—for the next day or the year ahead—by working backward from the desired outcome; and they worked relentlessly, refusing to surrender to the combined menaces of poverty, bureaucracy, and budgetary shortfalls.

Notice the thing absent from this list: use computers, iPads, and so forth. Sure, those great teachers could use technology, but they don’t need to. And the technology is not going to automatically make an indifferent teacher set big goals or recruit families or maintain focus or plan. Used poorly, it’s just going to provide some flash and pizazz and some distractions. Check out this Marginal Revolution discussion of a study looking at how introducing computers in poor households actually decreased student grades because students spent more time playing games on them than doing homework:

Not surprisingly, with all that game playing going on, the authors find that the voucher program actually resulted in a decline in grades although there was also some evidence for an increase in computer proficiency and perhaps some improvement in a cognitive test.

See also “Computers at Home: Educational Hope vs. Teenage Reality:” “Students posted significantly lower math test scores after the first broadband service provider showed up in their neighborhood, and significantly lower reading scores as well when the number of broadband providers passed four.” These reports should give technology cheerleaders pause: you aren’t going to get better results simply by lashing a computer on a teacher’s back and telling him to use it.*

To be a good teacher, you still need that weird skill- and mindset mentioned above. If you don’t have it or aren’t willing to develop it, I doubt anything else imposed on an individual teacher from the outside, like mandates to use technology, are going to do much for that teacher or for his or her students. If you want to really improve teaching, you’ll need to take an approach similar to the one Facebook and Google take to hiring hackers, which means a relentless focus not on degrees that offer dubious value in predicting achievement but on finding the best people and making sure they stay. Finding the best teachers is different from finding programmers—you probably can’t tell who’s going to be a good teacher before they hit the classroom—but you can at least acknowledge that you’re not going to get people who are good merely by saying, “use iPads in the classroom.” Steve Jobs and Bill Gates didn’t have iPads in their classrooms growing up, and maybe that’s part of what made Jobs able to have the vision necessary to Turn On, Geek Out, and make the iPad.

* I had a computer in middle and early high school that I used to master Starcraft and various other computer games, until I somehow realized I was wasting my life and smashed my Starcraft disks in the driveway. I sometimes use this analogy when I explain the situation to friends: some people can handle snorting the occasional line of coke without getting addicted; it’s just a fun way of spending a Saturday night. Some people can handle computer games in the same way. I discovered, at the time, that I’m not one of them, and, worse, I’ll never get those three or so wasted years back. Now I tend to find video games boring on average and can’t play for longer than half an hour to an hour at a stretch, while I’ve trained myself up to being able to write effectively for three to six hours at a time. The first draft of this essay, for example, took me about two hours.

But to many education experts, something is not adding up — here and across the country. In a nutshell: schools are spending billions on technology, even as they cut budgets and lay off teachers, with little proof that this approach is improving basic learning.

This conundrum calls into question one of the most significant contemporary educational movements.

There is no silver bullet and no New Jesus for education. There never will be, but the search goes on, because it’s easier to search for the magic methodology that will solve seemingly intractable problems than it is to admit the thing Google, Facebook, and Microsoft have realized about software engineers: talent, motivation, and tenacity vary greatly among individuals, and you can’t merely take someone who lacks all three, put them through some kind of system or give them some kind of tool, and expect everyone to be equally good on the other side. That kind of thing works passably well if you’re building widgets on an assembly line, but it works terribly for any kind of creative or intellectual work.

Nonetheless, it’s much easier to search for that magic methodology than to improve very old skills that are surprisingly resistant to methodology: reading, writing, and math. None of those fundamental skills has changed much in the last century. Yet we keep searching for that formula that doesn’t exist, because teaching and learning are inherently hard, like software engineering, math, writing, and any number of other complex but vital skills. Technologies—now we have special blackboards! They blink! They light up! They’re new!—might mask the essential difficulty of the task, but they can’t remove it, much as some programming IDEs try to hide some of the essential difficulty of coding. But fields that are essentially difficult can’t be mastered by sleight-of-hand or new whizzy gadgets.

They can only be mastered by people who are dedicated to the craft and to continuous self-improvement. The people who, because they believe they can, can. The ones whose tenacity is boundless and who aren’t willing to blame external circumstances.

You need a weird set of skills to teach effectively: you need to empathize with your students—to be able to see from their point of view—without becoming mired in their point of view. You need to master your field, but not in such a way that you lose the beginner’s mindset required to master the field in the first place. You need the stoic’s attitude of realizing you can’t control everything while still having the achiever’s mindset that you must strive to do the best you can, no matter what. You need to be willing to try new things and ideas while not leaving behind the old ones that work. You need to remember that not everyone is interested in the things you’re interested in, and you need to do whatever it takes to make subjects more interesting than they would be otherwise. You need to find that profitable zone of challenge for most students: something hard enough to make them struggle but not so hard that it’s impossible to accomplish. It’s reasonable to expect college freshmen to be able to read a story or article on their own, but it’s not reasonable to expect them to pick up and digest every nuance on their own. Some will. Most won’t. You need to be enthusiastic, because enthusiasm is as contagious as boredom, but your job isn’t to be a cheerleader, and enthusiasm can’t substitute for knowledge. You need, in other words, a bunch of paradoxical traits that balance each other.

You also need to realize that students need things broken down in steps, and need to learn by example and through discussion. Last week I taught Neal Stephenson’s 2005 New York Times opinion piece, “Turn On, Tune In, Veg Out.” Whenever I do, I let the class talk for a while at the beginning; when discussion dies, I ask students to do a simple activity: write the essay’s main point in a sentence or two. Then I come around and look at the sentences.

It should be simple, right? Read the piece, find the main point. But it’s not simple. It’s actually quite hard, and most people are bad readers (myself included). When I go around and look at sentences, lots of students get caught up on the distinction between geeking and vegging out. Others think the piece is primarily about Star Wars. Only a few—usually around five of fifty—get the essential elements of the main point.

Stephenson basically says, twice, that he’s using Star Wars as a metaphor: once in the third paragraph: “Twenty-eight years later, the vast corpus of “Star Wars” movies, novels, games and merchandise still has much to say about geeks – and also about a society that loves them, hates them and depends upon them” and once more in the last paragraph: “If the “Star Wars” movies are remembered a century from now, it’ll be because they are such exact parables for this state of affairs” (emphasis added). But most students haven’t learned how to think metaphorically, as writers do. Metaphor is one of those essential ways of thinking that people need to be effective writers. In On Writing Stephen King says:

The use of simile and other figurative language is one of the chief delights of fiction—reading it and writing it, as well. When it’s on target, a simile delights us in much the same way meeting an old friend in a crowd of strangers does. By comparing two seemingly unrelated objects—a restaurant bar and a cave, a mirror and a mirage—we are sometimes able to see an old thing in a new and vivid way. Even if the result is mere clarity instead of beauty, I think writer and reader are participating together in a kind of miracle. Maybe that’s drawing it a little strong, but yeah—it’s what I believe.

In How Fiction Works, James Wood says:

Metaphor is analogous to fiction, because it floats a rival reality. It is the entire imaginative process in one move. If I compare the slates on a roof to an armadillo’s back, or – as I did earlier – the bald patch on the top of my head to a crop circle (or on very bad days, to the kind of flattened ring of grass that a helicopter’s blades make when it lands in a field), I am asking you to do what Conrad said fiction should make you do – see. I am asking you to imagine another dimension, to picture likeness. Every metaphor or simile is a little explosion of fiction within the larger fiction of the novel or story.

Again: that’s hard. And technology isn’t going to make it any easier to start thinking about metaphors, which is probably a precursor to writing in a way that uses metaphor deftly. Before you can do that, you’re probably going to need to recognize when other writers are doing it, and yet, though Stephenson says twice that he is, most students don’t pick up on it. This isn’t to blame them, by the way—a lot of my graduate seminars are still about what the writer actually says. Some of you are probably getting caught up in this discussion of metaphor and thinking that I’m really writing about how important it is for students to learn it, when this is only a subsidiary point supporting my main point about the place of technology in classrooms. Here’s Wood again on the subject of learning to read:

You only have to teach literature to realise that most young readers are poor noticers. I know from my own old books, wantonly annotated twenty years ago when I was a student, that I routinely underlined for approval details and images and metaphors that strike me now as commonplace, while serenely missing things which now seem wonderful. We grow, as readers, and twenty-year-olds are relative virgins.

If he wasn’t a good noticer at 20, what hope is there for the rest of us? And how is having a laptop going to help someone become a better noticer? Consider too one other thing to notice: in “Turn On, Tune In, Veg Out,” Stephenson isn’t using any complex or convoluted vocabulary. His sentence structure isn’t very complex; there aren’t lots of nasty nested clauses you have to mentally sort out to figure out what’s being talked about, as there often are in abstruse literary theory and philosophy. His piece isn’t hard to read. But it’s still evidently hard for many freshmen to understand. So I spend a lot of time working towards understanding, towards reading for detail, towards asking, “Where do you see that?” Technology isn’t going to help that process very much. It may even hurt it by offering a proliferating number of distractions: if you interrupt your reading of “Turn On, Tune In, Veg Out” four times for text messages and once for an e-mail, are you going to remember how Stephenson said “Twenty-eight years later, the vast corpus of ‘Star Wars’ movies, novels, games and merchandise still has much to say about geeks – and also about a society that loves them, hates them and depends upon them” by the end?

I’m teaching honors students, which is easier than teaching standard classes, which is in turn easier than teaching in tough inner-city schools. So I don’t face the same challenges as some of the teachers mentioned in the NYT article. But sometimes I think about Daniel Singal’s Atlantic article, “The Other Crisis in American Education: A college professor looks at the forgotten victims of our mediocre educational system–the potentially high achievers whose SAT scores have fallen, and who read less, understand less of what they read, and know less than the top students of a generation ago.” As the subtitle implies, he argues that the best students aren’t as challenged as they once were. I can’t tell if he’s right or if he’s hearkening back to a mythical golden age, but I do think about his work sometimes when I see what’s going on around me: other grad students and professors want to watch movies in class, or they aren’t even focused on imparting and enhancing basic reading and writing skills—the same ones pretty much everyone needs. Are the strongest students really getting something out of their classes? Is the technology really helping? If not, could it be part of what’s actually causing “our mediocre educational system”? I’m not saying it does, but I am saying it’s worth pondering.

Still, I think the strongest thinkers and learners—the ones who are now running Google and Facebook, the ones who are now partners in law firms and building their own businesses—are doing fine. Better than ever, maybe. Generation X was supposed to be the slacker generation, but its members built large blocks of the Internet—the same set of technologies you’re almost certainly using to read this. But I wonder if there’s not a growing bifurcation between the people who are doing really well and the ones who aren’t. In income terms, that’s certainly true, but I wonder if it’s happening in intellectual terms too. Stephenson thinks so: “Nothing is more seductive than to think that we, like the Jedi, could be masters of the most advanced technologies while living simple lives: to have a geek standard of living and spend our copious leisure time vegging out.” But the people who spend all that time vegging out aren’t going to create whatever the next iPod and Facebook will be. And they won’t reap those rewards, either. They’re the ones who might be “learning less,” as Singal has it. The people who make the next iPod and Facebook will be the ones who are focused on “geeking out” regarding important topics. The ones who will, I hope, have teachers—whether in honors or not—who are focused on the essential questions that imparting knowledge involves.

By the way, I’m not trying to beat up college freshmen—if I were, I wouldn’t have the empathy necessary to be good. A lot of college seniors are little better than my freshmen, which I found out by working for Steven Klein at the Steven Klein LSAT Company. The LSAT is mostly a test of reading. If you can read effectively, you’ll do pretty well. But many 22- to 24-year-old college graduates had a lot of trouble with reading comprehension because they couldn’t or hadn’t been trained to look at every word, evaluate it in relation to other words and in relation to the context of the passage, and understand what it means. I think back to those experiences when I read books like Richard Arum and Josipa Roksa’s Academically Adrift: Limited Learning on College Campuses or articles like the one about cool whizzy tech stuff in classrooms. The whizzy tech stuff isn’t going to help readers when they’re facing the LSAT.

A second “by the way” is in order: I’m neither trying to denigrate technology nor be a luddite—I say so as the guy typing on a fancy keyboard, ergonomic chair, and 27″ iMac, with a bunch of Textmate windows open. Computers make the mechanical process of writing easier, so that the hard stuff—the stuff that goes on in the mind—can dominate. Technology is great—in its place. The University of Arizona has computers and projectors and other neat stuff in many classrooms, and if that neat stuff is available, I use it.

But technology complements other skills; it doesn’t substitute for them. You can only use computers effectively to the extent you can read, write, and do simple math effectively—try programming without algebra. Or try to extract information from man pages without strong reading comprehension skills; hell, I like to imagine myself as being at least moderately literate, and I find some of them tough. So this is not one of those tedious essays in which Old Man Withers shakes his cane and complains about the kids with those damn beeping gizmos, sending those darned pictures of each other’s privates around, and get off my damn lawn. Plus, I’m too young to shake my cane; I ran a modest but real number of miles yesterday. Even when I do have a cane someday, I hope 1) that it has a hidden sword, because that kind of thing is cool, and 2) that I haven’t ossified to the point where I’m not willing to learn new things.

But this is an essay that points out how basic skills and the means of imparting those basic skills haven’t changed so much, as Amanda Ripley’s Atlantic article, “What Makes a Great Teacher?” makes clear in its discussion of what great teachers do:

First, great teachers tended to set big goals for their students. They were also perpetually looking for ways to improve their effectiveness. [. . .] Superstar teachers had four other tendencies in common: they avidly recruited students and their families into the process; they maintained focus, ensuring that everything they did contributed to student learning; they planned exhaustively and purposefully—for the next day or the year ahead—by working backward from the desired outcome; and they worked relentlessly, refusing to surrender to the combined menaces of poverty, bureaucracy, and budgetary shortfalls.

Notice the thing absent from this list: use computers, iPads, and so forth. Sure, those great teachers could use technology, but they don’t need to. And the technology is not going to automatically make an indifferent teacher set big goals or recruit families or maintain focus or plan. Used poorly, it’s just going to provide some flash and pizazz and some distractions. Check out this Marginal Revolution discussion of a study looking at how introducing computers in poor households actually decreased student grades because students spent more time playing games on them than doing homework:

Not surprisingly, with all that game playing going on, the authors find that the voucher program actually resulted in a decline in grades although there was also some evidence for an increase in computer proficiency and perhaps some improvement in a cognitive test.

See also “Computers at Home: Educational Hope vs. Teenage Reality”: “Students posted significantly lower math test scores after the first broadband service provider showed up in their neighborhood, and significantly lower reading scores as well when the number of broadband providers passed four.” These reports should give technology cheerleaders pause: you aren’t going to get better results simply by lashing a computer on a teacher’s back and telling him to use it.*

To be a good teacher, you still need that weird skill set and mindset mentioned above. If you don’t have it or aren’t willing to develop it, I doubt anything else imposed on an individual teacher from the outside, like mandates to use technology, is going to do much for that teacher or for his or her students. If you want to really improve teaching, you’ll need to take an approach similar to the one Facebook and Google take to hiring hackers, which means a relentless focus not on degrees that offer dubious value in predicting achievement but on finding the best people and making sure they stay. Finding the best teachers is different from finding programmers—you probably can’t tell who’s going to be a good teacher before they hit the classroom—but you can at least acknowledge that you’re not going to get people who are good merely by saying, “use iPads in the classroom.” Steve Jobs and Bill Gates didn’t have iPads in their classrooms growing up, and maybe that’s part of what made Jobs able to have the vision necessary to Turn On, Geek Out, and make the iPad.

* I had a computer in middle and early high school that I used to master Starcraft and various other computer games, until I somehow realized I was wasting my life and smashed my Starcraft disks in the driveway. I sometimes use this analogy when I explain the situation to friends: some people can handle snorting the occasional line of coke without getting addicted; it’s just a fun way of spending a Saturday night. Some people can handle computer games in the same way. I discovered, at the time, that I’m not one of them, and, worse, I’ll never get those three or so wasted years back. Now I tend to find video games boring on average and can’t play for longer than half an hour to an hour at a stretch, while I’ve trained myself up to being able to write effectively for three to six hours at a time. The first draft of this essay, for example, took me about two hours.