Why Walter Isaacson Wanted to Make Alan Turing Famous

Patrice Gilbert

The recent film The Imitation Game stars Benedict Cumberbatch as Alan Turing, a British mathematical genius who helped the Allies win World War II by working to break the German Enigma code. After the war Turing was persecuted for his homosexuality, and subjected to cruel and degrading treatment that led him to take his own life. Last year Turing received a posthumous pardon from the Queen, and his legacy endures in such areas as mathematics, computer science, and artificial intelligence. One of his admirers is Walter Isaacson, whose new book The Innovators profiles Turing and other digital pioneers.

“One of the reasons I wrote this book is because I wanted to make people like Alan Turing famous,” Isaacson says in Episode 131 of the Geek’s Guide to the Galaxy podcast. “And now I must admit that Benedict Cumberbatch, by playing him, has done that a thousand times better than I ever could have.”

Isaacson is famous for his biographies of such figures as Benjamin Franklin, Albert Einstein, and Steve Jobs. But lately he’s come to feel that the biography format puts too much emphasis on individual personalities. The Innovators tries to show that great breakthroughs mostly come from team efforts, something The Imitation Game conveys very well.

“What the movie does show clearly is that Turing comes to the realization that you can’t do it alone, you’ve got to collaborate and be part of a team,” Isaacson says.

Isaacson hopes the film will inspire audiences to seek out more information about the real-life story of Turing, whether that means turning to The Innovators or to other works of nonfiction such as Alan Turing: The Enigma by Andrew Hodges.

“The movie does get to some real truths by taking literary license, but also the real story of Alan Turing is just a beautiful, heroic, and tragic story,” he says.

Listen to our complete interview with Walter Isaacson in Episode 131 of the Geek’s Guide to the Galaxy podcast (above), in which he discusses the work of Alan Turing and other digital pioneers, and check out some highlights from the discussion below.

Walter Isaacson on Ada Lovelace:
“She was Lord Byron’s daughter, and thus she was kind of poetic, but her mother was a mathematician, so she developed what she called ‘poetical science,’ and she loved looking at how punch cards were instructing the looms of industrial England in the 1830s to make beautiful patterns. She had a friend, Charles Babbage, who was making a numerical calculator, and she realized that with punch cards that calculator could do anything—art, music, words, as well as numbers. And so to me she’s a patron saint of the revolution. … So I think that women have been at the forefront of pioneering the art of programming, but they’ve been written out of history a bit, and they really haven’t had as much of a role since then as they should have. … My daughter first introduced me to the importance of Ada Lovelace, because she was 15 and a computer geek, and she said that the only woman computer programmer she’d ever heard of was Oracle in the Batman comics. And then she heard of Ada Lovelace, so she got excited, because she realized that real women could be programmers.”

Walter Isaacson on the creation of the Internet:

“When I was at Time magazine, we wrote the story that it was done to survive a nuclear attack, and we got a letter from Steve Crocker, who was in charge of what was called the ‘Request for Comments’—these were the ideas and rules and protocols for doing the Internet. And he sent us a letter saying, ‘No, that’s not why the Internet was created. It was created because we wanted to decentralize control over it.’ And Time magazine was very arrogant back in those days, so it sent a letter back to Steve Crocker saying, ‘No, we’re not going to print your letter, because we have better sources than you about why it was done.’ And I thought, ‘Well, that’s ridiculous.’ But when I was doing this book I still had the right to go back rummaging through the archives at Time magazine, and I tried to find out who was the better source—it turned out to have been Steve Lukasik, who had become the head of ARPANET, and Steve Lukasik said, indeed, that’s how he got the money from the colonels in the Pentagon, or Congress, by emphasizing it would survive a Russian attack. And he said, ‘You can tell Steve Crocker that he was on the bottom and I was on the top, so he didn’t really know what was happening.’ When I sat and had coffee with Steve Crocker, interviewing him for this book, I told him that, and he stroked his chin and said, ‘You can tell Steve Lukasik that I was on the bottom and he was on the top, so he didn’t know what was happening.’”

Walter Isaacson on Al Gore and the Internet:
“It got a little annoying after a while, because people would laugh and think, ‘Ha ha, what an original joke.’ And so I did do a bit on why Al Gore was important. When I was running digital management for Time magazine in the early 1990s, you could not as an average person go right onto the Internet. You could only go on the Internet if you were part of a university or a research group, something like that. And in 1992, Al Gore passes the Gore Act of 1992, which opens up the Internet so that anybody who can dial up with a modem and get to an online service like AOL or CompuServe or Prodigy, or just wants to dial up, can go directly onto the Internet. This transforms the digital revolution. It makes it not just a network of research centers, but it makes it into the Internet we have today. At that time, speaking of WIRED and Time magazine, Louis Rossetto and I were friends—he had founded WIRED—and we were both on AOL and CompuServe, these proprietary services. And it was in late 1993, I remember talking to him about, ‘Why don’t we go directly onto the Internet?’ Especially since the World Wide Web had been developed by Tim Berners-Lee, which made it easier to navigate to places on the Internet. And that was a big transforming thing that happens in 1992-1994 where the number of websites goes from zero to 10,000 in one year, and it’s largely because of the Gore Act of 1992, which opens up the Internet to the general public.”

Walter Isaacson on artificial intelligence:

“It always seems to be 20 years away. In fact, at the beginning of this year, if you just search it, you’ll find stories in the New York Times saying that neuromorphic chips are being developed that’ll mimic the human mind, and in 20 years we’ll have artificial intelligence. It always seems to be a bit of a mirage, and it always seems that things like Google or Wikipedia that combine human creativity with machine power always make greater advances than machine power alone does. … This is something that Garry Kasparov figures out when he gets beaten by the IBM machine Deep Blue. He decides to create a contest in which humans working with computers can play against either the best computer or the best human grandmaster. And in all of these contests, the combination of the human and machine—even if it’s amateur players working with laptop machines—tends to beat the grandmaster or the best computer. And this is a game—chess—which you have to remember is simply an algorithmic rule-driven game, so eventually computers should be able to crack that totally. On far more complicated things like ‘Should the NSA be allowed to eavesdrop?’ that’s a question I don’t think machines will ever be able to answer as well as a combination of machines and humans could.”