Posts Tagged ‘Irving Wallace’

Earlier this month, within the space of less than a day, two significant events occurred in the life of Donna Strickland, an assistant professor at the University of Waterloo. She won the Nobel Prize in Physics, and she finally got her own Wikipedia page. As the biologist and Wikipedia activist Dawn Bazely writes in an excellent opinion piece for the Washington Post:

The long delay was not for lack of trying. Last May, an editor had rejected a submitted entry on Strickland, saying the subject did not meet Wikipedia’s notability requirement. Strickland’s biography went up shortly after her award was announced. If you click on the “history” tab to view the page’s edits, you can replay the process of a woman scientist finally gaining widespread recognition, in real time.

And it isn’t an isolated problem, as Bazely points out: “According to the Wikimedia Foundation, as of 2016, only 17 percent of the reference project’s biographies were about women.” When Bazely asked some of her students to create articles on women in ecology or the sciences, she found that their efforts frequently ran headlong into Wikipedia’s editing culture: “Many of their contributions got reversed almost immediately, in what is known as a ‘drive-by deletion’…I made an entry for Kathy Martin, current president of the American Ornithological Society and a global authority on arctic and alpine grouse. Almost immediately after her page went live, a flag appeared over the top page: ‘Is this person notable enough?’”

Strickland’s case is an unusually glaring example, but it reflects a widespread issue that extends far beyond Wikipedia itself. In a blog post about the incident, Ed Erhart, a senior editorial associate at the Wikimedia Foundation, notes that the original article on Strickland was rejected by an editor who stated that it lacked “published, reliable, secondary sources that are independent of the subject.” But he also raises a good point about the guidelines used to establish academic notability: “Academics may be writing many of the sources volunteer Wikipedia editors use to verify the information on Wikipedia, but they are only infrequently the subject of those same sources. And when it does occur, they usually feature men from developed nations—not women or other under-represented groups.” Bazely makes a similar observation:

We live in a world where women’s accomplishments are routinely discounted and dismissed. This occurs at every point in the academic pipeline…Across disciplines, men cite their own research more often than women do. Men give twice as many academic talks as women—engagements which give scholars a chance to publicize their work, find collaborators and build their resumes for potential promotions and job offers. Female academics tend to get less credit than males for their work on a team. Outside of academia, news outlets quote more male voices than female ones—another key venue for proving “notability” among Wikipedia editors. These structural biases have a ripple effect on our crowdsourced encyclopedia.

And this leads to an undeniable feedback effect, in which the existing sources used to establish notability are used to create Wikipedia articles, which then serve as evidence of notability in the future.

Bazely argues that articles on male subjects don’t seem to be held to the same high standards as those for women, which reflects the implicit biases of its editors, the vast majority of whom are men. She’s right, but I also think that there’s a subtle historical element at play. Back during the wild west days of Wikipedia, when the community was still defining itself, the demographics of its most prolific editors were probably even less diverse than they are now. During those formative years, thousands of pages were generated under a looser set of standards, and much of that material has been grandfathered into the version that exists today. I should know, because I was a part of it. While I may not have been a member of the very first generation of Wikipedia editors—one of my friends still takes pride in the fact that he created the page for “knife”—I was there early enough to originate a number of articles that I thought were necessary. I created pages for such people as Darin Morgan and Julee Cruise, and when I realized that there wasn’t an entry for “mix tape,” I spent the better part of two days at work putting one together. By the standards of the time, I was diligent and conscientious, but very little of what I did would pass muster today. My citations were erratic, I included my own subjective commentary and evaluations along with verifiable facts, and I indulged in original research, which the site rightly discourages. Multiply this by a thousand, and you get a sense of the extent to which the foundations of Wikipedia were laid by exactly the kind of editor in his early twenties for whom writing a cultural history of the mix tape took priority over countless other deserving subjects. (It isn’t an accident that I had started thinking about mix tapes again because of Nick Hornby’s High Fidelity, which provides a scathing portrait of a certain personality type, not unlike my own, that I took for years at face value.)

And I don’t even think that I was wrong. Wikipedia is naturally skewed in favor of the enthusiasms of its users, and articles that are fun to research, write, and discuss will inevitably get more attention. But the appeal of a subject to a minority of active editors isn’t synonymous with notability, and it takes a conscious effort to correct the result, especially when it comes to the older strata of contributions. While much of what I wrote fifteen years ago has been removed or revised by other hands, a lot of it still persists, because it’s easier to monitor new edits than to systematically check pages that have been around for years. And it leaves behind a residue of the same kinds of unconscious assumptions that I’ve identified elsewhere in other forms of canonization. Wikipedia is part of our cultural background now, invisible and omnipresent, and we tend to take it for granted. (Like Google, it can be hard to research online because its name has become a synonym for information itself. Googling “Google,” or keywords associated with it, is a real headache, and looking for information about Wikipedia—as opposed to information presented in a Wikipedia article—presents many of the same challenges.) And nudging such a huge enterprise back on course, even by a few degrees, doesn’t happen by accident. One way is through the “edit-a-thons” that often occur on Ada Lovelace Day, which is named after the mathematician whose posthumous career incidentally illustrates how historical reputations can be shaped by whoever happens to be telling the story. We think of Lovelace, who worked with Charles Babbage on the analytical engine, as a feminist hero, but as recently as the early sixties, one writer could cite her as an example of genetic mediocrity: “Lord Byron’s surviving daughter, Ada, what did she produce in maturity? A system for betting on horse races that was a failure, and she died at thirty-six, shattered and deranged.” The writer was the popular novelist Irving Wallace, who is now deservedly forgotten. And the book was a bestseller about the Nobel Prize.

“You should only read what is truly good or what is frankly bad,” Gertrude Stein once told the young Ernest Hemingway. It was Paris in the early twenties, and Hemingway had just confessed that he had been reading Aldous Huxley, whom Stein contemptuously described as “a dead man.” (In fact, Huxley was still alive, and he would go on living for decades, surviving Hemingway himself by more than two years.) But it isn’t hard to guess what she meant by this. In his memoir A Moveable Feast, Hemingway recalls that he had been reading Huxley, D.H. Lawrence, and other writers “to keep my mind off writing sometimes after I had worked.” When Stein asked why he even bothered, his reply was a simple one: “I said that his books amused me and kept me from thinking.” And her response—that he should read only the truly good or frankly bad—strikes me as genuinely useful. On the one hand, we can’t subsist entirely on a diet of great books, and there are times when we justifiably read to avoid thinking, or to keep our minds off the possibility of writing for ourselves. Anything else would destroy us. On the other hand, the danger of reading what Stein called “inflated trash” is that we’ll lose the ability to distinguish between fake value and the real thing. When we don’t have the time or energy to fully engage with a book, it might be better to stick with something that we know is frankly bad, so we don’t waste time trying to make the distinction.

Personally, I’ve learned a lot from works of literature that occupy the middle ground between mediocrity and greatness, but I’ve also found myself unapologetically seeking out books that are frankly bad. They aren’t even great trash, as Pauline Kael might have put it, but trash of the most routine, ordinary kind. The most obvious example is my fascination with the novels of Arthur Hailey and Irving Wallace, two men who were among the bestselling writers of the sixties and seventies, only to be almost entirely forgotten now. Yet I keep reading them, and I can rarely resist picking up their books whenever I see one at a thrift store, which is where most of them seem to have ended up. (As I type this, I’m looking at the back cover of Wallace’s The Prize, which is described by its jacket copy as “one of the most compelling bestselling novels of all times.” As far as I can tell, it’s long out of print, along with all of Wallace’s other novels.) I particularly like them on long plane rides, when I’m too tired or distracted to focus on anything at all, and I can skim dozens of pages without any fear of missing anything important. On a recent trip to Europe, I carried so many of these books in my bag that it set off some kind of special alarm at security—the sensors evidently detected an unusual amount of “organic material,” in the form of yellowing mass market paperbacks. And when the security agent pulled out my flaking copies of The Prize and Hailey’s Overload, I felt like a confused time traveler with very bad taste.

This isn’t the place for a full consideration of either writer, but I feel obliged to share a few passages that might help to explain what they mean to me. Here’s my favorite line from Hailey’s Airport:

In the Cloud Captain’s Coffee Shop, Captain Vernon Demerest ordered tea for Gwen, black coffee for himself. Coffee—as it was supposed to do—helped keep him alert; he would probably down a dozen more cups between here and Rome.

As I’ve noted here before, another writer might have written, “He would probably down a dozen more cups between here and Rome,” trusting that the average reader would know that people sometimes drink coffee to stay awake. An author who wanted to be perfectly clear might have added, “Coffee helped keep him alert.” But only Hailey would have written “as it was supposed to do.” As for Wallace, take the moment in The Prize when a distinguished scientist contemplates cheating on her husband with a younger colleague:

Lindblom discoursed with nervous enthusiasm about the work in progress. His love for algae strains and soybean nodules and Rhodophyceae and Chlorella dinned on her eardrums…Trailing Lindblom, she peered at her watch. She had arrived at 11:05. It was now 11:55. The zero hour that she had set herself loomed close. The ultimate decision. Question One: Should she do it? There were two courses open: (a) mild flirtation, a holding of hands, an embrace, a kiss, romantic whispering, to be followed by similar meetings devoted to the same and no more; or (b) sexual intercourse.

That’s a big load of organic material. Yet it also wouldn’t be quite right to say that I’m reading these writers “ironically.” I view them totally without affection, and I don’t gain any cultural cachet by being seen with them on an airplane. You could even argue that I’m guilty of a weird reverse snobbism by reading books that aren’t beloved by anyone, but I prefer to think of it as a neat act of triangulation. The real risk of spending time with “frankly bad” books is that you’ll either dull your own taste or turn your default mode as a reader into one of easy condescension. I’ve found that Hailey and Wallace allow me to indulge my need for bad books in the least harmful way possible. Both authors are long dead, so their feelings can no longer be hurt. They were smart men who made enormous amounts of money by aiming squarely at the mainstream, and they clearly knew what they were doing. These weren’t cult books, but novels that millions of readers bought and promptly forgot. Neither left a devoted following, and they’ve dated so badly that they can barely be endured even as period pieces. But they’re still readable in their own way, and they can hardly be mistaken for anything except what they are. For all their attempts to inject sex and scandal into their Parade magazine view of the world, they’re the most complacent books imaginable, and I could even argue that they tell us something valuable about the complacency of their original readers. But that would be taking it too far. They amuse me and keep me from thinking—as they were supposed to do.

Note: I’m counting down ten books that have influenced the way that I think about the creative process, in order of the publication dates of their first editions. It’s a very personal list that reflects my own tastes and idiosyncrasies, and I’m always looking for new recommendations. You can find the earlier installments here.

For reasons that aren’t too hard to figure out, the most comprehensive accounts that we have of the creative process tend to focus on mediocre works of art. Since the quality of the result is out of anyone’s hands, you can’t expect such extensive documentation to coincide with the making of a masterpiece, and the artists who are pushing the boundaries of the medium are often too busy to keep good notes. (One possible exception is the bonus material for The Lord of the Rings, although you more typically end up with the endless hours of special features for The Hobbit.) This is why the most interesting book that I’ve ever seen about writing and publishing is The Writing of One Novel by Irving Wallace, which tells you more than you would ever want to know about his justly forgotten bestseller The Prize. It’s also why my single favorite book about filmmaking is Behind the Seen by Charles Koppelman, which centers on Walter Murch, an undeniable genius, and his editing of the film Cold Mountain. Even at the time, the movie found few passionate defenders, and watching the first half again recently didn’t change my mind. But the book that resulted from it is amazing. The critic David Thomson called it “probably the subtlest and most tender account of what a craftsman brings to a motion picture ever written,” but it’s also much more. From the moment that I first learned that it existed, I knew that I had to have it, and ever since, my copy—autographed by Murch himself—has occupied an unusual role in my writing life. It’s the book that I read whenever I need to revise a draft, get editorial feedback, or do anything else that frightens me as a writer. This is partially because I value Murch’s perspective, and because the craft of film editing has surprising affinities to what a writer does during the revision stage. Above all, however, it’s because this may be the most complete chronicle in existence of any act of creation whatsoever, from start to finish, and its wisdom is inseparable from its accumulation of ordinary detail over three hundred dense pages.

Behind the Seen is an unforgettable experience in itself, and I can’t recommend it highly enough. Yet it also contains detachable pieces of lore, advice, and insight that anyone can take to heart. There’s Koppelman’s discussion of the “little people,” the tiny paper silhouettes that Murch attaches to his television monitor to remind himself of the size of the movie screen. Or there’s Murch’s lovely analogy of “blinking the key,” in which a lesson drawn from lighting a set tells you what happens when you take away what seemed like an indispensable element. And then there’s this:

Murch also has his eye on what he calls the “thirty percent factor”—a rule of thumb he developed that deals with the relationship between the length of the film and the “core content” of the story. In general, thirty percent of a first assembly can be trimmed away without affecting the essential features of the script: all characters, action, story beats will be preserved and probably, like a good stew, enhanced by the reduction in bulk. But passing beyond the thirty percent barrier can usually be accomplished only by major structural alterations: the reduction or elimination of a character, or whole sequences—removing vital organs rather than trimming fat. “It can be done,” says Murch, “and I have done it on a number of films that turned out well in the end. But it is tricky, and the outcome is not guaranteed—like open-heart surgery. The patient is put at risk, and the further beyond thirty percent you go, the greater the risk.”

Perhaps best of all, there’s the shiny brass “B” that Murch hangs in his office. Koppelman explains: “Ask Walter about it, and he’ll tell you about aiming for a ‘B.’ Work hard to get the best grade you can—in this world, a B is all that is humanly attainable. One can be happy with that. Getting an A? That depends on good timing and the whims of the gods—it’s beyond your control. If you start to think that the gods are smiling, they will take their revenge. Keep your blade sharp. Make as good a film as you know how.”

As I write this post, my wife is about fifty pages away from finishing The Royal We, a novel that she devoured over the course of the last few days like a bottomless bag of popcorn. I’ve only glanced at the book, but I’ve been impressed by what little of it I’ve seen, starting with the title, which is the kind of clever play on words—while also telling you exactly what the story is about—that could sell a hundred thousand copies in itself. It’s about a college student who meets, falls in love with, and finally marries the Prince of Wales, and if the plot sounds a touch familiar, that’s precisely the point. The Royal We isn’t exactly about Kate Middleton: its protagonist is American, for one thing, and the story diverges from the facts of the most famous public courtship in recent memory in small but meaningful ways. But like Curtis Sittenfeld’s American Wife, another book my wife loved, it’s a novel that all but begs us to fill in the blanks. And although it’s clearly written with taste and skill, it’s also a marketer’s dream. At a time when publishers are struggling to create new brands, the equivalent of high-class celebrity fanfic is as good a way as any to catch a reader’s eye. (Sometimes it doesn’t even need to be especially high class: an erotic fan novel about Harry Styles of One Direction is being made into a movie as we speak.)

But what sets such recent books apart from prior efforts in the same line is how cheerfully they disclose their sources of inspiration. The roman à clef is as old, in one form or another, as the novel itself, but it really came into its own with the works of writers like Harold Robbins and Jacqueline Susann—”The giants,” as Spock calls them in The Voyage Home—whose novels were explicitly designed to encourage readers to put famous faces to lightly fictionalized names. As Dean Koontz said years ago in Writing Popular Fiction:

[A roman à clef is] a story in which all the characters seem to be allusions to real people—preferably quite famous people—and to real events the reader may have read of in newspapers and magazines; this establishes a celebrity guessing game among readers and reviewers that strengthens the illusion that you are telling of genuine events and, not incidentally, increases the book’s sales…In actuality, the [novel] bears only passing resemblance to the real lives of the personalities mentioned, but the reader likes to feel that he is getting the whole, ugly story firsthand.

And it’s worth noting how hard the novel, like a con artist “accidentally” displaying a briefcase full of cash to a mark, has to work to give the reader a winking nudge about how it should be read, while superficially acting as if it’s trying to keep a secret. The book needs to insist that names have been changed to protect the innocent, even as it makes its reference points obvious, and it demands a tricky balance. Too obscure, and we won’t make the connection at all; too transparent, and we’ll reject it as fantasy. (I’ll leave aside the example of Irving Wallace, one of Robbins and Susann’s contemporaries, who wasn’t above explicitly stating his sources in the text. In The Plot, a scandal involving a character clearly based on Christine Keeler is described as “ten times more exciting than the old Profumo affair,” while in The Fan Club, a pulpy novel about the kidnapping of a famous movie star, a character comes right out and says: “Picture Elizabeth Taylor or Marilyn Monroe or Brigitte Bardot lying in the next room naked.”) The Royal We and American Wife, although less coy, pull off much the same feat by selectively altering a few recognizable elements, as if industriously disguising their source material while implicitly keeping the spirit unchanged.

The result, if done correctly, offers an easy form of subtext, making the novel somewhat more interesting in ways that have little to do with craft. It’s a temptation to which I haven’t been entirely immune: City of Exiles includes a character so manifestly based on Garry Kasparov that I seriously considered just putting him in the story outright, as Frederick Forsyth did with everyone from Margaret Thatcher to Simon Wiesenthal. (If I chickened out in the end, it was mostly because I felt queasy about making the real Kasparov the target of an assassination attempt.) And it’s such a powerful trick that it gives pause to some novelists. In the afterword to Harlot’s Ghost, Norman Mailer writes:

In the course of putting together this attempt, there was many a choice to make on one’s approach to formal reality. The earliest and most serious decision was not to provide imaginary names for all the prominent people who entered the work. After all, that rejected approach would have left one with such barbarisms as James Fitzpatrick Fennerly, youngest man ever elected President of the United States.

Mailer goes on to note that if he’d given us, say, Howard Hunt under an assumed name, the reader would think: “This is obviously Howard Hunt. Now I’ll get to see what made him tick.” By giving us Hunt without a mask, the reader is free to say: “That isn’t my idea of Howard Hunt at all.” And that might even be the most honorable approach, even if it isn’t likely to thrill publishers, or their lawyers.

Note: To celebrate the third anniversary of this blog, I’ll be spending the week reposting some of my favorite pieces from early in its run. This post originally appeared, in a somewhat different form, on December 17, 2010.

As the New York Times recently pointed out, Google’s new online book database, which allows users to chart the evolving frequency of words and short phrases over 5.2 million digitized volumes, is a wonderful toy. You can look at the increasing frequency of George Carlin’s seven dirty words, for example—not surprisingly, they’ve all become a lot more common over the past few decades—or chart the depressing ascent of the word “alright.” Most seductively of all, perhaps, you can see at a glance how literary reputations have risen or fallen over time.

Take the five in the graph above, for instance. It’s hard not to see that, for all the talk of the death of Freud, he’s doing surprisingly well, and even passed Shakespeare in the mid-’70s (around the same time, perhaps not coincidentally, as Woody Allen’s creative peak). Goethe experienced a rapid fall in popularity in the mid-’30s, though he had recovered nicely by the end of World War II. Tolstoy, by contrast, saw a modest spike sometime around the Big Three conference in Tehran, and a drop as soon as the Soviet Union detonated its first atomic bomb. And Kafka, while less popular during the satisfied ’50s, saw a sudden surge in the paranoid decades thereafter:

Obviously, it’s possible to see patterns anywhere, and I’m not claiming that these graphs reflect real historical cause and effect. But it’s fun to think about. Even more fun is to look at the relative popularity of five leading American novelists of the last half of the twentieth century:

The most interesting graph is that for Norman Mailer, who experiences a huge ascent up to 1970, when his stature as a cultural icon was at its peak (just after his run for mayor of New York). Eventually, though, his graph—like those of Gore Vidal, John Updike, Philip Roth, and Saul Bellow—follows the trajectory that we’d suspect for that of an established, serious author: a long, gradual rise followed by a period of stability, as the author enters the official canon. Compare this to a graph of four best-selling novelists of the 1970s:

For Harold Robbins, Jacqueline Susann, Irving Wallace, and Arthur Hailey—and if you don’t recognize their names, ask your parents—we see a rapid rise in popularity followed by an equally rapid decline, which is what we might expect for authors who were once hugely popular but had no lasting value. And it’ll be interesting to see what this graph will look like in fifty years for, say, Stephenie Meyer or Dan Brown, and in which category someone like Jonathan Franzen or J.K. Rowling will appear. Only time, and Google, will tell.

Over the last week or so, I’ve been reading The Inner Game of Tennis by W. Timothy Gallwey, which might seem a little strange for someone who hasn’t even held a tennis racket since his sophomore year of high school. I stumbled across it courtesy of another unlikely source: the online memoir Fade In by Michael Piller, which describes the late author’s experiences while writing the screenplay for Star Trek: Insurrection. (As an aside, I’ve always been struck by the fact that it’s often seemingly mediocre works of art, not acknowledged masterpieces, that provide us with the most detailed accounts of the creative process. The most insightful case study I’ve seen on the writing and publication of a specific book is Irving Wallace’s The Writing of One Novel, about his potboiler The Prize, which, like Star Trek: Insurrection, doesn’t rank very highly on anyone’s all-time best list. In Piller’s case, the fact that the final product was ultimately forgettable reflects less on his script than on other, less controllable factors, and I suspect that his work might actually hold up better than more recent incarnations of the franchise.)

In any event, Piller’s book, which you can download here, is loaded with equal amounts of artistic insight and industry gossip, and I’d recommend it highly to anyone with even the slightest interest in how a script is written. I first read it several years ago, and on revisiting it recently, I came across Piller’s recommendation of The Inner Game of Tennis as a book that aspiring writers should read. He says:

In trying to counsel young writers, I actually tell them to read The Inner Game of Tennis to become familiar with the two selves. In the book, Gallwey suggests that within every player, there’s a Self 1 that seems to give instructions and make judgments (“Dammit, you idiot, keep your eye on the ball”) and another Self 2 that seems to perform the action. The book shows you ways to get Self 1 to give up control and trust Self 2 to perform successfully. It’s the difference between making it happen and letting it happen.

Piller goes on to suggest that writers might benefit from a similar approach while working on a novel or script, especially a first draft: instead of forcing the action into a particular direction, just let it happen as if you were watching the movie yourself.

I was intrigued enough by the description to pick up a copy of Gallwey’s book, and after reading it, I agree that it’s worth checking out. Tennis as a metaphor for creative activity isn’t that much more farfetched than any of the others I use on a regular basis—writing as design, as architecture, as a game—and it’s true that a large part of finishing a draft lies in silencing the critical Self 1. I was also struck by something that Gallwey says in the chapter titled “Master Tips.” He writes:

Master tips refers to certain key elements of a stroke which, if done properly, tend to cause many other elements to be done properly. By discovering the groove of these key elements of behavior there is little need to concern yourself with scores of secondary details…

Before beginning, let me simplify the external problem facing the tennis player. He faces only two requirements for winning any given point: each ball must be hit over the net and into his opponent’s court. The sole aim of stroke technique is to fulfill these two requirements with consistency and with enough pace and accuracy to keep pressure on one’s opponent.

Writing, I’ve found, works much the same way. If the external problem in tennis is to hit the ball over the net into the opponent’s court, the problem in writing lies in sustaining the reader’s interest, in what John Gardner calls “the vivid and continuous fictional dream.” Any writing tip or rule I’ve shared here is useful only to the extent that it furthers that goal, and, as in tennis, a few master tips often get you most of the way there. In both cases, however, it can be a mistake to consciously focus on the rules. Gallwey points out that once players start worrying about form, they tend to stiffen up, and the best way to avoid this is to observe your actions without judgment, focusing on keeping the result natural and relaxed. Similarly, I follow a lot of personal rules for writing fiction, but in practice, I try not to think about them when I’m working on a rough draft or finding a shape for a story. It’s part intuition, part experience, and when I do consciously invoke the rules, it’s only if I notice that the result is diverging from the intention. When it works, as in Gallwey’s ideal tennis game, it doesn’t feel as if I deserve any credit. The serve seems to serve itself, just as the story tells itself. And the best thing a player can do is keep from getting in the way.

For a certain kind of novelist, there’s an enormous temptation to base one’s characters on recognizable people, and many stories gain nearly all of their interest from the perception that they’re thinly veiled depictions of real public figures. As Dean Koontz points out in his dated but valuable book Writing Popular Fiction, works by the likes of Harold Robbins or Jacqueline Susann are compelling largely because we think we can guess who these rich, glamorous, oversexed characters are supposed to be, and we’re more likely to take the author’s portrait at face value precisely because the names have been changed: the novel implicitly promises to tell it like it is, without fear of libel, at least for readers who are clever enough to fit names to faces. Irving Wallace went even further, spelling out his sources in the text itself—and often on the back cover copy. As I’ve mentioned before, in a novel like The Plot, Wallace isn’t simply content to create a character based on Christine Keeler, but blandly tells us that her scandal was “ten times more exciting than the old Profumo affair.”

While this can be an effective fictional device, a lot of novelists resist it, and for good reason. Norman Mailer, in his afterword to Harlot’s Ghost, explains that his decision to incorporate real people into the narrative using their proper names arose from a desire to avoid this kind of phony authenticity:

It was obvious, therefore, that one would have to give Jack Kennedy his honest name…One could only strip him of his fictional magic by putting a false name on him; then the reader’s perception becomes no more than, “Oh, yes, President Fennerly is Jack Kennedy—now I will get to learn what made Jack Kennedy tick.”

As a result, Mailer uses the actual names of important characters like Howard Hunt, Allen Dulles, and Bill Harvey, knowing that the reader will naturally be more critical of how these men are portrayed, thinking, “That isn’t my idea of Howard Hunt at all.” And it’s also likely that Mailer, in writing what amounts to an epic spy novel, was encouraged by the conventions of suspense fiction, in which real names are often used to give the action an air of verisimilitude. Frederick Forsyth, for example, populates his books with such historical figures as Kim Philby and Simon Wiesenthal, many of whom were still alive when these novels were written, allowing him to blur the line between fiction and reportage—which is a large part of his work’s appeal.

In The Icon Thief and its sequels, I’m operating in a similar mode, and I’ve occasionally run into the problem of whether or not to use the real names of living people. (I’m much less concerned about historical figures, whom I tend to name freely, even as I indulge in other forms of speculation or invention.) President Putin never appears directly in these books, but he’s frequently mentioned, and I decided long ago that it would be absurd to refer to him by any other name. I thought seriously about placing a real energy company at the center of the plot of City of Exiles, but I finally chickened out, reasoning that a fictional version would give me more narrative freedom in later installments. And for a long time, I considered making Garry Kasparov a major figure in the second novel. In the end, I didn’t, although there isn’t much doubt about which legendary chess grandmaster Victor Chigorin is supposed to represent. I changed the name partly to give me more flexibility in constructing the story, and also because I felt uncomfortable subjecting Kasparov to what ultimately happens to Chigorin, which isn’t pretty.

Besides, it’s usually more interesting when characters diverge from their original inspirations. I’ve mentioned before that Maddy and Ethan were loosely based on the real art world couple of Theresa Duncan and Jeremy Blake, although I doubt that many people would have made the connection. In Chapter 49, however, when we finally learn what happened to Anzor Archvadze—who has been missing in action for much of the novel’s second half—I imagine that more than a few readers were immediately reminded of Alexander Litvinenko. The two cases are very different, of course: Litvinenko died of radiation poisoning, while Archvadze is dying of toxic epidermal necrolysis, which bears a greater resemblance to another mysterious death in Russia. Still, I hope that readers do think of Litvinenko, not so much in order to capitalize on the parallels to a real event as out of a desire to remind them of how much like a novel the truth can be. Litvinenko’s death was often compared to something out of a spy thriller, but it was horribly real. And as farfetched as Archvadze’s fate might seem, reality is far stranger…