Posts Tagged ‘Marcel Proust’

In March, the graphic artist Susan Kare, who is best known for designing the fonts and icons for the original Apple Macintosh, was awarded a medal of recognition from the professional organization AIGA. It occurred to me to write a post about her work, but when I opened a gallery of her designs, I found myself sidetracked by an unexpected sensation. I felt happy. Looking at those familiar images—the Paintbrush, the Trash Can, even the Bomb—brought me as close as I’ve come in a long time to what Proust describes after taking a bite of the madeleine in the first volume of In Search of Lost Time:

Just as the Japanese amuse themselves by filling a porcelain bowl with water and steeping in it little crumbs of paper which until then are without character or form, but, the moment they become wet, stretch themselves and bend, take on color and distinctive shape, become flowers or houses or people, permanent and recognizable, so in that moment all the flowers in our garden…and the good folk of the village and their little dwellings and the parish church and the whole of Combray and of its surroundings, taking their proper shapes and growing solid, sprang into being, town and gardens alike, from my cup of tea.

In my case, it wasn’t a physical location that blossomed into existence, but a moment in my life that I’ve tried repeatedly to evoke here before. I was in my early teens, which isn’t a great period for anyone, and I can’t say that I was content. But for better or worse, I was becoming whatever I was supposed to be, and throughout much of that process, Kare’s icons provided the inescapable backdrop.

You could argue that nostalgia for computer hardware is a fairly recent phenomenon that will repeat itself in later generations, with children who are thirteen or younger today feeling equally sentimental toward devices that their parents regard with indifference—and you might be right. But I think that Kare’s work is genuinely special in at least two ways. One is that it’s a hallmark of perhaps the last time in history when a personal computer could feel like a beguiling toy, rather than an indispensable but utilitarian part of everyday life. The other is that her icons, with their handmade look and origins, bear the impression of another human being’s personality in ways that would all but disappear within a few years. As Alexandra Lange recounts in a recent profile of Kare:

In 1982, [Kare] was a sculptor and sometime curator when her high-school friend Andy Hertzfeld asked her to create graphics for a new computer that he was working on in California. Kare brought a Grid notebook to her job interview at Apple Computer. On its pages, she had sketched, in pink marker, a series of icons to represent the commands that Hertzfeld’s software would execute. Each square represented a pixel. A pointing finger meant “Paste.” A paintbrush symbolized “MacPaint.” Scissors said “Cut.” Kare told me about this origin moment: “As soon as I started work, Andy Hertzfeld wrote an icon editor and font editor so I could design images and letterforms using the Mac, not paper,” she said. “But I loved the puzzle-like nature of working in sixteen-by-sixteen and thirty-two-by-thirty-two pixel icon grids, and the marriage of craft and metaphor.”

That same icon editor, or one of its successors, was packaged with the Mac that I used, and I vividly remember clicking on that grid myself, shaping the building blocks of the interface in a way that seems hard to imagine now.

And Kare seems to have valued these aspects of her work even at the time. There’s a famous series of photos of her in a cubicle at Apple in 1984, leaning back in her chair with one New Balance sneaker propped against her desk, looking impossibly cool. In one of the pictures, if you zoom in on the shelf of books behind her, it’s possible to make out a few titles, including the first edition of Symbol Sourcebook by Henry Dreyfuss, with an introduction by none other than R. Buckminster Fuller. Kare has spoken highly of this book elsewhere, most notably in an interview with Alex Pang of Stanford, to whom she explained:

One of my favorite parts of the book is its list of hobo signals, that hobos used to contact each other when they were on the road. They look like they’re in chalk on stones…When you’re desperate for an idea—some icons, like the piece of paper, are no problem; but others defy the visual, like “undo”—you look at things like hobo signs. Like this: “Man with a gun lives here.” Now, I can’t say that anything in this book is exactly transported into the Macintosh interface, but I think I got a lot of help from this, just thinking. This kind of symbol appeals to me because it had to be really simple, and clear to a group of people who were not going to be studying these for years in academia. I don’t understand a lot of them—“These people are rich” is a top hat and a triangle—but I always had that at Apple. I still use it, and I’m grateful for it.

And it seems likely that this was the “symbol dictionary” in which Kare discovered the Bowen Knot, a symbol once used to indicate “interesting features” at Swedish campgrounds, which lives on as the Command icon on the Mac.

According to Kare, the Bowen Knot originally represented a castle with four turrets, and if you’re imaginative enough, you can imagine it springing into being from the keys to either side of the space bar, like the village from Proust’s teacup. Like the hobo signs, Kare’s icons are a system of signals left to those who might pass by in the future, and the fact that they’ve managed to survive at Apple in even a limited way is something of a miracle in itself. (As the tech journalist Mike Murphy recently wrote: “For whatever reason, Apple looks and acts far more like a luxury brand than a consumer-technology brand in 2018.” And there isn’t much room in that business for castles or hobo signs.) When you click through the emulated versions of the earliest models of the Macintosh on the Internet Archive, it can feel like a temporary return to those values, or like a visit to a Zen garden. Yet if we only try to recapture it, we miss the point. Toward the end of In Search of Lost Time, Proust experiences a second moment of revelation, when he stumbles in a courtyard and catches himself “on a flagstone lower than the one next it,” which reminds him of a similar sensation that he had once felt at the Baptistry of St. Mark in Venice. And what he says of this flash of insight reminds me of how I feel when I look at the Happy Mac, and all the possibilities that it once seemed to express:

As at the moment when I tasted the madeleine, all my apprehensions about the future, all my intellectual doubts, were dissipated. Those doubts which had assailed me just before, regarding the reality of my literary gifts and even regarding the reality of literature itself were dispersed as though by magic…Merely repeating the movement was useless; but if…I succeeded in recapturing the sensation which accompanied the movement, again the intoxicating and elusive vision softly pervaded me, as though it said, “Grasp me as I float by you, if you can, and try to solve the enigma of happiness I offer you.”

Note: I’m taking a few days off, so I’ll be republishing some of my favorite pieces from earlier in this blog’s run. This post originally appeared, in a slightly different form, on October 21, 2016.

It’s been said that all of the personal financial advice that most people need to know can fit on a single index card. In fact, that’s pretty much true—which didn’t stop the man who popularized the idea from writing a whole book about it. But the underlying principle is sound enough. When you’re dealing with a topic like your own finances, instead of trying to master a large body of complicated material, you’re better off focusing on a few simple, reliable rules until you aren’t likely to break them by mistake. Once you’ve internalized the basics, you can move on. The tricky part is identifying the rules that will get you the furthest per unit of effort. In practice, no matter what we’re doing, nearly all of us operate under only a handful of conscious principles at any given moment. We just can’t keep more than that in our heads at any one time. (Unconscious principles are another matter, and you could say that intuition is another word for all the rules that we’ve absorbed to the point where we don’t need to think about them explicitly.) If the three or four rules that you’ve chosen to follow are good ones, it puts you at an advantage over a rival who is working with an inferior set. And while this isn’t enough to overcome the impact of external factors, or dumb luck, it makes sense to maximize the usefulness of the few aspects that you can control. This implies, in turn, that you should think very carefully about a handful of big rules, and let experience and intuition take care of the rest.

Recently, I’ve been thinking about what I’d include on a similar index card for a writer. In my own writing life, a handful of principles have far outweighed the others. I’ve spent countless hours discussing the subject on this blog, but you could throw away almost all of it: a single index card’s worth of advice would have gotten me ninety percent of the way to where I am now. For instance, there’s the simple rule that you should never go back to read what you’ve written until you’ve finished a complete rough draft, whether it’s a short story, an essay, or a novel—which is more responsible than any other precept for the fact that I’m still writing at all. The principle that you should cut at least ten percent from a first draft, in turn, is what helped me sell my first stories, and in my experience, it’s more like twenty percent. Finally, there’s the idea that you should structure your plot as a series of objectives, and that you should probably make some kind of outline to organize your thoughts before you begin. This is arguably more controversial than the other two, and outlines aren’t for everybody. But they’ve allowed me to write more intricate and ambitious stories than I could have managed otherwise, and they make it a lot easier to finish what I’ve started. (The advice to write an outline is a little like the fifth postulate of Euclid: it’s uglier than the others, and you get interesting results when you get rid of it, but most of us are afraid to drop it completely.)

Then we get to words of wisdom that aren’t as familiar, but which I think every writer should keep in mind. If I had to pick one piece of advice to send back in time to my younger self, along with the above, it’s what David Mamet says in Some Freaks:

As a writer, I’ve tried to train myself to go one achievable step at a time: to say, for example, “Today I don’t have to be particularly inventive, all I have to be is careful, and make up an outline of the actual physical things the character does in Act One.” And then, the following day to say, “Today I don’t have to be careful. I already have this careful, literal outline, and all I have to do is be a little bit inventive,” et cetera, et cetera.

It isn’t as elegantly phrased as I might like, but it gets at something so important about the writing process that I’ve all but memorized it. A real writer has to be good at everything, and it’s unclear why we should expect all those skills to manifest themselves in a single person. As I once wrote about Proust: “It seems a little unfair that our greatest writer on the subject of sexual jealousy and obsession should also be a genius at describing, say, a seascape.” How can we reasonably expect our writers to create suspense, tell stories about believable characters, advance complicated ideas, and describe the bedroom curtains?

The answer—and while it’s obvious, it didn’t occur to me for years—is that the writer doesn’t need to do all of this at once. A work of art is experienced in a comparative rush, but it doesn’t need to be written that way. (As Homer Simpson was once told: “Very few cartoons are broadcast live. It’s a terrible strain on the animators’ wrists.”) You do one thing at a time, as Mamet says, and divide up your writing schedule so that you don’t need to be clever and careful at the same time. This applies to nonfiction as well. When you think about the work that goes into writing, say, a biography, it can seem absurd that we expect a writer to be the drudge who tracks down the primary sources, the psychologist who interprets the evidence, and the stylist who writes it up in good prose. But these are all roles that a writer plays at different points, and it’s a mistake to conflate them, even as each phase informs all the rest. Once you’ve become a decent stylist and passable psychologist, you’re also a more efficient drudge, since you’re better at figuring out what is and isn’t useful. Which implies that a writer isn’t dealing with just one index card of rules, but with several, and you pick and choose between them based on where you are in the process. Mamet’s point, I think, is that this kind of switching is central to getting things done. You don’t try to do everything simultaneously, and you don’t overthink whatever you’re doing at the moment. As Mamet puts it elsewhere: “Keep it simple, stupid, and don’t violate the rules that you do know. If you don’t know which rule applies, just don’t muck up the more general rules.”

Like this:

“What keeps science fiction a minor genre, for all the brilliance of its authors and apparent pertinence of its concerns?” The critic who asked this question was none other than John Updike, in his New Yorker review of David G. Hartwell’s anthology The World Treasury of Science Fiction, which was published at the end of the eighties. Updike immediately responded to his own question with his usual assurance:

The short answer is that each science-fiction story is so busy inventing its environment that little energy is left to be invested in the human subtleties. Ordinarily, “mainstream” fiction snatches what it needs from the contemporary environment and concentrates upon surprising us with details of behavior; science fiction tends to reverse the priorities…It rarely penetrates and involves us the way the best realistic fiction can…”The writer,” Edmund Wilson wrote, “must always find expressions for something which has never yet been exposed, must master a new set of phenomena which has never yet been mastered.” Those rhapsodies, for instance, which Proust delivered upon the then-fresh inventions of the telephone, the automobile, and the airplane point up the larger relativities and magical connections of his great novel, as well as show the new century breaking upon a fin-de-siècle sensibility. The modest increments of fictional “news,” of phenomena whose presentation is unprecedented, have the cumulative weight of true science—a nudging, inching fidelity to human change ultimately far more impressive and momentous than the great glittering leaps of science fiction.

I’ll concede that Updike’s underlying point here is basically correct, and that a lot of science fiction has to spend so much time establishing the premise and the background that it has to shortchange or underplay other important qualities along the way. (At its highest level, this is less a reflection of the author’s limitations than a courtesy to the reader. It’s hard to innovate along every parameter at once, so complex works of speculative fiction as different as Gravity’s Rainbow and Inception need to strategically simplify wherever they can.) But there’s also a hidden fallacy in Updike’s description of science fiction as “a minor genre.” What, exactly, would a “major” genre look like? It’s hard to come up with a definitive list, but if we’re going to limit ourselves to a conception of genre that encompasses science fiction and not, say, modernist realism, we’d probably include fantasy, horror, western, romance, erotica, adventure, mystery, suspense, and historical fiction. When we ask ourselves whether Updike would be likely to consider any of these genres “major,” it’s pretty clear that the answer is no. Every genre, by definition, is minor, at least to many literary critics, which not only renders the distinction meaningless, but raises a host of other questions. If we honestly ask what keeps all genres—although not individual authors—in the minor category, there seem to be three possibilities. Either genre fiction fails to attract or keep major talent; it suffers from various systemic problems of the kind that Updike identified for science fiction; or there’s some other quirk in the way we think about fiction that relegates these genres to a secondary status, regardless of the quality of specific works or writers.

And while all three of these factors may play a role, it’s the third one that seems most plausible. (After all, when you average out the quality of all “literary fiction,” from Updike, Bellow, and Roth down to the work put out by the small presses and magazines, it seems fairly clear that Sturgeon’s Law applies here as much as anywhere else, and ninety percent of everything is crud. And modernist realism, like every category coherent enough to earn its own label, has plenty of clichés of its own.) In particular, if a genre writer is deemed good enough, his or her reward is to be elevated out of it entirely. You clearly see this with such authors as Jorge Luis Borges, perhaps the greatest writer of speculative fiction of the twentieth century, who was plucked out of that category to compete more effectively with Proust, Joyce, and Kafka—the last of whom was arguably also a genre writer who was forcibly promoted to the next level. It means that the genre as a whole can never win. Its best writers are promptly confiscated, freeing up critics to speculate about why it remains “minor.” As Daniel Handler noted in an interview several years ago:

I believe that children’s literature is a genre. I resisted the idea that children’s literature is just anything that children are reading. And I certainly resisted the idea that certain books should get promoted out of children’s literature just because adults are reading them. That idea is enraging too. That’s what happens to any genre, right? First you say, “Margaret Atwood isn’t really a science fiction writer.” Then you say, “There really aren’t any good science fiction writers.” That’s because you promoted them all!

And this pattern isn’t a new one. It’s revealing that Updike quoted Edmund Wilson, who in his essays “Why Do People Read Detective Stories?” and “Who Cares Who Killed Roger Ackroyd?” dismissed the entire mystery genre as minor or worse. Yet when it came to defending his fondness for one author in particular, he fell back on a familiar trick:

I will now confess, in my turn, that, since my first looking into this subject last fall, I have myself become addicted, in spells, to reading myself to sleep with Sherlock Holmes, which I had gone back to, not having looked at it since childhood, in order to see how it compared with Conan Doyle’s latest imitators. I propose, however, to justify my pleasure in rereading Sherlock Holmes on grounds entirely different from those on which the consumers of the current product ordinarily defend their taste. My contention is that Sherlock Holmes is literature on a humble but not ignoble level, whereas the mystery writers most in vogue now are not. The old stories are literature, not because of the conjuring tricks and the puzzles, not because of the lively melodrama, which they have in common with many other detective stories, but by virtue of imagination and style. These are fairy-tales, as Conan Doyle intimated in his preface to his last collection, and they are among the most amusing of fairy-tales and not among the least distinguished.

Strip away the specifics, and the outlines of the argument are clear. Sherlock Holmes is good, and mysteries are bad, so Sherlock Holmes must be something other than mystery fiction. It’s maddening, but from the point of view of a working critic, it makes perfect sense. You get to hold onto the works that you like, while keeping the rest of the genre safely minor—and then you can read yourself happily to sleep.

I’ve been thinking a lot recently about my childhood. One of the inciting factors was the movie adaptation of Stephen King’s It, which I enjoyed a great deal when I finally saw it. It’s a blue-chip horror film, with a likable cast and fantastic visuals, and its creators clearly care as much about the original novel as I do. In theory, the shift of its setting to the late eighties should make it even more resonant, since this is a period that I know and remember firsthand. Yet it isn’t quite as effective as it should be, since it only tells the half of the story that focuses on the main characters as children, and most of the book’s power comes from its treatment of memory, childhood, and forgetfulness—which director Andy Muschietti and his collaborators must know perfectly well. Under the circumstances, they’ve done just about the best job imaginable, but they inevitably miss a crucial side of a book that has been a part of my life for decades, even if I was too young to appreciate it on my first reading. I was about twelve years old at the time, which means that I wasn’t in a position to understand its warning that I was doomed to forget much of who I was and what I did. (King’s uncanny ability to evoke his own childhood so vividly speaks as much as anything else to his talents.) As time passes, this is the aspect of the book that impresses me the most, and it’s one that the movie in its current form isn’t able to address. A demonic clown is pretty scary, but not as much as the realization, which isn’t a fantasy at all, that we have to cut ourselves off from much of who we were as children in order to function as adults. And I’m saying this as someone who has remained almost bizarrely faithful to the values that I held when I was ten years old.

In fact, it wouldn’t be farfetched to read Pennywise the Dancing Clown as the terrifying embodiment of the act of forgetting itself. In his memoir Self-Consciousness, John Updike—who is mentioned briefly in It and lends his last name to a supporting character in The Talisman—described this autobiographical amnesia in terms that could serve as an epigraph to King’s novel:

Not only are selves conditional but they die. Each day, we wake slightly altered, and the person we were yesterday is dead. So why, one could say, be afraid of death, when death comes all the time? It is even possible to dislike our old selves, these disposable ancestors of ours. For instance, my high-school self—skinny, scabby, giggly, gabby, frantic to be noticed, tormented enough to be a tormenter, relentlessly pushing his cartoons and posters and noisy jokes and pseudo-sophisticated poems upon the helpless high school—strikes me now as considerably obnoxious, though I owe him a lot.

Updike sounds a lot here like King’s class clown Richie Tozier, and his contempt toward his teenage self is one to which most of us can relate. Yet Updike’s memories of that period seem slightly less vivid than the ones that he explored elsewhere in his fiction. He only rarely mined them for material, even as he squeezed most of his other experiences to the last drop, which implies that even Updike, our greatest noticer, preferred to draw a curtain of charity across himself as an adolescent. And you can hardly blame him.

I was reminded of this by the X-Files episode “The Lost Art of Forehead Sweat,” which is about nothing less than the ways in which we misremember our childhoods, even if this theme is cunningly hidden behind its myriad other layers. At one point, Scully says to Reggie: “None of us remember our high school years with much accuracy.” In context, it seems like an irrelevant remark, but it was evidently important to Darin Morgan, who said to Entertainment Weekly:

When we think back on our memories from our youth, we have a tendency—or at least I do—to imagine my current mindset. Whenever I think about my youth, I’m like, “Why didn’t I do this? Why didn’t I do that?” And then you drive by high school students and you go, “Oh, that’s why I didn’t do it. Because I was a kid.” You tend to think of your adult consciousness, and you take that with you when you’re thinking back on your memories and things you’ve done in the past. Our memories are sometimes not quite accurate.

In “Forehead Sweat,” Morgan expresses this through a weird flashback in which we see Mulder’s adult head superimposed on his preadolescent body, which is a broad visual gag that also gets at something real. We really do seem to recall the past through the lens of our current selves, so we’re naturally mortified by what we find there—which neatly overlooks the point that everything that embarrasses us about our younger years is what allowed us to become what we are now. I often think about this when I look at my daughter, who is so much like me at the age of five that it scares me. And although I want to give her the sort of advice that I wish I’d heard at the time, I know that it’s probably pointless.

Childhood and adolescence are obstacle courses—and occasional horror shows—that we all need to navigate for ourselves, and even if we sometimes feel humiliated when we look back, that’s part of the point. Marcel Proust, who thought more intensely about memory and forgetting than anybody else, put it best in Within a Budding Grove:

There is no man…however wise, who has not at some period of his youth said things, or lived in a way the consciousness of which is so unpleasant to him in later life that he would gladly, if he could, expunge it from his memory. And yet he ought not entirely to regret it, because he cannot be certain that he has indeed become a wise man—so far as it is possible for any of us to be wise—unless he has passed through all the fatuous or unwholesome incarnations by which that ultimate stage must be preceded…We are not provided with wisdom, we must discover it for ourselves, after a journey through the wilderness which no one else can take for us, an effort which no one can spare us, for our wisdom is the point of view from which we come at last to regard the world. The lives that you admire, the attitudes that seem noble to you are not the result of training at home, by a father, or by masters at school, they have sprung from beginnings of a very different order, by reaction from the influence of everything evil or commonplace that prevailed round about them. They represent a struggle and a victory.

I believe this, even if I don’t have much of a choice. My childhood is a blur, but it’s also part of me, and on some level, it never ended. King might be speaking of adolescence itself when he writes in the first sentence of It: “The terror…would not end for another twenty-eight years—if it ever did end.” And I can only echo what Updike wistfully says elsewhere: “I’ve remained all too true to my youthful self.”

Note: This post reveals plot details from last night’s episode of Twin Peaks.

One of the central insights of my life as a reader is that certain kinds of narrative are infinitely expansible or contractible. I first started thinking about this in college, when I was struggling to read Homer in Greek. Oral poetry, I discovered, wasn’t memorized, but composed on the fly, aided by the poet’s repertoire of stock lines, formulas, and images that happened to fit the meter. This meant that the overall length of the composition was highly variable. A scene that takes up just a few lines in the Iliad that survives could be expanded into an entire night’s recital, based on what the audience wanted to hear. (For instance, the characters of Crethon and Orsilochus, who appear for only twenty lines in the existing version before being killed by Aeneas, might have been the stars of the evening if the poet happened to be working in Pherae.) That kind of flexibility originated as a practical consequence of the oral form, but it came to affect the aesthetics of the poem itself, which could grow or shrink to accommodate anything that the poet wanted to talk about. Homer uses his metaphors to introduce miniature narratives of human life that don’t otherwise fit into a poem of war, and some amount to self-contained short stories in themselves. Proust operates in much the same way. One observation leads naturally to another, and an emotion or analogy evoked in passing can unfold like a paper flower into three dense pages of reflections. In theory, any novel could be expanded like this, like a hypertext that opens into ever deeper levels. In Search of Lost Time happens to be the one book in existence in which all of these flowerings have been preserved, with a plot that could fit into a novella of two hundred unhurried pages.

Something similar appears to have happened with the current season of Twin Peaks, and when you start to think of it in those terms, its structure, which otherwise seems almost perversely shapeless, begins to make more sense. In the initial announcement by Showtime, the revival was said to consist of nine episodes, and Mark Frost even said to Buzzfeed:

If you think back about the first season, if you put the pilot together with the seven that we did, you get nine hours. It just felt like the right number. I’ve always felt the story should take as long as the story takes to tell. That’s what felt right to us.

It was doubled to eighteen after a curious interlude in which David Lynch dropped out of the project, citing budget constraints: “I left because not enough money was offered to do the script the way I felt it needed to be done.” He came back, of course, and shortly thereafter, it was revealed that the length of the season had increased. Yet there was never any indication that either Lynch or Frost had done any additional writing. My personal hunch is that they always had nine episodes of material, and this never changed. What happened is that the second act of the show expanded in the fashion that I’ve described above, creating a long central section that was free to explore countless byways without much concern for the plot. The beginning, and presumably the end, remained more or less as conceived—it was the middle that grew. And a quick look at the structure of the season so far seems to confirm this. The first three episodes, which take Cooper from inside the Black Lodge to slightly before his meeting with his new family in Las Vegas, seemed weird at the time, but now they look positively conventional in terms of how much story they covered. They were followed by three episodes, the Dougie Jones arc, that were expanded beyond recognition. And now that we’ve reached the final three, which account for the third act of the original outline, it makes sense for Cooper to return at last.

If the season had consisted of just those nine episodes, I suspect that more viewers would have been able to get behind it. Even if the second act had doubled in length—giving us a total of twelve installments, of which three would have been devoted to detours and loose ends—I doubt that most fans would have minded. It’s expanding that middle section to four times its size, without any explanation, that lost a lot of people. But it’s clearly the only way that Lynch would have returned. For most of the last decade, Lynch has been contentedly pottering around with odd personal projects, concentrating on painting, music, digital video, and other media that don’t require him to be answerable to anyone but himself. The Twin Peaks revival, after the revised terms had been negotiated with Showtime, allowed him to do this with a larger budget and for a vastly greater audience. Much of this season has felt like Lynch’s private sketchbook or paintbox, allowing him to indulge himself within each episode as long as the invisible scaffolding of the original nine scripts remained. The fact that so much of the strangeness of this season has been visual and nonverbal points to Lynch, rather than Frost, as the driving force on this end. And at its best, it represents something like a reinvention of television, which is the most expandable or compressible medium we have, but which has rarely utilized this quality to its full extent. (There’s an opening here, obviously, for a fan edit that condenses the season down to nine episodes, leaving the first and last three intact while shrinking the middle twelve. It would be an interesting experiment, although I’m not sure I’d want to watch it.)

Of course, this kind of aggressive attack on the structure of the narrative doesn’t come without a cost. In the case of Twin Peaks, the primary casualty has been the Dougie Jones storyline, which has been criticized for three related reasons. The first, and most understandable, is that we’re naturally impatient to get the old Cooper back. Another is that this material was never meant to go on for this long, and it starts to feel a little thin when spread over twelve episodes. And the third is that it prevents Kyle MacLachlan, the ostensible star of the show, from doing what he does best. This last criticism feels the most valid. MacLachlan has played an enormous role in my life as a moviegoer and television viewer, but he operates within a very narrow range, with what I might inadequately describe as a combination of rectitude, earnestness, and barely concealed eccentricity. (In other words, it’s all but indistinguishable from the public persona of David Lynch himself.) It’s what made his work as Jeffrey in Blue Velvet so moving, and a huge part of the appeal of Twin Peaks lay in placing this character at the center of what looked like a procedural. MacLachlan can also convey innocence and darkness, but by bringing these two traits to the forefront and separating them completely in Dougie and Dark Cooper, the show robs us of the amalgam that makes MacLachlan interesting in the first place. Like many stars, he’s chafed under the constraints of his image, and perhaps he even welcomed the challenges that this season presented—although he may not have known how his performance would look when extended past its original dimensions and cut together with the rest. When Cooper returned last night, it reminded me of how much I’ve missed him. And the fact that we’ll get him for two more episodes, along with everything else that this season has offered us, feels more than ever like a gift.

The best advice I’ve found for approaching this enormous, daunting book is Roger Shattuck’s observation, in his useful study Proust’s Way, that Marcel Proust’s most immediate precursor is Scheherazade, the legendary narrator of The Thousand and One Nights. In Search of Lost Time has less in common with the novels that we usually read than with the volumes of myths and fairy tales that we devour in childhood, and it might seem more accessible to the readers who currently find it bewildering if, as Shattuck suggests, it had been titled The Parisian Nights. Proust is a teller of tales, and, like Homer’s, his work is infinitely expansible. An exchange that lasts for a few lines in an oral epic like The Iliad could have been expanded—as it probably was for certain audiences—into an entire evening’s performance, and Homer deploys his metaphors to introduce miniature narratives of human life that don’t otherwise fit into a poem of war. Proust operates in much the same way. One observation leads naturally to another, and an emotion or analogy evoked in passing can unfold like a paper flower into three dense pages of reflections. In theory, any good novel could be expanded like this, like a hypertext that opens into increasingly intimate levels: In Search of Lost Time happens to be the only book in existence in which all of these flowerings have been preserved. Its plot could fit into a novella of two hundred unhurried pages, but we don’t read Proust for the plot, even if he knows more about suspense and surprise than you might expect. His digressions are the journey, and the result is the richest continuous slice of a great writer’s mind that a work of fiction can afford.

And the first thing that you notice about Proust, once you’ve lived in his head for long enough, is that he has essential advice and information to share about everything under the sun. Proust is usually associated with the gargantuan twin themes of memory and time, and although these are crucial threads, they’re only part of a tapestry that gradually expands to cover all human life. At first, it seems a little unfair that our greatest writer on the subject of sexual jealousy should also be a genius at describing, say, a seascape, as well as a mine of insight into such diverse areas as art, class, childhood, travel, death, homosexuality, architecture, poetry, the theater, and how milk looks when it’s about to boil over, while also peopling his work with vivid characters and offering up a huge amount of incidental gossip and social reportage. When you look at it from another angle, though, it seems inevitable. Proust is the king of noticing, and he’s the author who first awakened me to the fact that a major novelist should be able to treat any conceivable topic with the same level of artistic and intellectual acuity. His only rival here is Shakespeare, but with a difference. Plays like Hamlet speak as much through their omissions and silences as through their words, leaving us to fill in the gaps. Proust, by contrast, says everything—it’s all there on the page for anyone who wants to unpack it—and you can’t emerge without being subtly changed by the experience. Like Montaigne, Proust gives us words to express thoughts and feelings that we’ve always had, and if you read him deeply enough, you inevitably reach a point where you realize that this novel, which seemed to be about everything else in the world, has been talking about you all along.

You remember the story of the man who believed that he had the Princess of China shut up in a bottle. It was a form of insanity. He was cured of it. But as soon as he ceased to be mad he became merely stupid. There are maladies which we must not seek to cure because they alone protect us from others that are more serious.