
The phenomenon of female anger has often been turned against itself, the figure of the angry woman reframed as threat — not the one who has been harmed, but the one bent on harming. She conjures a lineage of threatening archetypes: the harpy and her talons, the witch and her spells, the Medusa and her writhing locks. The notion that female anger is unnatural or destructive is learned young; children report perceiving displays of anger as more acceptable from boys than from girls. According to a review of studies of gender and anger written in 2000 by Ann M. Kring, a psychology professor at the University of California, Berkeley, men and women self-report “anger episodes” with comparable degrees of frequency, but women report experiencing more shame and embarrassment in their aftermath. People are more likely to use words like “bitchy” and “hostile” to describe female anger, while male anger is more likely to be described as “strong.” Kring reported that men are more likely to express their anger by physically assaulting objects or verbally attacking other people, while women are more likely to cry when they get angry, as if their bodies are forcibly returning them to the appearance of the emotion — sadness — with which they are most commonly associated.

A 2016 study found that it took longer for people to correctly identify the gender of female faces displaying an angry expression, as if the emotion had wandered out of its natural habitat by finding its way to their features. A 1990 study conducted by the psychologists Ulf Dimberg and L.O. Lundquist found that when female faces are recognized as angry, their expressions are rated as more hostile than comparable expressions on the faces of men — as if their violation of social expectations had already made their anger seem more extreme, increasing its volume beyond what could be tolerated.

In “What Happened,” her account of the 2016 presidential election, Hillary Clinton describes the pressure not to come across as angry during the course of her entire political career — “a lot of people recoil from an angry woman,” she writes — as well as her own desire not to be consumed by anger after she lost the race, “so that the rest of my life wouldn’t be spent like Miss Havisham from Charles Dickens’s ‘Great Expectations,’ rattling around my house obsessing over what might have been.” The specter of Dickens’s ranting spinster — spurned and embittered in her crumbling wedding dress, plotting her elaborate revenge — casts a long shadow over every woman who dares to get mad.

If an angry woman makes people uneasy, then her more palatable counterpart, the sad woman, summons sympathy more readily. She often looks beautiful in her suffering: ennobled, transfigured, elegant. Angry women are messier. Their pain threatens to cause more collateral damage. It’s as if the prospect of a woman’s anger harming other people threatens to rob her of the social capital she has gained by being wronged. We are most comfortable with female anger when it promises to regulate itself, to refrain from recklessness, to stay civilized.

Consider the red-carpet clip of Uma Thurman that went viral in November, during the initial swell of sexual-harassment accusations. The clip doesn’t actually show Thurman’s getting angry. It shows her very conspicuously refusing to get angry. After commending the Hollywood women who had spoken out about their experiences of sexual assault, she said that she was “waiting to feel less angry” before she spoke herself. It was curious that Thurman’s public declarations were lauded as a triumphant vision of female anger, because the clip offered precisely the version of female anger that we’ve long been socialized to produce and accept: not the spectacle of female anger unleashed, but the spectacle of female anger restrained, sharpened to a photogenic point. By withholding the specific story of whatever made her angry, Thurman made her anger itself the story — and the raw force of her struggle not to get angry on that red carpet summoned the force of her anger even more powerfully than its full explosion would have, just as the monster in a movie is most frightening when it only appears offscreen.

This was a question I began to consider quite frequently as the slew of news stories accrued last fall: How much female anger has been lurking offscreen? How much anger has been biding its time and biting its tongue, wary of being pathologized as hysteria or dismissed as paranoia? And what of my own vexed feelings about all this female anger? Why were they even vexed? It seemed a failure of moral sentiment or a betrayal of feminism, as if I were somehow siding with the patriarchy, or had internalized it so thoroughly I couldn’t even spot the edges of its toxic residue. I intuitively embraced and supported other women’s anger but struggled to claim my own. Some of this had to do with the ways I’d been lucky — I had experienced all kinds of gendered aggression, but nothing equivalent to the horror stories so many other women have lived through. But it also had to do with an abiding aversion to anger that still festered like rot inside me. In what I had always understood as self-awareness — “I don’t get angry; I get sad” — I came to see my own complicity in the same logic that has trained women to bury their anger or perform its absence.

London may be the capital of the world. You can argue for New York, but London has a case. Modern London is the metropolis that globalization created. Walk the streets of Holborn, ride an escalator down to the Tube and listen to the languages in the air. Italian mingles with Hindi, or Mandarin, or Spanish, or Portuguese. Walk through the City, the financial district, and listen to the plumbing system of international capitalism. London is banker to the planet.

London is ancient yet new. It is as much city-state as city, with a culture and economy that circulate the world. London manages to be Los Angeles, Washington and New York wrapped into one. Imagine if one American city were home to Hollywood, the White House, Madison Avenue, Wall Street and Broadway. London is sort of that.

Modern London thrives on the idea that one city can be a global melting pot, a global trading house, a global media machine and a place where everyone tolerates everyone else, mostly. The thought is that being connected to the rest of the world is something to celebrate. But what happens to London when that idea unexpectedly falls away?

LONDON — During these last days of the seemingly endless election campaign, we are all living with enormous and ever-rising levels of anxiety. The days pass in a fever of worry, with waves of nausea that subside only to return and rob us of our breath. Many people are having difficulty sleeping. Many others wake up in fright, their bodies drenched with the sweat of Trump terrors. What will America look like after Tuesday? What will the world look like?

We track the news cycle obsessively, compulsively, trying to find clues that might allow us to know what we cannot know and will not know until Wednesday. We may not know even then. Will Trump accept defeat? What if the election is contested for weeks, months? What if there is civil disorder, with blood in the streets? The waiting is agony.

We constantly press the refresh button on The Upshot or whatever lifeline we are clinging to. Foreigners like me try to figure out the possible meaning of those endless sports analogies about field goals. We stare at the screen, look away out of the window and try and focus on something else, and then stare back at the screen again. We pick up handfuls of factoids from the chaos of data that assails us, clutching at the tiny shards of hope that glitter on the surface of our media bubble. But then we reject them as hopeless and think to ourselves: “He can’t really win, can he?” Can he?

The mood of nausea at the world, a disgust at the entirety of existence, is familiar to those of us who cut our teeth reading existentialist fiction. Novels like Sartre’s 1938 “Nausea” captured a feeling of disgust with the world and disgust with ourselves for going along with a world that had seemed so blissfully happy with itself for so long. For Sartre, the dreadful had already happened, with the rise of National Socialism in the early 1930s, and it was a question of learning to face up to our fate. This is the mood that I want to bring into focus by exploring the concept of Brexistentialism.

For I must admit that I’ve become a Brexistentialist of late, thinking back to that evening on June 23 when I watched the entirety — eight hours or more — of the BBC’s live coverage of the referendum on whether Britain would leave the European Union or choose to remain.

This year marks the 100th anniversary of the publication of “A Portrait of the Artist as a Young Man.” Its genesis was long and tortuous — Joyce began writing his novel in 1904 — and the road to its canonization as one of the seminal works of Western literature was not short either: The reviews spoke of the author’s “cloacal obsession” and “the slime of foul sewers,” comments that seem strange today, insofar as it is the subjective aspect of the book, the struggle that goes on inside the mind of its young protagonist, that perhaps stands out to us now as its most striking feature. What appeared at the time to be unprecedented about the novel seems more mundane to us today, whereas what then came across as mundane now seems unprecedented. Joyce’s novel remains vital, in contrast to almost all other novels published in 1916, because he forcefully strove toward an idiosyncratic form of expression, a language intrinsic to the story he wanted to tell, about the young protagonist, Stephen Dedalus, and his formative years in Dublin, in which uniqueness was the very point and the question of what constitutes the individual was the issue posed.

The first time I heard about James Joyce, I was 18 years old, working as a substitute teacher in a small community in northern Norway and wanting to be a writer. That ambition had prompted me to subscribe to a literary journal, and in one of the issues that came in the mail there was a series of articles on the masterpieces of modernism — among them Joyce’s novel “Ulysses,” which he published six years after “Portrait,” and which also featured the character of Stephen Dedalus. The word “modernism” evoked in me a vague notion of machines, futuristic and shiny, and when I read about the tower that Stephen inhabits at the beginning of the book, I imagined some sort of medieval world of turreted castles, though with cars of the 1920s and airplanes, a place populated by young men reciting works in Latin and Greek; in other words, something very remote from the world in which I resided, with its quaysides and fishing boats, its steeply rising fells and icy ocean, its fishermen and factory workers, TV programs and pounding car-audio systems. I longed to get away. What I wanted was to write, and I resolved to read this marvelous work and be illuminated by all its radiance. For me, at that time, literature represented somewhere else, and my conception of “Ulysses” was tinged by the books I had read, the boyhood excesses lived out in the French fantasies of Jules Verne, for instance, or swashbuckling classics like “The Count of Monte Cristo,” “The Three Musketeers,” “Ivanhoe” or “Treasure Island” — imaginary worlds in which I had lived half my life and which for me were the very essence of literature. Literature was somewhere other than me, so I thought, and related to that was another idea I had, that everything of meaning was to be found at the center, that only there did important things happen, while all that occurred on the periphery — where I felt I was — was without significance and unworthy of being written about.
History belonged to others, literature belonged to others, truth belonged to others.

Twice a week or so, loaded with bodies boxed in pine, a New York City morgue truck passes through a tall chain-link gate and onto a ferry that has no paying passengers. Its destination is Hart Island, an uninhabited strip of land off the coast of the Bronx in Long Island Sound, where overgrown 19th-century ruins give way to mass graves gouged out by bulldozers and the only pallbearers are jail inmates paid 50 cents an hour.

There, divergent life stories come to the same anonymous end.

No tombstones name the dead in the 101-acre potter’s field that holds Leola Dickerson, who worked as one family’s housekeeper for 50 years, beloved by three generations for her fried chicken and her kindness. She buried her husband as he had wished, in a family plot back in Alabama. But when she died at 88 in a New York hospital in 2008, she was the ward of a court-appointed guardian who let her house go into foreclosure and her body go unclaimed at the morgue.

By law, her corpse became city property, to be made available as a cadaver for dissection or embalming practice if a medical school or mortuary class wanted it. Then, like more than a million men, women and children since 1869, she was consigned to a trench on Hart Island.

Several dozen trenches back lies Zarramen Gooden, only 17 when the handlebars of his old bike broke and he hit his throat, severing an artery. He had been popping wheelies near the city homeless shelter in the Bronx where he and four younger siblings lived with their heroin-addicted mother. With no funeral help from child protection authorities, his older sister scraped together $8 to buy the used suit he wore at his wake. But the funeral home swiftly sent him back to the morgue when she could not pay the $6,000 burial fee.

For Milton Weinstein, a married father with a fear of dying alone, there was no burial at all for two years after his death at 67. A typographer in his day, he had worked in advertising for Sears, Roebuck & Company. But he lost his career to technology and his vision to diabetes; his wife’s mental problems drove their children away. Though she was at his side when he died in a Bronx nursing home, she had no say over what happened to his remains — and no idea that his body would be used as a cadaver in a medical school and then shoveled into a mass grave on Hart Island.

New York is unique among American cities in the way it disposes of the dead it considers unclaimed: interment on a lonely island, off-limits to the public, by a crew of inmates. Buried by the score in wide, deep pits, the Hart Island dead seem to vanish — and so does any explanation for how they came to be there.

To reclaim their stories from erasure is to confront the unnoticed heartbreak inherent in a great metropolis, in the striving and missed chances of so many lives gone by. Bad childhoods, bad choices or just bad luck — the chronic calamities of the human condition figure in many of these narratives. Here are the harshest consequences of mental illness, addiction or families scattered or distracted by their own misfortunes.

But if Hart Island hides individual tragedies, it also obscures systemic failings, ones that stack the odds against people too poor, too old or too isolated to defend themselves. In the face of an end-of-life industry that can drain the resources of the most prudent, these people are especially vulnerable.

Indeed, this graveyard of last resort hides wrongdoing by some of the very individuals and institutions charged with protecting New Yorkers, including court-appointed guardians and nursing homes. And at a time when many still fear a potter’s field as the ultimate indignity, the secrecy that shrouds Hart Island’s dead also veils the city’s haphazard treatment of their remains.

At noon the next day I looked out the window of my plane from Toronto, staring down at the outskirts of Cleveland, an endless row of streets with identical houses, beneath the dirty gray light of a misty, freezing sky.

At last I was in the United States. And in my backpack were the documents proving I had a driver’s license, stamped by the Swedish Embassy in Washington. I had agreed to meet the photographer at the airport and travel with him for a day, but then I would attempt to rent a car and continue on my own.

I didn’t really enjoy talking to people that much, at least not to strangers, and the thought of spending the next five days in a car with someone I didn’t know was a bit unsettling. In this case, I also had a hunch that the photographer was in many respects my total opposite. We had exchanged some text messages about where to go and what to see. The first thing he had written to me was this:

Detroit is fascinating. I know this wild and lovely family living there. 14 people living in a house. Smoking a lot of dope. In Wisconsin I know a vet who is a character. Anyhow regardless it’s pretty easy to get into weird and fascinating situations in this country. . . .

What I wanted to see were the woods, the meadows and small towns of Maine and Vermont, and the last thing I wanted was to end up in what he called “weird and fascinating situations.” I had a deep-seated fear of drugs, in any form; they represented a kind of transgression that I found deeply disturbing. Just seeing the word “heroin” shook me to the core; there was something diabolical about it. Being so close to chaos all the time, I feared nothing more than the things that could unleash it, and I knew I would love heroin, as I loved everything that took me away from the present moment. I have based my life upon saying no to all kinds of temptations, like taking time off, going on holiday, having a drink in the afternoon, staying up late at night. Was I going to sit in a derelict house with a family of 14 who all smoked marijuana? And what would I say to the poor veteran? How was it in the war? What are you doing now?

Inside the terminal, I stopped at a bookstore to look at travel guides. I still hadn’t decided where to go, the only sure thing was that I would end up in Alexandria on Monday, and a travel guide could suggest routes to follow and provide information about the places we drove through. One of my favorite books about the U.S. is Vladimir Nabokov’s “Lolita,” which among many other things is also a kind of road novel. It describes a journey through the small-town world of post-World War II America, where the protagonist, Humbert Humbert, is constantly on the lookout for distractions for his child mistress, and therefore stops at an endless series of attractions, which every single little town seemed to be in possession of. The world’s largest stalagmite, obelisks commemorating battles, a reconstruction of the log cabin where Lincoln was born, the world’s longest cave, the homemade sculptures of a local woman. Humbert’s gaze is European, deeply sophisticated, cultivated and ancient, but also perverted and sick, while the things he observes on the journey across America are superficial, childishly un-self-conscious, ignorant of history, but also innocent and possessed of the freshness of the new.

“Lolita” came out in the U.S. in 1958, one year after another road novel, Jack Kerouac’s “On the Road.” Oddly enough, the journeys that these two books describe also begin at the same time: Both Humbert and Lolita, and Sal and Dean, hit the American country road in 1947. It would be hard to imagine two more dissimilar fictional landscapes. This is because Kerouac describes it from the inside, with no distance, this is the America he grew up in, and he is so much an integrated part of it that he seems to embody its very soul. It is a young, restless, hungry, open soul. There are no points of contact with that America in Nabokov’s novel, and if you read the two books simultaneously, the reason becomes obvious: In “Lolita,” all is dissembling, there are only signs, everything stands for something else, and the one and perhaps only thing that is authentic, the child’s reality, is desired from an impossible distance, the breaching of which destroys it completely. In “On the Road,” nothing stands in the way of the authentic, except the rules of formal life; when they have been overcome, the glittering night opens to anyone who desires to enter it. The naïveté of this is astounding, but so is the power.

Now, both Nabokov’s book and Kerouac’s were nearly 60 years old, and themselves a part of this country’s history. But the conflict between life and the imitation of life, and the impossible desire for authenticity, was still being explored in American literature, where entire human destinies could be played out in the interiors of historical theme parks and which, time and time again, allowed the perverted psychological language of sincerity and caring to collide with the evasive and unacknowledged emotions of lived life, often in comical ways, sometimes in ways that seemed desperate and claustrophobic.

I couldn’t find any travel guides, and I bought a notebook and a pen instead, paid, and put them in my backpack. Then I walked out into the wide corridor again and headed for the baggage claim.

“Karl Ove?” said a voice behind me.

I turned around. A young man of around 30, dressed in a dark, slightly shabby coat, with curly hair and glasses, stood there looking at me.

“Peter, the photographer,” he said and put out his hand. “I thought you might be somewhere around here.”

IN THE WORLD of early-20th-century African-American music and people obsessed by it, who can appear from one angle like a clique of pale and misanthropic scholar-gatherers and from another like a sizable chunk of the human population, there exist no ghosts more vexing than a couple of women identified on three ultrarare records made in 1930 and ’31 as Elvie Thomas and Geeshie Wiley. There are musicians as obscure as Wiley and Thomas, and musicians as great, but in none does the Venn diagram of greatness and lostness reveal such vast and bewildering co-extent. In the spring of 1930, in a damp and dimly lit studio, in a small Wisconsin village on the western shore of Lake Michigan, the duo recorded a batch of songs that for more than half a century have been numbered among the masterpieces of prewar American music, in particular two, Elvie’s “Motherless Child Blues” and Geeshie’s “Last Kind Words Blues,” twin Alps of their tiny oeuvre, inspiring essays and novels and films and cover versions, a classical arrangement.

Yet despite more than 50 years of researchers’ efforts to learn who the two women were or where they came from, we have remained ignorant of even their legal names. The sketchy memories of one or two ancient Mississippians, gathered many decades ago, seemed to point to the southern half of that state, yet none led to anything solid. A few people thought they heard hints of Louisiana or Texas in the guitar playing or in the pronunciation of a lyric. We know that the word “Geechee,” with a c, can refer to a person born into the heavily African-inflected Gullah culture centered on the coastal islands off Georgia and the Carolinas. But nothing turned up there either. Or anywhere. No grave site, no photograph. Forget that — no anecdotes. This is what set Geeshie and Elvie apart even from the rest of an innermost group of phantom geniuses of the ’20s and ’30s. Their myth was they didn’t have anything you could so much as hang a myth on. The objects themselves — the fewer than 10 surviving copies, total, of their three known Paramount releases, a handful of heavy, black, scratch-riven shellac platters, all in private hands — these were the whole of the file on Geeshie and Elvie, and even these had come within a second thought of vanishing, within, say, a woman’s decision in cleaning her parents’ attic to go against some idle advice that she throw out a box of old records and instead to find out what the junk shop gives. When she decides otherwise, when the shop isn’t on the way home, there goes the music, there go the souls, ash flakes up the flue, to flutter about with the Edison cylinder of Buddy Bolden’s band and the phonautograph of Lincoln’s voice.

I have been fascinated by this music since first experiencing it, like a lot of other people in my generation, in Terry Zwigoff’s 1994 documentary “Crumb,” on the life of the artist Robert Crumb, which used “Last Kind Words” for a particularly vivid montage sequence. And I have closely followed the search for them over the years; drawn along in part by the sheer History Channel mysteriousness of it, but mainly — the reason it never got boring — by their music.

Outside any bullyingly hyperbolical attempts to describe the technical beauty of the songs themselves, there’s another facet to them, one that deepens their fascination, namely a certain time-capsule dimension. The year 1930 seems long ago enough now, perhaps, but older songs and singers can be heard to blow through this music, strains in the American songbook that we know were there, from before the Civil War, but can’t hear very well or at all. There’s a song, Geeshie’s “Last Kind Words,” a kind of pre-blues or not-yet-blues, a doomy, minor-key lament that calls up droning banjo songs from long before the cheap-guitar era, with a strange thumping rhythm on the bass string. “If I get killed,” Geeshie sings, “if I get killed, please don’t bury my soul.” There’s a blues, “Motherless Child,” with 16-bar, four-line stanzas, that begins by repeating the same line four times, “My mother told me just before she died,” AAAA, no variation, just moaning the words, each time with achingly subtle microvariations, notes blue enough to flirt with tonal chaos. Generations of spirituals pass through “Motherless Child,” field melodies and work songs drift through it, and above everything, the playing brims with unfalsifiable sophistication. Elvie’s notes float. She sends them out like little sailboats onto a pond. “Motherless Child” is her only song, the only one of the six on which she takes lead to my ears — there are people who think it’s also her on “Over to My House.” On the other songs she’s behind Geeshie, albeit contributing hugely. The famous Joe Bussard (pronounced “buzzard”), one of the world’s foremost collectors of prewar 78s, found one of two known copies of “Motherless Child” in an antique store in Baltimore, near the waterfront, in the mid-1960s. The story goes that Bussard used to have people over to his house to play for them the first note of “Motherless Child,” just the first few seconds, again and again, an E that Elvie plucks and lets hang. 
It sounds like nothing and then, after several listens, like nothing else. “Baby, now she’s dead, she’s six feet in the ground,” she sings. “And I’m a child, and I am drifting ’round.”

Before there could be the minor miracle of these discs’ having survived, there had to be an earlier, major one: that of people like Geeshie and Elvie ever being recorded. To understand how that happened it’s needful to know about race records, a commercial field that flourished between the world wars, and specifically the Paramount company, a major competitor in that game throughout the 1920s.

A furniture company, that’s how it started. The Wisconsin Chair Company. They got into making phonograph cabinets. If people had records they liked, they would want phonographs to play them on, and if they had phonographs, they would want cabinets to keep them in. The discs were even sold, especially at first, in furniture shops. They were literally accessories. Toys, you could say. In fact, the first disc “records” were manufactured to go with a long-horned gramophone distributed by a German toy company. So we must imagine: it’s as if a subgenre of major American art had been preserved only on vintage View-Master slides.

In 1920, when the white-owned OKeh label shocked even itself by selling hundreds of thousands of copies of Mamie Smith’s “Crazy Blues” (the first blues recorded by an African-American female vocalist), the furniture-phonograph complex spied a chance. Two populations were forming or achieving critical mass, whites willing to pay for recordings of black music and blacks able to afford phonographs, and together they made a new market. It’s around then that the actual phrase “race records” enters the vernacular. In 1926, Paramount had game-changing luck on a string of 78s showcasing the virtuosic Texas songster Blind Lemon Jefferson — his “Long Lonesome Blues” sold into the six figures — and as in Mamie Smith’s case, he touched off a frantic search among labels to find performers in a similar vein. The “country blues” was born, though not yet known by that name. It was men, for the most part, but with an important female minority, a “vital feminizing force,” in the words of Don Kent, the influential collector and poet of liner notes.

For the preserving of that force we have to thank not the foresight of those recording companies but their ignorance and even philistinism when it came to black culture. They knew next to nothing about the music and even less about what new trends in it might appeal to consumers. Nowhere was this truer than at Paramount. These were businessmen, Northern and Midwestern, former salesmen. Their notions of what was a hit and what was not were a Magic Eight Ball. So, when the mid-1920s arrived, and Paramount went looking farther afield for new acts, they compensated by recording everything and waiting to see what sold. Not everything, but a lot. A long swath of everything. The result was an unprecedented, never-to-be-repeated, all-but-unconscious survey of America’s musical culture, a sonic X-ray of it, taken at a moment when the full kaleidoscopic variety of prerecording-era transracial forms hadn’t yet contracted. Hundreds of singers, more thousands of songs. Some of the greatest musicians ever born in this country were netted only there. It was a slapdash and profit-driven documentary project that in some respects dwarfed what the most ambitious and well intentioned ethnomusicologists could hope to achieve (deformed in all sorts of ways by capitalism, but we take what we can get).

Among the first to wake up to these riches was, as it happens, the most prominent of those great ethnomusicologists, Alan Lomax. He had been traveling the back roads with his father, John A. Lomax, making field recordings for the Library of Congress’s Archive of American Folk Song, and he had seen firsthand that all of this culture, which had endured mouth-to-ear for centuries, was giving way, proving not quite powerful enough to resist the radio waves and movies. In the late ’30s, Lomax was record hunting one day and came across a large cache of old Paramount discs in a store. At the time they were a mere 10 or 15 years old and couldn’t have appeared less valuable to a casual picker. Lomax listened, transfixed by an increasing realization that Paramount offered him an earhole into the past, into the decade just before he joined his father on the song-collecting scene, an enormous commercial complement to what the two of them had been doing under intellectual auspices with their field-recording. Lomax started digging. In 1940 he created a list, with the title “American Folk Songs on Commercial Records,” and circulated it in the folklore community.

This list is a very precious little document in 20th-century American cultural history. It was published in only a limited library report, but copies were passed around. It marked the first time someone had publicly recognized these commercial recordings as something other than detritus. Most important, it made space for, even emphasized, the more obscure blues singers.

To grasp the significance of that, you have to bear in mind how fantastically few record collectors possessed such an interest at the end of the 1930s. Early jazz was a thing in certain hip circles, but only a few true freaks were into the country blues. There was twitchy, rail-thin Jim McKune, a postal worker from Long Island City, Queens, who famously maintained precisely 300 of the choicest records under his bed at the Y.M.C.A. Had to keep the volume low to avoid complaints. He referred to his listening sessions as séances. Summoning weird old voices from the South, the ethereal falsetto of Crying Sam Collins. Or the whine of Isaiah Nettles, the Mississippi Moaner. Did McKune listen to Geeshie and Elvie? It’s unknowable. His records were already gone when he died — murdered in 1971, in a hotel room. Another early explorer? The writer Paul Bowles. The Paul Bowles, believe it or not, who started collecting blues records as an ether-huffing undergraduate in Charlottesville, Va., in the late 1920s, “at secondhand furniture stores in the black quarter.” Out West there was Harry Smith, who went on to create the “Anthology of American Folk Music” for Folkways Records, the first “box set,” of which it can be compactly if inadequately said: No “Anthology,” no Woodstock. Wee, owlish Smith. He and McKune came to know each other. No less important, they both came to know Alan Lomax’s list, which galvanized their passion for this particular chamber of the recorded past, giving shape to their “want lists.”

In the ’50s McKune would become a sort of salon master to the so-called Blues Mafia, the initial cell of mainly Northeastern 78-pursuers who evolved, some of them, into the label owners and managers and taste-arbiters of the folk-blues revival. An all-white men’s club, several of whom were or grew wealthy, the Blues Mafia doesn’t always come off heroically in recent — and vital — revisionist histories of the field, more of them being written by women (including two forthcoming books by Daphne Brooks and Amanda Petrusich). Still, no one who seriously cares about the music would pretend that the cultural debt we owe the Blues Mafia isn’t past accounting. It’s not just all they found and documented that marks their contribution. It’s equally what they spawned, whether they would claim it or not. Dylan didn’t listen to 78s, after all, on the floors of those pads he was crashing at in Greenwich Village, but to the early reissue LPs. By Dylan I mean the ’60s. But also Dylan. “If I hadn’t heard the Robert Johnson record when I did,” he wrote 10 years ago, “there probably would have been hundreds of lines of mine that would have been shut down.”

The author who writes under the pseudonym Elena Ferrante responded to written questions via email through her longtime Italian publisher, Sandra Ozzola Ferri. The following is a translated transcript of that interview.

Q. You insist on anonymity and yet are developing a cult following, especially among women, first in Italy and now in the United States and beyond. How do you feel about the reception of your books in the United States in recent years, and your growing readership, especially after James Wood’s review in The New Yorker in January 2013?

A. I appreciated James Wood’s review very much. The critical attention that he dedicated to my books not only helped them find readers but, in a way, also helped me to read them. Writers, because they write, are condemned never to be readers of their own stories. What happens to the reader when he reads a story for the first time is effectively what the narrator experiences while he writes. The memory of first putting a story into words will always prevent writers from reading their work as an ordinary reader would. Critics like Wood not only help readers to read but especially, perhaps, help the author as well. Their function also becomes fundamental in helping faraway literary worlds to migrate. I never asked myself how the women in my stories would be received outside Italy. I wrote first and foremost for myself, and if I published I did so leaving the task of finding readers to the book itself. Now I know that thanks to Europa Editions [Ferrante’s English-language publisher], to Ann Goldstein [her English-language translator] and to Wood and so many other reviewers and writers and readers, the heart of these stories has burst forth, and it is not only Italian. I’m both surprised and happy.

Q. Do you feel your books have found the following they deserve in Italy?

A. I don’t do promotional tours in my own country or anywhere. In Italy my first book, “Troubling Love,” sold immediately, thanks to the word of mouth of readers who discovered it and appreciated the writing, and to reviewers who wrote about it positively. Then the director Mario Martone read it and turned it into a memorable film. This helped the book, but it also shifted the media attention onto me personally. Partly for that reason, I didn’t publish anything else for 10 years, at which point, with tremendous anxiety, I decided to publish “The Days of Abandonment.” The book was a success and had a wide readership, even if there was also a lot of resistance to [the protagonist] Olga’s reaction to being abandoned — the same kind of resistance faced by Delia, the protagonist of “Troubling Love.” The success of the book and of the film that was made from it focused even more attention on the absence of the author. It was then that I decided, definitively, to separate my private life from the public life of my books, which overcame countless difficulties and have endured. I can say with a certain pride that in my country, the titles of my novels are better known than my name. I think this is a good outcome.

Q. Where do you see yourself in the Italian literary tradition?

A. I’m a storyteller. I’ve always been more interested in storytelling than in writing. Even today, Italy has a weak narrative tradition. Beautiful, magnificent, very carefully crafted pages abound, but not the flow of storytelling that despite its density manages to sweep you away. A bewitching example is Elsa Morante. I try to learn from her books, but I find them unsurpassable.

“Gravity’s Rainbow,” Thomas Pynchon’s gargantuan parable of rocketry, sex and a whole bunch of other stuff, turned 41 this year — six years older than its author when it was first published. What happens when a novel whose scenes of coprophagia and pedophilia moved Pulitzer trustees to cancel the prize in 1974 (when Pynchon seemed poised to win) eases into middle-aged, canonical respectability? Well, for one thing, it gets an audiobook release. Since the mid-1980s, a George Guidall recording has been floating around, like some mythical lost rocket part — no one had heard it, but all Pynchon fans knew someone who knew someone who had — but in October a new version, authorized and rerecorded and burned onto 30 compact discs, hit the stands. How on earth, I wondered as I stripped the wrapper, is poor Mr. Guidall going to render the sudden outbreaks of crazed capitals, or librettos in which stoners with guitars pastiche Rossini, the instructions helpfully stating “(bubububoo[oo] oo [sung to opening of Beethoven 5th, with full band])”? He turns out to do it in a slow and deep-voiced manner, beneath whose calm avuncularity you can detect anxiety, even mania, bubbling but never quite erupting — although I could have sworn I heard him, in the silence at the end of CD 30, racing out the door to buy a year’s supply of those Thayer’s Slippery Elm Throat Lozenges the hero Slothrop sucks, or perhaps to check himself into the book’s White Visitation mental hospital.

The main benefit of Guidall’s superhuman effort may well be ergonomic. Unlike Grigori, the novel’s reflex-conditioned octopus, a human reader has only two hands; removing the book as a physical object frees up one of these to palm through Steven Weisenburger’s “A ‘Gravity’s Rainbow’ Companion” and the other to click around the many online glossaries to the text while listening. Such resources seem to me more than optional; only “Finnegans Wake” is more opaque, more reference-saturated than Pynchon’s novel. And perhaps the benefits of audio end here. An old canard, reeled off incessantly by people who haven’t read “Finnegans Wake,” or at least haven’t understood it, holds that “you need to hear it spoken aloud in order to appreciate it.” This is nonsense: Joyce’s novel, wrapped around in silence, is all about legibility — inscriptions, codices, scattered scraps of paper in need of reassembly, exegesis or decoding. So it is with “Gravity’s Rainbow”: The book’s logic is entirely scriptural. Every surface in it is a parchment to be interpreted: ice-cracks form “graffiti . . . a legend to be deciphered”; raindrops splash in asterisks, inviting us “to look down at the bottom of the text of the day”; lit cigarettes trail “cursive writing”; even feces on the walls of sewers presents “patterns thick with meaning.” Its people, lying in rows in a hospital ward described as a “half-open file drawer of pain each bed a folder,” are legible as well, “poor human palimpsests” that doctors transcribe and — more sinisterly — rewrite.
Slothrop’s ancestors, as Puritans, scanned the sky for messages, viewed all of nature as a ledger packed with data “behind which always, nearer or farther, was the numinous certainty of God.” In keeping with their sensibility, the narrative momentum thrusts both forward, toward inevitable (because predestined) final catastrophe and judgment and, simultaneously, backward, through histories encrypted into junkyards, light bulbs, even human hair, reverse-engineering cities into ruins, rooms to their “plan views” in order to lay bare the plans hatched in them, plans in whose web all current actors find themselves entangled. One character is tellingly urged, in a direct address by the narrator, to ponder a “wind tunnel” theory of history, through which tensor analysis might reveal “nodes, critical points,” turbulence-spots decisive in the shaping of all subsequent airflow, only now become apparent — then told, “Here’s a thought: Find a non-dimensional coefficient for yourself.”