It was a picnic-perfect summer in 1914. The rich flaunted their wealth with all the subtlety of rats leaping onto a pristine wedding dress. The newspapers steered their coverage away from serious events to pursue lurid items about sports and celebrity gossip. A comic double act by the name of Collins & Harlan recorded an absurd ditty called “Aba Daba Honeymoon,” which Thomas Pynchon was to describe fifty years later as “the nadir of all American expression.” Few human souls twirling their canes and parasols in these conditions of unbridled frivolity could have anticipated that an archduke’s assassination in late June would plunge Europe into a gruesome war that would leave twenty million dead, permanently altering notions of honor, bloodshed, and noblesse oblige.

But even a few years before the July Crisis, there were strong signs in England that something was amiss. Politicians demonstrated a cataclysmic failure to read or address the natural trajectory of human progress. Women justly demanded the right to vote and were very willing to starve themselves in prison and burn down many buildings for it. Workers fought violently for fair wages, often locked into stalemates with greedy mining companies. They were intoxicated by a new militant brand of syndicalism from France then popularized by Georges Sorel. The atmosphere was one of increasing upheaval and escalating incoherence, even among the most noble-minded revolutionaries. The influx of gold from Africa inspired both lavish spending and an inflated currency. The Liberals in power were supposed to stand up for the working stiffs who couldn’t quite meet the rising prices for boots and food and clothes with their take-home pay. And much like today’s Democratic Party in the States, these tepid Parliamentary wafflers past their Fabian prime revealed a commitment to ineptitude over nuts-and-bolts pragmatism. They allowed the Tories to play them like rubes losing easy games of three-card monte. Amidst such madness, England became a place of oblivious tension not dissimilar to the nonstop nonsense that currently plagues both sides of the Atlantic. With the middle and upper classes keeping their heads in the clouds and their spirits saturated in moonbeam dreams and a bubble gum aura, is it any wonder that people were willing to incite war and violence for the most impulsive reasons?

George Dangerfield’s The Strange Death of Liberal England examines this crazed period between 1910 and 1914 with an exacting and quite entertaining poetic eye. Dangerfield, an erudite journalist who parlayed his zingy word-slinging into a teaching career, is somewhat neglected today, but his remarkable knack for knowing when to suggest and when to stick with the facts is worthy of careful study; it sums up the beautifully mordant touch he brought as a historian. He describes, for example, the “dismal, rattling sound” of Liberalism refusing to adjust to the times, and eloquently sends up the out-of-touch movement in a manner that might also apply to today’s neoliberals, who stubbornly refuse to consider the lives and needs of the working class even as they profess to know what’s best for them:

[I]t was just as if some unfortunate miracle had been performed upon its contents, turning them into nothing more than bits of old iron, fragments of intimate crockery, and other relics of a domestic past. What could be the matter? Liberalism was still embodied in a large political party; it enjoyed the support of philosophy and religion; it was intelligible, and it was English. But it was also slow; and it so far transcended politics and economics as to impose itself upon behaviour as well. For a nation which wanted to revive a sluggish blood by running very fast and in any direction, Liberalism was clearly an inconvenient burden.

Dangerfield knew when to let other people hang themselves by their own words. The infamous Margot Asquith, the starry-eyed socialite married to the Prime Minister who led England into World War I, is quoted at length from her letters to Robert Smillie, the brave union organizer who fought on behalf of the Miners’ Federation of Great Britain. Asquith, so fundamentally clueless about diplomacy, could not understand why meeting Smillie might be a bad idea given the tense negotiations.

I did feel that Dangerfield was unduly harsh on Sylvia Pankhurst, one of the key organizers behind the suffragette movement. His wry fixation upon Pankhurst’s indomitable commitment — what he styles “the fantastic Eden of militant exaltation” — to starvation and brutality from the police, all in the brave and honorable fight for women, may very well be a product of the 1930s boys’ club mentality, but it seems slightly cheap given how otherwise astute Dangerfield is in highlighting just the right personality flaws among other key figures of the time. The Pankhurst family was certainly eccentric, but surely they were deserving of more than just cheap quips, such as the volley Dangerfield lobs when Christabel announces the Pankhurst withdrawal from the WSPU (“She made this long-expected remark quite casually — she might almost have been talking to the little Pomeranian dog which she was nursing.”).

Still, Dangerfield was the master of the interregnum history. His later volume, The Era of Good Feelings, examined the period between Jefferson and Jackson and is almost as good as The Strange Death. One reads the book and sees the model for Christopher Hitchens’s biting erudite style. (The book was a favorite of Hitch’s and frequently cited in his essays.)

But it is clear that Dangerfield’s heart and his mischievous vivacity resided with his homeland rather than the nation to which he emigrated later in life. In all of his work, especially the material dwelling on the United Kingdom, Dangerfield knew precisely what years to hit, the pivotal moments that encapsulated specific actions that triggered political movements. As he chronicles the repercussions of the June 14, 1911 strike in Southampton, he is careful to remark upon how “it is impossible not to be surprised at the little physical violence that was done — only a few men killed, in Wales in 1912, and two or three in Dublin in 1913; in England itself not a death. Is this the effect of revolutionary methods, and, if so, do the methods deserve the word?” He then carries on speculating about the pros and cons of peaceful revolution and ties this into the “spiritual death and rebirth” of English character. And we see that Dangerfield isn’t just a smartypants funnyman, but a subtle philosopher who leaves human possibilities open to the reader. He is a welcome reminder that a sense of the real doesn’t necessarily emerge when you lock eyes on an alluring Twitch stream or a hypnotic Instagram feed. It comes when you take the time to step away, to focus on the events that are truly important, and to ruminate upon the incredible progress that human beings still remain quite capable of making.

It is worth recalling that the Boy Scout, that putative paragon of American boyhood virtue, originated in 1909 with a man lost in the foggy haze of a mazy London byway. W.D. Boyce was a recently divorced newspaperman cast adrift in the English mist, until he was guided by a uniformed lad known only as the Unknown Scout. This young whippersnapper, who was no soldier and had no tomb (unless you count a mangy Silver Buffalo memorial that presently stands in Gilwell Park), steered Boyce to his destination and refused Boyce’s tip after that gent hoped to consummate his gratitude. The boy did so not because he was a well-paid German stevedore or a terrified Uber driver hoping to hold onto his job, but because he was merely doing his duty and this was enough recompense, thank you very much. From here, Boyce asked the boy about his coterie, was allegedly led to Boy Scouts HQ like a starry-eyed drifter seeking a new easy access religion, and encountered Chief Scout Robert Baden-Powell, an irrepressible do-gooder who intoxicated Boyce with tales of uniforms and valor and decency and truth and justice and many other nouns etched with ostentatious pedigree and scant subtlety that were later memorialized in a handbook published in six fortnightly parts called Scouting for Boys. Four months later, Boyce returned to the States to found the Boy Scouts of America. He had found his calling. Shortly after this, presumably emboldened by the new youthful virtues flooding through his veins, Boyce would marry a well-connected woman twenty-three years younger. But Boyce’s brio was not enough to preserve this second marriage, which dissolved within two years. The Boy Scouts, on the other hand, have continued to endure, albeit with plentiful dissimulation saddled to the “Be prepared” credo.

This legend, which isn’t nearly as imaginative or as thrilling as Robert Johnson signing away his soul to the devil in exchange for spangling guitar chops, has nevertheless become as accepted and as apocryphal as the birth of the blues or any story of rugged outliers founding tech startups in their garages or, for that matter, the cloying cherry tree myth associated with George Washington, a shrewd political operator who claimed that he could not tell a lie despite deceiving many over a lifetime about his professed lack of political expertise. Boy Scout booster (and sex therapist!) Edward Rowan has pointed out that Boyce outed himself in a February 27, 1928 letter, claiming that he was not floating in the Dickensian murk, but merely standing before the Savoy Hotel while contemplating the question of whether he should cross a street. Moreover, others have suggested that there was no fog during that evening. As one excavates further into W.D. Boyce’s history, one learns that this sanctimonious founder was a racist, even denying African-Americans entry into the hallowed organization. (Boyce also published a journal called The White Boy’s Magazine.) By the late 1980s, the Boy Scouts were forced to establish protective measures in response to countless sex abuse cases later documented by reporter Patrick Boyle in 1991. The Boy Scouts of America, a seemingly sacred institution, had been little more than a seductive shawl disguising the ugly American id. It is thus the perfect metaphor for Tobias Wolff’s This Boy’s Life, a moving memoir about a boy wrestling with the lies, the duplicities, and the hypocrisies of growing up in America. It is an especially cogent volume in an epoch of fake news, covfefe, and thundering Republican men casually asphyxiating the weak and the vulnerable in the name of old school virtue.

For young Tobias (aka Jack, a sobriquet inspired by an altogether different London), the deceptive pose was a way of being and coping through a rough-and-tumble existence. This Boy’s Life opens with Toby and his mother retreating from an abusive man in Florida by way of a dodgy Nash Rambler with an overboiling engine. Their hope was to find fortune through a desperate uranium hunt by way of a poor man’s Geiger counter. As with many Americans before them, the westward journey here is one of escape and, as one pores through the memoir’s crisply paced pages, increasingly one of assuming roles that bear no resemblance to reality. The cooing pop songs crooning from the radio provide voices for Toby to emulate, perhaps serving as a staging area for transformation. Yet Catholicism, itself a practice just as fraught with frangible self-abnegation as the Boy Scouts, also represents the new terrain from which to launch an identity. Toby’s father, telephoning from Connecticut, claims that the family line has always been Protestant or Episcopalian, but Wolff informs us that he learned of his true Jewish heritage ten years after this revelation. Names, identities, veneers, and backgrounds are the melting pot from which to sprout a respectable soul, yet Toby scoffs at the purported innocence of this problematic chrysalis. “Power,” writes Wolff, placing his budding irritation within the context of his later experience in Vietnam, “can only be enjoyed when it is recognized and feared. Fearlessness in those without power is maddening to those who have it.”

Wolff’s Vietnam memoir, In Pharaoh’s Army, would chronicle similar tensions between patriotic duty and survival, and one must observe that the two memoirs are united by Wolff possessing a gun, that priapic symbol of American manhood that has caused so much recent and needless terror. This Boy’s Life sees these uncertain seeds planted in loam long before basic training. I once had the good fortune to interview Wolff in 2008 and he revealed that he was dead set against narcissism’s pathology overtaking any story. Which leads me to believe that Wolff understands, as William Gass has observed in a notable essay on narcissism and writing, that autobiographers turn themselves into monsters, often hiding deceit behind their confessions. To reckon poignantly with a life, a memoirist must never cover up his shame or settle scores with self-serving vigor, for he invites a dishonesty in which the professed act of soul-baring smudges the more important ink needed for corrupted but authentic memory.

What is most striking about This Boy’s Life is that Wolff never sugarcoats his life. Nor does he beckon the reader to feel sympathy for him, even as he succumbs to abuse from Dwight, the abominable man whom his mother Rosemary eventually tries to forge a family with. It is the shakiest of new beginnings following an uncertain stint at a West Seattle boardinghouse. There are men who hit on Rosemary, ascribing athleticism to Toby and pledging bicycle gifts that never materialize, and we see only Rosemary’s tears from unseen boorishness. Toby steals and breaks windows with his pals. He puts forth lies. And as Dwight enters Toby’s life, Wolff observes that this minuscule mechanic tries too hard: “No eye is quicker to detect that kind of effort than the eye of a competitor who also happens to be a child.” But Dwight does have a family, including a daughter named Pearl with a prominent bald spot. And just as Rosemary sees possibility in volunteering for idealistic Democratic candidates, she sees an opportunity in Dwight. Just as W.D. Boyce was bowled over by a Boy Scout’s diligence, effort here is enough to plant an acorn for a dubious family tree. Meanwhile, Toby lets loose several “Fuck yous,” memorializing the message onto a wall, and gets in trouble with the vice principal. When the vice principal meets with Rosemary, Toby is convinced of his innocence, not unlike Dwight, and the vice principal reveals his own systematic and sanctimonious story of how he quit smoking to buy a Nash Rambler, the very same rickety vehicle that brought Toby and Rosemary to the west.

It is here that the kernel for Dwight’s autocratic adoption of Toby begins to pop with a frenzy of fragile male ego: the belief that laborious effort, even on the most inconsequential acts, somehow makes one a respectable hard-working American. Toby is asked to pick up roadkill. He is asked to wait in a car as Dwight gets plastered in a bar. He is watched as Dwight fuels himself on tugs of Old Crow and Camels. He delivers newspapers and his earnings are pilfered by Dwight. He paints an old Baldwin piano to cover up its chintz. And he is commanded to pluck hard husks of horse chestnuts — a tyrannical tilling with some unspecified life lesson attached in which the product of all this hard labor is never actually used. When Toby gets into a fight with a kid named Arthur Gayle, Dwight coaches him on pugilism, claiming that any defeat is his fault. And throughout all this, there are the weekly Boy Scout meetings. Toby’s plan is to run away from Dwight’s home in Concrete, a Washington hamlet built on shaky slopes that Wolff describes as a graying and dusty landscape with cracking cement banks. It is, like many parts of America even today, a fraying tableau where too much effort gets in the way of existence, disguising the fissures of easily broken lives. One can almost imagine Dwight using the hashtag #MAGA on Twitter had he materialized decades later.

Whether this subjective truth-telling represents a kind of fearlessness or power in its own right depends on the degree to which you are willing to embrace Wolff’s life story. But it does represent a refreshing alternative to the Horatio Alger grandstanding that too many personal essays wallow in today. (See, for example, most of the material published on Thought Catalog.) David Plotz once chided Dave Pelzer for turning child abuse into entertainment. This Boy’s Life avoids such petty voyeurism, in large part because it nestles Toby’s life and Dwight’s stark assaults within the larger American dilemma of how to contend with fakery. And in an epoch where narcissistic dishonesty and “alternative facts” and social media outrage are increasingly the norm, there is a beautiful grace in putting your life out there and not giving a damn how others judge it.

G.H. Hardy was a mathematician ready to throw in the towel because he believed himself useless. Yet this small and depressing volume memorializing his many complaints may just have you finding hope and purpose as you argue with it.

Clocking in at a mere ninety pages in very large type, G.H. Hardy’s A Mathematician’s Apology is that rare canapé plucked from a small salver between all the other three-course meals and marathon banquets in the Modern Library series. It is a book so modest that you could probably read it in its entirety while waiting for the latest Windows 10 update to install. And what a bleak and despondent volume it turned out to be! I read the book twice and, each time I finished the book, I wanted to seek out some chalk-scrawling magician and offer a hug.

G.H. Hardy was a robust mathematician just over the age of sixty who had made some serious contributions to number theory and population genetics. He was a cricket-loving man who had brought the Indian autodidact Srinivasa Ramanujan to academic prominence by personally vouching for and mentoring him. You would think that a highly accomplished dude who went about the world with such bountiful and generous energies would be able to ride out his eccentric enthusiasm into his autumn years. But in 1939, Hardy survived a heart attack and felt that he was as useless as an ashtray on a motorcycle, possessing nothing much in the way of nimble acumen or originality. So he decided to memorialize his depressing thoughts about “useful” contributions to knowledge in A Mathematician’s Apology (in one of the book’s most stupendous understatements, Hardy observed that “my apology is bound to be to some extent egotistical”), and asked whether mathematics, the field that he had entered into because he “wanted to beat other boys, and this seemed to be the way in which I could do so most decisively,” was worthwhile.

You can probably guess how it all turned out:

It is indeed rather astonishing how little practical value scientific knowledge has for ordinary man, how dull and commonplace such of it as has value is, and how its value seems almost to vary inversely to reputed utility….We live either by rule of thumb or other people’s professional knowledge.

If only Hardy could have lived about sixty more years to discover the 21st-century thinker’s parasitic relationship to Google and Wikipedia! The question is whether Hardy is right to be this cynical. While snidely observing “It is quite true that most people can do nothing well,” he isn’t a total sourpuss. He writes, “A man’s first duty, a young man’s at any rate, is to be ambitious,” and points out that ambition has been “the driving force behind nearly all the best work of the world.” What he fails to see, however, is that youthful ambition, whether in a writer or a scientist, often morphs into a set of routines that become second nature. At a certain point, a person becomes comfortable enough with himself to simply go on with his work, quietly evolving, as the ambition becomes more covert and subconscious and mysterious.

Hardy never quite confronts what it is about youth that frightens him, but he is driven by a need to justify his work and his existence, pointing to two reasons why people do what they do: (1) they work at something because they know they can do it well and (2) they work at something because a particular vocation or specialty came their way. But this seems too pat and Gladwellian to be a persuasive dichotomy. It doesn’t really account for the journey we all must take in figuring out why we do something, which generally includes the vital people you meet at certain places in your life who point you down certain directions. Either they recognize some talent in you and give you a leg up or they are smart and generous enough to recognize that one essential part of human duty is to help others find their way, to seek out your people — ideally a group of eclectic and vastly differing perspectives — and to work with each other to do the best damn work and live the best damn lives you can. Because what’s the point of geeking out about Fermat’s “two squares” theorem, which really is, as Hardy observes, a nifty result of pure beauty, if you can’t share it with others?
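For the curious, the theorem Hardy admires can be checked by brute force in a few lines. This is my own illustration, not anything from the Apology: an odd prime is a sum of two squares exactly when it leaves a remainder of 1 on division by 4.

```python
import math

def two_squares(p):
    """Return (x, y) with x*x + y*y == p, or None if no such pair exists."""
    for x in range(math.isqrt(p) + 1):
        y2 = p - x * x
        y = math.isqrt(y2)
        if y * y == y2:
            return (x, y)
    return None

# Primes of the form 4k + 1 decompose; primes of the form 4k + 3 never do.
for p in [5, 13, 17, 29, 3, 7, 11, 19]:
    print(p, p % 4, two_squares(p))
```

Running this shows 5 = 1² + 2², 13 = 2² + 3², and so on, while 3, 7, 11, and 19 yield nothing — the pattern Fermat asserted and Euler eventually proved.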

But let’s return to Hardy’s fixation on youth. Hardy makes the claim that “mathematics, more than any other art or science, is a young man’s game,” yet this staggering statement is easily debunked by such late bloomers as prime number ninja Zhang Yitang and Andrew Wiles solving Fermat’s Last Theorem at the age of 41. Even in Hardy’s own time, Henri Poincaré was making innovations to topology and Lorentz transformations well into middle age. (And Hardy explicitly references Poincaré in § 26 of his Apology. So it’s not like he didn’t know!) Perhaps some of the more recent late life contributions have much to do with forty now being the new thirty (or even the new twenty among a certain Jaguar-buying midlife crisis type) and many men in Hardy’s time believing themselves to be superannuated in body and soul around the age of thirty-five, but it does point to the likelihood that Hardy’s sentiments were less the result of serious thinking and more the result of crippling depression.

Where Richard Feynman saw chess as a happy metaphor for the universe, “a great game played by the gods” in which we humans are mere observers who “do not know what the rules of the game are,” merely allowed to watch the playing (and yet find marvel in this all the same), Hardy believed that any chess problem was “simply an exercise in pure mathematics…and everyone who calls a problem ‘beautiful’ is applauding mathematical beauty, even if it is a beauty of a comparatively lowly kind.” Hardy was so sour that he compared a chess problem to a newspaper puzzle, claiming that it merely offered an “intellectual kick” for the clueless educated rabble. As someone who enjoys solving the Sunday New York Times crossword in full and a good chess game (it’s the street players I have learned the most from, for they often have the boldest and most original moves), I can’t really argue against Hardy’s claim that such pastimes are “trivial” or “unimportant” in the grand scheme of things. But Hardy seems unable to remember the possibly apocryphal tale of Archimedes discovering the principle of displacement while in the bathtub or the more reliable story of Otto Loewi’s dream leading the great Nobel-winning physiologist to discover that nerve impulses are transmitted chemically. Great minds often need to be restfully thinking or active on other fronts in order to come up with significant innovations. And while Hardy may claim that “no chess problem has ever affected the development of scientific thought,” I feel compelled to note that Pythagoras played the lyre (and even inspired a form of tuning), Newton had his meandering apple moment, and Einstein enjoyed hiking and sailing. These were undoubtedly “trivial” practices by Hardy’s austere standards, but would these great men have given us their contributions if they hadn’t had such downtime?

It’s a bit gobsmacking that Hardy never mentions how Loewi was fired up by his dreams. He seems only to see value in Morpheus’s prophecies if they are dark and melancholic:

I can remember Bertrand Russell telling me of a horrible dream. He was in the top floor of the University Library, about A.D. 2100. A library assistant was going round the shelves carrying an enormous bucket, taking down book after book, glancing at them, restoring them to the shelves or dumping them into the bucket. At last he came to three large volumes which Russell could recognize as the last surviving copy of Principia mathematica. He took down one of the volumes, turned over a few pages, seemed puzzled for a moment by the curious symbolism, closed the volume, balanced it in his hand and hesitated….

One of an author’s worst nightmares is to have his work rendered obsolete not long after his death, even though there is a very strong likelihood that, in about 150 years, few people will care about the majority of books published today. (Hell, few people care about anything I have to write today, much less this insane Modern Library project. There is a high probability that I will be dead in five decades and that nobody will read the many millions of words or listen to the countless hours of radio I have put out into the universe. It may seem pessimistic to consider this salient truth, but, if anything, it motivates me to make as much as I can in the time I have, which I suppose is an egotistical and foolishly optimistic approach. But what else can one do? Deposit one’s head in the sand, smoke endless bowls of pot, wolf down giant bags of Cheetos, and binge-watch insipid television that will also not be remembered?) You can either accept this reality, reach the few people you can, and find happiness and gratitude in doing so. Or you can deny it, letting your ego get in the way of your achievements, embracing supererogatory anxieties, and spending too much time feeling needlessly morose.

I suppose that in articulating this common neurosis, Hardy is performing a service. He seems to relish “mathematical fame,” which he calls “one of the soundest and steadiest of investments.” Yet fame is a piss-poor reason to go about making art or formulating theorems. Most of the contributions to human advancement are rendered invisible. These are often small yet subtly influential rivulets that unknowingly pass into the great river that future generations will wade in. We fight for virtues and rigor and intelligence and truth and justice and fairness and equality because this will be the legacy that our children and grandchildren will latch onto. And we often make unknowing waves. Would we, for example, be enjoying Hamilton today if Lin-Manuel Miranda’s school bus driver had not drilled him with Geto Boys lyrics? And if we surrender those standards, if we gainsay the “trivial” inspirations that cause others to offer their greatness, then we say to the next generation, who are probably not going to be listening to us, that fat, drunk, and stupid is the absolute way to go through life, son.

A chair may be a collection of whirling electrons, or an idea in the mind of God: each of these accounts of it may have its merits, but neither conforms at all closely to the suggestions of common sense.

This is Hardy suggesting some church-and-state separation between pure and applied mathematics. He sees physics as fitting into some idealistic philosophy while identifying pure mathematics as “a rock on which all idealism founders.” But might not one fully inhabit common sense if the chair exists in some continuum beyond this either-or proposition? Is not the chair’s perceptive totality worth pursuing?

It is at this point in the book that Hardy’s argument really heads south and he makes an astonishingly wrongheaded claim, one whose undoing he could not have foreseen, noting that “Real mathematics has no effects on war.” This was only a few years before Los Alamos was to prove him wrong. And that’s not all:

It can be maintained that modern warfare is less horrible than the warfare of pre-scientific times; that bombs are probably more merciful than bayonets; that lachrymatory gas and mustard gas are perhaps the most humane weapons yet devised by military science; and that the orthodox view rests solely on loose-thinking sentimentalism.

Oh Hardy! Hiroshima, Nagasaki, Agent Orange, Nick Ut’s famous napalm girl photo from Vietnam, Saddam Hussein’s chemical gas massacre in Halabja, the use of Sarin-spreading rockets in Syria. Not merciful. Not humane. And nothing to be sentimental about!

Nevertheless, I was grateful to argue with this book on my second read, which occurred a little more than two weeks after the shocking 2016 presidential election. I had thought myself largely divested of hope and optimism, with the barrage of headlines and frightening appointments (and even Trump’s most recent Taiwan call) doing nothing to summon my natural spirits. But Hardy did force me to engage with his points. And his book, while possessing many flawed arguments, is nevertheless a fascinating insight into a man who gave up: a worthwhile and emotionally true Rorschach test you may wish to try if you need to remind yourself why you’re still doing what you’re doing.

Was Richard Feynman the “brilliant teacher” that his champions made him out to be? I was somewhat underwhelmed by this curated selection of lectures, but I did still come away from Feynman wanting to know more about how science has altered human existence.

Richard Feynman, exuberant Nobel laureate and formidable quantum mechanics man, may have been energetic in his lectures and innovatively performative in the classroom, but I’m not sure he was quite the great teacher that many have pegged him to be. James Gleick’s biography Genius informs us that students dropped out of his high-octane, info-rich undergraduate physics classes at a remarkable rate, replaced by Caltech faculty members and grad students who came to make up the bulk of the Queens-born superstar’s audience, much as baryons make up the visible matter of the universe. The extent to which Feynman was aware of this cosmic shift has been disputed by his chroniclers, but it is important to be aware of this shortcoming, especially if you’re bold enough to dive into the famed three-volume Feynman Lectures on Physics, which are all thankfully available online. Six Easy Pieces represents an abridged version of Feynman’s full pedagogical oeuvre. And even though the many YouTube videos of Feynman reveal an undeniably magnetic and indefatigably passionate man of science who must have been an incredible dynamo to experience in person, one wonders whether barraging a hot room of young nervous twentysomethings with hastily delivered information is the right way to popularize science, much less inspire a formidable army of physicists.

Watch even a few minutes of Feynman firing on all his robust cylinders and it becomes glaringly apparent how difficult it is to contend with Feynman’s teaching legacy in book form. One wonders why the Modern Library nonfiction judges, who unknowingly bombarded this devoted reader with such massive multivolume works as The Golden Bough, Dumas Malone’s Jefferson and His Time, and Principia Mathematica, didn’t give this spot to the full three-volume Lectures. Did they view Feynman’s complete lesson plan as a failure?

Judging from the sextet that I sampled in this deceptively slim volume, I would say that, while Feynman was undeniably brilliant, he was, like many geniuses, someone who often got lost within his own metaphors. While his analogy of two corks floating in a pool of water, with one cork jiggling in place to create motion in the pool that causes indirect motion for the other cork, is a tremendously useful method of conveying the “unseen” waves of the electromagnetic field (one that galvanized me to do the same in a saucepan after I had finished two bottles of wine over a week and a half), he is not nearly as on the nose with his other analogies. The weakest lesson in the book, “Conservation of Energy,” trots out what seems to be a reliably populist metaphor with a child named “Dennis the Menace” playing with 28 blocks, somehow always ending up with 28 of these at the end of the day. Because Feynman wants to illustrate conservational constants, he shoehorns another element into the narrative whereby Dennis’s mother is, for no apparent reason, not allowed to open up the toy box revealing the number of blocks and thus must calculate how many blocks reside within. The mother has conveniently weighed the box at some unspecified time in advance back when it contained all 28 blocks.

This is bad teaching, in large part because it is bad storytelling that makes no sense. I became less interested in conservation of energy, with Feynman’s convoluted parallel clearly becoming more trouble than it was worth, and more interested in knowing why the mother was so fixated on remembering the number of blocks. Was she truly so starved for activity in her life that she spent all day at work avoiding the juicy water cooler gossip about co-workers, to say nothing of kvetching about the boss, so that she might hatch a plan to at long last show her son that she would always know the weight of a single block? When Dennis showed resistance to opening the toy box, why didn’t the mother stand her ground and tell him to buzz off and stream an episode of Project Mc²?

Yet for all these defects in method, there is an indisputable poetic beauty in the way in which Feynman reminds us that we live in a vast world composed of limitless particles, a world in which we still aren’t aware of all the rules and in which even the particles contained within solids remain “fixed” in motion. Our universe is always moving, even when we can’t see it or completely comprehend it. Feynman is quick to observe throughout his lessons that “The test of all knowledge is experiment,” which again points to my theory that Feynman’s teachings, often accentuated by experiment, were probably better experienced than read. Nevertheless, even in book form, it is truly awe-inspiring to understand that we still cannot accurately predict the precise mass, form, and force of all the cascading droplets from a mighty river once it hits the precipice of a waterfall. Such mysteries capture our imagination and, when Feynman is committed to encouraging our inventiveness through open and clear-eyed examples from our world, he is very much on point. Thanks in part to Feynman reminding me just how little we silly humans now know, I began to feel my heart open more for Tycho Brahe, that poor Dane who spent many years of his life compiling meticulous measurements of the planets, refining Copernicus’s details entirely without a telescope. It was Brahe’s invaluable data that allowed Johannes Kepler to sift through those measurements, determine the elliptical patterns of planetary orbits, and forge laws that all contemporary astronomers now rely on to determine where a planet might be in the sky on any given night of the year. Heisenberg’s uncertainty principle hasn’t even been around a century and it’s nothing less than astounding to consider how our great grandparents had a completely different understanding of atoms and motion in their early lifetimes than we do today.

Feynman did have me wanting to know more about the origins of many scientific discoveries, causing me to contemplate how each and every dawning realization altered human existence (an inevitable buildup for Thomas Kuhn and paradigms, which I will take up in ML Nonfiction #69). But unlike such contemporary scientists as Neil deGrasse Tyson, Alan Guth, or Brian Greene, Feynman did not especially inspire me to plunge broadly into my own experiments or make any further attempts to grapple with physics-based complexities. This may very well be more my failing than Feynman’s, but there shall be many more stabs at science as we carry on with this massive reading endeavor!

Annie Dillard’s poetic masterpiece still offers compelling reasons to take in the hidden wonders that surround us. This pivotal volume emboldened and inspired me and may just knock you into a new state of consciousness.

“Either this world, my mother, is a monster, or I myself am a freak.” — Annie Dillard, Pilgrim at Tinker Creek

I was a sliver-thin, stupefyingly shy, and very excitable boy who disguised his bruises under the long sleeves of his shirt not long before the age of five. I was also a freak.

I had two maps pinned to the wall of my drafty bedroom, which had been hastily carved into the east edge of the garage in a house painted pink (now turquoise, according to Google Maps). The first map was of Tolkien’s Middle-earth, in which I followed the quests of Bilbo and Frodo by finger as I wrapped my precocious, word-happy head around sentences that I’d secretly study from the trilogy I had purloined from the living room, a well-thumbed set that I was careful to put back on the shelves before my volatile and often sour father returned home from the chemical plant. In some of his rare calm moments, my father read aloud from The Lord of the Rings if he wasn’t too drunk, irascible, or violent. His voice led me to imagine Shelob’s thick spidery thistles and Smeagol’s slithering corpus, and to crack open my eyes the next morning for any other surprises I might divine in my daily journeys to school. The second map was of Santa Clara County, a very real region that everyone now knows as Silicon Valley but that used to be a sweeping swath of working and lower middle-class domiciles. This was one of several dozen free maps of Northern California that I had procured from AAA with my mother’s help. One of the nice perks of being an AAA member was the ample sample of rectangular geographical foldouts. I swiftly memorized all of the streets, held spellbound by the floral and butterfly patterns of freeway intersections seen from a majestic bird’s eye view in an errant illustrated sky. My mother became easily lost while driving, and I knew the avenues and the freeways in more than a dozen counties so well that I could always provide an easy cure for her confusion.
It is a wonder that I never ended up working as a cab driver, although my spatial acumen has remained so keen over the years that, to this day, I can still pinpoint the precise angle at which you need to slide a thick unruly couch into the tricky recesses of a small Euclidean-angled apartment even when I am completely exhausted.

These two maps seemed to be the apotheosis of cartographic art at the time, filling me with joy and wonder and possibility. They helped me cope with the many problems I lived with at home. I understood that there were other regions beyond my bedroom where I could wander in peace, where I could meet kinder people or take in the beatific comforts of a soothing lake (Vasona Lake, just west of Highway 17 in Los Gatos, had a little railroad spiraling around its southern tip and was my real-life counterpart to Lake Evendim), where the draw of Rivendell’s elvish population or the thrill of smoky Smaug stewing inside the Lonely Mountain collided against visions of imagined mountain dwellers I might meet somewhere within the greens and browns of Santa Teresa Hills and the majestic observatories staring brazenly into the cosmos at the end of uphill winding roads. I would soon start exploring the world I had espied from my improvised bedroom study, pedaling my bike unfathomable miles into vicinities I had only dreamed about, always seeking parallels to what the Oxford professor had whipped up. I once ventured as far south as Gilroy down the Monterey Highway, which Google Maps now informs me is a thirty-six mile round trip, because my neglectful parents never kept tabs on how long I was out of the house or where I was going. They didn’t seem to care. As shameful as this was, I’m glad they didn’t. I needed an uncanny dominion, a territory to flesh out, in order to stay happy, humble, and alive.

The maps opened up my always hungry eyes to books, which contained equally bountiful spaces devoted to the real and the imaginary, unspooling further marks and points for me to find in the palpable world and, most importantly, within my heart. I always held onto this strange reverence for place to beat back the sadness after serving as my father’s punching bag. To this day, I remain an outlier, a nomad, a lifelong exile, a wanderer even as I sit still, a renegade hated for what people think I am, a black sheep who will never belong no matter how kind I am. I won’t make the mistake of painting myself as some virtuous paragon, but I’ve become so accustomed to being condemned on illusory grounds, to having all-too-common cruelties inflicted upon me (such as the starry-eyed bourgie Burning Man sybarite I recently opened my heart to, who proceeded to deride the city that I love, along with the perceived deficiencies of my hard-won apartment, this after I had told her tales, not easily summoned, about what it was like to be rootless and without family and how home and togetherness remain sensitive subjects for me) that the limitless marvels of the universe parked in my back pocket or swiftly summoned from my shelves or my constant peregrinations remain reliable, life-affirming balms that help heal the scars and render the wounds invisible. Heartbreak and its accompanying gang of thugs often feel like a mob bashing in your ventricles in a devastatingly distinct way, even though the great cosmic joke is that everyone experiences it and we have to love anyway.

So when Annie Dillard’s poetic masterpiece Pilgrim at Tinker Creek entered my reading life, its ebullient commitment to finding grace and gratitude in a monstrous world reminded me that seeing and perceiving and delving and gaping awestruck at Mother Earth’s endless glories is one of the best survival skills you can cultivate, one that I may have accidentally stumbled upon. As I said, I’m a freak. But Dillard is one too. And there’s a good chance you may walk away from this book, which I highly urge you to read, feeling the same kinship with Dillard that I did. Even if you already have a formidable arsenal of boundless curiosity ready to be summoned at a moment’s notice, this shining 1974 volume remains vital and indispensable and will stir your soul for the better, whether you’re happy or sad. Near the end of a disastrous year, we need these inspirational moments now more than ever.

* * *

“Our life is a faint tracing on the surface of mystery.” – Pilgrim at Tinker Creek

Annie Dillard was only 28 when she wrote this stunning 20th century answer to Thoreau (the subject of her master’s thesis), which is both a perspicacious journal of journeying through the immediately accessible wild near her bucolic Southwestern Virginia perch and a daringly honest entreaty for consciousness and connection. Dillard’s worldview is so winningly inclusive that she can find wonder in such savage tableaux as a headless praying mantis clutching onto its mate or the larval creatures contained within a rock barnacle. The Washington Post claimed not long after Pilgrim‘s publication that the book was “selling so well on the West Coast and hipsters figure Annie Dillard’s some kind of female Castaneda, sitting up on Dead Man’s Mountain smoking mandrake roots and looking for Holes in the Horizon her guru said were there.” But Pilgrim, inspired in part by Colette’s Break of Day, is far from New Age nonsense. The book’s wise and erudite celebration of nature and spirituality was open and inspiring enough to charm even this urban-based secular humanist, who desperately needed a pick-me-up and a mandate to rejoin the world after a rapid-fire series of personal and political and romantic and artistic setbacks that occurred during the last two weeks.

For all of the book’s concerns with divinity, or what Dillard identifies as “a divine power that exists in a particular place, or that travels about over the face of the earth as a man might wander,” explicit gods don’t enter this meditation until a little under halfway through the book, where she notes jokingly how gods are often found on mountaintops and observes that God is an igniter as well as a destroyer, one that seeks invisibility for cover. And as someone who does not believe in a god and who would rather place his faith in imaginative storytelling and myth than in the superstitions of religious ritual, I could nevertheless feel and accept the spiritual idea of being emotionally vulnerable while traversing some majestic terrain. Or as Pascal wrote in Pensées 584 (quoted in part by Dillard), “God being thus hidden, every religion which does not affirm that God is hidden, is not true, and every religion which does not give the reason of it, is not instructive.”

Much of this awe comes through the humility of perceiving, of devoting yourself to sussing out every conceivable kernel that might present itself and uplift you on any given day and using this as the basis to push beyond the blinkered cage of your own self-consciousness. Dillard uses a metaphor of loose change throughout Pilgrim that neatly encapsulates this sentiment:

It is dire poverty indeed when a man is so malnourished and fatigued that he won’t stoop to pick up a penny. But if you cultivate a healthy poverty and simplicity, so that finding a penny will literally make your day, then, since the world is in fact planted in pennies, you have with your poverty bought a lifetime of days. It is that simple. What you see is what you get.

This is not too far removed from Thoreau’s faith in seeds: “Convince me that you have a seed there, and I am prepared to expect wonders.” The smug and insufferable Kathryn Schulzes of our world gleefully misread this great tradition of discovering possibilities in the small as arrogance, little realizing how their own blind and unimaginative hubris glows with crass Condé Nast entitlement as they fail to observe that Thoreau and Dillard were also acknowledging the ineluctable force of a bigger and fiercer world that will carry on with formidable complexity long after our dead bodies push up daisies. Faced with the choice of sustaining a sour Schulz-like apostasy or receiving every living day as a gift, I’d rather risk the arrogance of dreaming from the collected riches of what I have and what I can give than the gutless timidity of a prescriptive rigidity that fails to consider that we are all steeped in foolish and inconsistent behavior which, in the grand scheme of things, is ultimately insignificant.

Dillard is guided just as much by Heisenberg’s uncertainty principle as she is by religious and philosophical texts. The famous 1927 principle, which articulates how you can never know a particle’s position and momentum at the same time, is very much comparable to chasing down some hidden deity or contending with some experiential palpitations when you understand that there simply is no answer, for one can feel but never fully comprehend the totality in a skirmish with Nature. It accounts for Dillard frequently noting that the towhee chirping on a treetop or the muskrat she observes chewing grass on a bank for forty minutes never sees her. In watching these amazing creatures, completely oblivious to her own human vagaries, carry on with their lives, Dillard reminds us that this is very much the state of Nature, whether human or animal. If it is indeed arrogance to find awe and humility in this state of affairs, as Dillard and Thoreau clearly both did, then one’s every breath may as well be a Napoleonic puff of the chest.
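For the record, the principle as usually stated (a standard textbook form, not Dillard's wording) bounds the product of the uncertainties in a particle's position and momentum:

```latex
\Delta x \, \Delta p \;\geq\; \frac{\hbar}{2}
```

The more precisely one quantity is pinned down, the fuzzier the other must become, which is exactly the "no final answer" quality Dillard borrows for her skirmishes with Nature.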

Dillard is also smart and expansive enough to show us that, no matter where we reside, we are fated to brush up against the feral. She points to how arboreal enthusiasts discovered a fifteen-foot ailanthus tree growing from a garage in the Lower Bronx and how New York must spend countless dollars each year to rid its underground water pipes of roots. Such realities are often contended with out of sight and out of mind, even as the New York apartment dweller battles cockroaches, but the reminder is another useful point for why we must always find the pennies and dare to dream and wander and take in, no matter what part of the nation we dwell in.

Another refreshing aspect of Pilgrim is the way in which Dillard confronts her own horror of fecundity. Yes, even this graceful ruminator has the decency to confess her hangups about the unsettling rapidity with which moths lay their eggs in vast droves. She stops short of truly confronting “the pressure of birth and growth” that appalls her, shifting to plants as a way of evading animals and then retreating back to the blood-pumping phylum to take in blood flukes and aphid reproduction more as panorama than as something to be felt. This volte-face isn’t entirely satisfying. On the other hand, Dillard is also bold enough to scoop up a cup of duck-pond water and peer at monostyla under a microscope. What this tells us is that there are clear limits to how far any of us are willing to delve, yet I cannot find it within me to chide Dillard too harshly for a journey she was not quite willing to take, for this is an honest and heartfelt chronicle.

While I’ve probably been “arrogant” in retreating at length into my past in an effort to articulate how Dillard’s book so moved me, I would say that Pilgrim at Tinker Creek represents a third map for my adult years. It is a true work of art that I am happy to pin to the walls of my mind, which seem more reliable than any childhood bedroom. This book has caused me to wonder why I have ignored so much and has demanded that I open myself up to any penny I could potentially cherish and ponder what undiscoverable terrain I might deign to take in as I continue to walk this earth. I do not believe in a god, but I do feel with all my heart that one compelling reason to live is to fearlessly approach all that remains hidden. There is no way that you’ll ever know or find everything, but Dillard’s magnificent volume certainly gives you many good reasons to try.

When I first made my bold belly flop into the crisp waters of Ralph Ellison’s deep pool earlier this year, I felt instantly dismayed that it would be a good decade before I could perform thoughtful freestyle in response to his masterpiece Invisible Man (ML Fiction #19). As far as I’m concerned, that novel’s vivid imagery, beginning with its fierce and intensely revealing battle royal scene and culminating in its harrowing entrapment of the unnamed narrator, stands toe-to-toe with Saul Bellow’s The Adventures of Augie March as one of the most compelling panoramas of mid-20th century American life ever put to print, albeit one presented through a more hyperreal lens.

But many of today’s leading writers, ranging from Ta-Nehisi Coates to Jacqueline Woodson, have looked more to James Baldwin as their truth-telling cicerone, that fearless sage whose indisputably hypnotic energy was abundant enough to help any contemporary humanist grapple with the nightmarish realities that America continues to sweep under its bright plush neoliberal rug. At a cursory glance, largely because Ellison’s emphasis was more on culture than overt politics, it’s easy to see Ellison as a complacent “Maybe I’m Amazed” to Baldwin’s gritty “Cold Turkey,” especially when one considers the risk-averse conservatism which led to Ellison being attacked as an Uncle Tom during a 1968 panel at Grinnell College, along with his selfish refusal to help emerging African-American authors after his success. But according to biographer Arnold Rampersad, Baldwin believed Ralph Ellison to be the angriest person he knew. And if one dives into Ellison’s actual words, Shadow and Act is an essential volume, one that includes one of the most thrilling Molotov cocktails ever pitched into the face of a clueless literary critic and that is often just as potent and as lapel-grabbing as Baldwin’s The Fire Next Time.

For it would seem that while Negroes have been undergoing a process of “Americanization” from a time preceding the birth of this nation — including the fusing of their blood lines with other non-African strains — there has been a stubborn confusion as to their American identity. Somehow it was assumed that the Negroes, of all the diverse American peoples, would remain unaffected by the climate, the weather, the political circumstances — from which not even slaves were exempt — the social structures, the national manners, the modes of production and the tides of the market, the national ideals, the conflicts of values, the rising and falling of national morale, or the complex give and take of acculturalization which was undergone by all others who found their existence within the American democracy.

Ellison, however, was too smart and too wide a reader to confine these sins of dehumanization to their obvious targets. Like Baldwin and Coates and Richard Wright, Ellison looked to France for answers and, while never actually residing there, he certainly counted André Malraux and Paul Valéry among his chief influences. In writing about Richard Wright’s Black Boy, Ellison wisely singled out critics who failed to consider the full extent of African-American humanity even as they simultaneously demanded an on-the-nose and unambiguous “explanation” of who Wright was. (And it’s worth noting that Ellison himself, who was given his first professional writing gig by Wright, was also just as critical of Wright’s ideological propositions as Baldwin was.) Ellison described how “the prevailing mood of American criticism has so thoroughly excluded the Negro that it fails to recognize some of the most basic tenets of Western democratic thought when encountering them in a black skin” and deservedly excoriated whites for seeing Paul Robeson and Marian Anderson merely as the ne plus ultra of African-American artistic innovation rather than the beginning of a great movement.

At issue, in Ellison’s time and today, is the degree to which any individual is allowed to express himself. And Ellison rightly resented any force that would stifle this, whether it be the lingering dregs of Southern slavery telling the African-American how he must act or who he must be in telling his story, or the myopic critics who would gainsay any voice by way of their boxlike assumptions about other Americans. One sees this unthinking lurch towards authoritarianism today with such white supremacists as Jonathan Franzen, Lionel Shriver, and the many Brooklyn novelists who, despite setting their works in gentrified neighborhoods still prominently populated by African-Americans, fail to include, much less humanize, the black people who still live there.

“White supremacist” may seem like a harshly provocative label for any bumbling white writer who lacks the democratic bonhomie to leave the house and talk with other people and consider that those who do not share his skin color may indeed share more common experience than presumed. But if these writers are going to boast about how their narratives allegedly tell the truth about America while refusing to accept challenge for their gaping holes and denying the complexity of vital people who make up this great nation, then it seems apposite to bring a loaded gun to a knife fight. If we accept Ellison’s view of race as “an irrational sea in which Americans flounder like convoyed ships in a gale,” then it is clear that these egotistical, self-appointed seers are buckling on damaged vessels hewing to shambling sea routes mapped out by blustering navigators basking in white privilege, hitting murky ports festooned with ancient barnacles that they adamantly refuse to remove.

Franzen, despite growing up in a city in which half the population is African-American, recently told Slate‘s Isaac Chotiner that he could not countenance writing about other races because he has not loved them or gone out of his way to know them and thus excludes non-white characters from his massive and increasingly mediocre novels. Shriver wrote a novel, The Mandibles, in which the only black characters are (1) Luella, bound to a chair and walked with a leash, and (2) Selma, who speaks in a racist Mammy patois (“I love the pitcher of all them rich folk having to cough they big piles of gold”). She then had the effrontery to deliver a keynote speech at the Brisbane Writers Festival arguing for the right to “try on other people’s hats,” failing to understand that creating dimensional characters involves a great deal more than playing dress-up at the country club. She quoted from a Margot Kaminski review of Chris Cleave’s Little Bee that offered the perfectly reasonable consideration, one that doesn’t deny an author’s right to cross boundaries, that an author may wish to take “special care…with a story that’s not implicitly yours to tell.” Such forethought clearly means constructing an identity that is more human rather than crassly archetypal, an eminently pragmatic consideration on how any work of contemporary art should probably reflect the many identities that make up our world. But for Shriver, a character should be manipulated at an author’s whim, even if her creative vagaries represent an impoverishment of imagination. For Shriver, inserting another nonwhite, non-heteronormative character into The Mandibles represented “issues that might distract from my central subject matter of apocalyptic economics.” Which brings us back to Ellison’s question of “Americanization” and how “the diverse American peoples” are indeed regularly affected by the decisions of those who uphold the status quo, whether overtly or covertly.

Writer Maxine Beneba Clarke bravely confronted Shriver with the full monty of this dismissive racism and Shriver responded, “When I come to your country. I expect. To be treated. With hospitality.” And with that vile and shrill answer, devoid of humanity and humility, Shriver exposed the upright incomprehension of her position, stepping from behind the arras as a kind of literary Jan Smuts for the 21st century.[1]

[1] To this litany of country bumpkin calumnies from a Literary Establishment that purports to Know All, one must also include the superficial contributions of the wildly overrated Joyce Carol Oates, whose novel The Sacrifice demonstrated equally ignoble failings to consider complex identity and authenticity in fictionalizing the Tawana Brawley case. These inadequacies were smartly called out by Roxane Gay in The New York Times Book Review. Rather than consider how she had failed as an author, Oates, true to the clueless xenophobia she regularly chirrups on Twitter, suggested that Gay had failed.

If this current state of affairs represents a bristling example of Giambattista Vico’s corsi e ricorsi, and I believe it does, then Ellison’s essay, “Twentieth-Century Fiction and the Black Mask of Humanity,” astutely demonstrates how this cultural amaurosis went down before, with 20th century authors willfully misreading Mark Twain, failing to see that Huck’s release of Jim represented a moment that not only involved recognizing Jim as a human being, but admitting “the evil implicit in his ‘emancipation’” as well as Twain accepting “his personal responsibility in the condition of society.” With great creative power comes great creative responsibility. Ellison points to Ernest Hemingway scouring The Adventures of Huckleberry Finn merely for its technical accomplishments rather than this moral candor and how William Faulkner, despite being “the greatest artist the South has produced,” may not have been quite the all-encompassing oracle, given that The Unvanquished‘s Ringo is, despite his loyalty, devoid of humanity. In another essay on Stephen Crane, Ellison reaffirms that great art involves “the cost of moral perception, of achieving an informed sense of life, in a universe which is essentially hostile to man and in which skill and courage and loyalty are virtues which help in the struggle but by no means exempt us from the necessary plunge into the storm-sea-war of experience.” And in the essays on music that form the book’s second section (“Sound and the Mainstream”), Ellison cements this ethos with his personal experience growing up in the South. If literature might help us to confront the complexities of moral perception, then the lyrical, floating tones of a majestic singer or a distinctive cat shredding eloquently on an axe might aid us in expressing it.
And that quest for authentic expression is forever in conflict with audience assumptions, as seen with such powerful figures as Charlie Parker, whom Ellison describes as “a sacrificial figure whose struggles against personal chaos…served as entertainment for a ravenous, sensation-starved, culturally disoriented public which had but the slightest notion of its real significance.”

What makes Ellison’s demands for inclusive identity quite sophisticated is the vital component of admitting one’s own complicity, an act well beyond the superficial expression of easily forgotten shame or white guilt that none of the 20th or 21st century writers identified here have had the guts to push past. And Ellison wasn’t just a writer who pointed fingers. He held himself just as accountable, as seen in a terrific 1985 essay called “An Extravagance of Laughter” (not included in Shadow and Act, but found in Going to the Territory), in which Ellison writes about how he went to the theatre to see Jack Kirkland’s adaptation of Erskine Caldwell’s Tobacco Road. (I wrote about Tobacco Road in 2011 as part of this series and praised the way that this still volatile novel pushes its audience to confront its own prejudices against the impoverished through remarkably flamboyant characters.) Upon seeing wanton animal passion among poor whites on the stage, Ellison burst into an uncontrollable paroxysm of laughter, which emerged as he was still negotiating the rituals of New York life shortly after arriving from the South. Ellison compared his reaction, which provoked outraged leers from the largely white audience, to an informal social ceremony he observed while he was a student at Tuskegee, one involving a set of enormous whitewashed barrels labeled FOR COLORED placed in public space. If an African-American felt an overwhelming desire to laugh, he would thrust his head into the pit of the barrel and do so. Ellison observes that these were African-Americans “who in light of their social status and past condition of servitude were regarded as having absolutely nothing in their daily experience which could possibly inspire rational laughter.” And the expression of this inherently human quality, despite being a cathartic part of reckoning with identity and one’s position in the world, was nevertheless positioned out of sight and thus out of mind.

When I took an improv class at UCB earlier this year, I had an instructor who offered rather austere prohibitions against any strain of humor considered “too dark” or “punching down,” which would effectively disqualify both Tobacco Road and the Tuskegee barrel ritual that Ellison describes.[2] These restrictions greatly frustrated me and a few of my classmates, who didn’t necessarily see the exploration of edgy comic terrain as a default choice, but merely one part of asserting an identity inclusive of many perspectives. I challenged the notion of confining behavior to obvious choices and ended up getting a phone call from the registrar, who was a smart and genial man and with whom I ended up having a friendly and thoughtful volley about comedy. I had apparently been ratted out by one student, who claimed that I was “disrupting” the class when I was merely inquiring about my own complicity in establishing base reality. In my efforts to further clarify my position, I sent a lengthy email to the instructor, one referencing “An Extravagance of Laughter,” and pointed out that delving into the uncomfortable was a vital part of reckoning with truth and ensuring that you grew your voice and evolved as an artist. I never received a reply. I can’t say that I blame him.

[2] Earlier this year, in The Baffler, Ben Schwartz offered some cogent and provocative thoughts on the problems with such ultimatums. His essay is well worth reading if you’re interested in painting outside the lines as an artist or a thinker.

Ellison’s inquiry into the roots of how we find common ground with others suggests that we may be able to do so if we (a) acknowledge the completeness of other identities and (b) allow enough room for necessary catharsis and the acknowledgment of our feelings and our failings as we take baby steps towards better understanding each other.

The most blistering firebomb in the book is, of course, the infamous essay “The World and the Jug,” which demonstrates just what happens when you assume rather than take the time to know another person. It is a refreshingly uncoiled response that one could not imagine being published in this age of “No haters” reviewing policies and genial retreat from substantive subjects in today’s book review sections. Reacting to Irving Howe’s “Black Boys and Native Sons,” Ellison condemns Howe for not seeing “a human being but an abstract embodiment of living hell” and truly hammers home the need for all art to be considered on the basis of its human experience rather than the spectator’s constricting inferences. Howe’s great mistake was to view all African-American novels through the prism of the “protest novel,” which effectively revealed his own biases: against what black writers actually had to say, and in favor of certain prerigged ideas that Howe expected them to say. “Must I be condemned because my sense of Negro life was quite different?” writes Ellison in response to Howe roping him in with Richard Wright and James Baldwin. And Ellison pours on the vinegar by not only observing how Howe self-plagiarized passages from previous reviews, but also how his intractable ideology led him to defend the “old-fashioned” violence contained in Wright’s The Long Dream, which, whatever its merits, clearly did not keep current with the changing dialogue of the time.

Shadow and Act, with its inclusion of interviews and speeches and riffs on music (along with a sketch of a struggling mother), may be confused with a personal scrapbook. But it is, first and foremost, one man’s effort to assert his identity and his philosophy in the most cathartic and inclusive way possible. We still have much to learn from Ellison more than fifty years after these essays first appeared. And while I will always be galvanized by James Baldwin (who awaits our study in a few years), Ralph Ellison offers plentiful flagstones to face the present and the future.

SUPPLEMENT: One of the great mysteries that has bedeviled Ralph Ellison fans for decades is the identity of the critic who attacked Invisible Man as a “literary race riot.” In a Paris Review interview included in Shadow and Act, Ellison had this to say about the critic:

But there is one widely syndicated critical bankrupt who made liberal noises during the thirties and has been frightened ever since. He attacked my book as a “literary race riot.”

With the generous help of Ellison’s biographer Arnold Rampersad (who gave me an idea of where the quote might be found in an email volley) and the good people at the New York Public Library, I have tracked down the “widely syndicated critical bankrupt” in question.

His name is Sterling North, best known for the 1963 children’s novel Rascal. He produced widely popular (and rightly forgotten) children’s books while reviewing books for various newspapers. North was such a vanilla-minded man that he called comics “a poisonous mushroom growth” and seemed to have it in for any work of art that dared to do something different — or that didn’t involve treacly narratives about raising baby raccoons.

And then, in the April 16, 1952 issue of the New York World-Telegram, he belittled Ellison’s masterpiece, writing these words:

This is one of the most tragic and disturbing books I have ever read. For the most part brilliantly written and deeply sincere, it is, at the same time, bitter, violent and unbalanced. Except for a few closing pages in which the author tries to express something like a sane outlook on race relations, it is composed largely of such scenes of interracial strife that it achieves the effect of one continuous literary race riot. Ralph Ellison is a Negro with almost as much writing talent as Richard Wright. Like his embittered hero (known only as “I” throughout the book), Mr. Ellison received scholarships to help him through college, one from the State of Oklahoma which made possible three years at the Tuskegee Institute, and one from the Rosenwald Foundation.

If Mr. Ellison is as scornful and bitter about this sort of assistance as he lets his “hero” be, those who made the money available must wonder if it was well spent.

North’s remarkably condescending words offer an alarming view of the cultural oppression that Ellison was fighting against and serve as further justification for Ellison’s views in Shadow and Act. Aside from his gross mischaracterization of Ellison’s novel, there are North’s troubling assumptions: that Ellison should be grateful in the manner of an obsequious and servile stereotype, that he only deserves a scholarship if he writes a novel fitting North’s limited idea of what African-American identity should be, and that future white benefactors should think twice about granting opportunities to future uppity Ellisons.

It’s doubtful that The Sterling North Society will recognize this calumny, but this is despicable racism by any measure. A dive into North’s past also reveals So Dear to My Heart, a 1948 film adaptation of North’s Midnight and Jeremiah that reveled in Uncle Tom representations of African-Americans.

Sunset Park is a cozy part of Brooklyn trilling with children making midday escapes from big brick schools, with a few old factories that wail great threnodies whenever the moon winks a ditty about displaced residents on a cloudy night. There are robust workers and tight-knit families and banh mi bistros and bustling bakeries from which one can savor the tantalizing nectar of glorious Spanish gossip squeezing into the streets. If you are tipsy after too many pints at the Irish pubs lining the southwestern fringe, there are 24-hour donut shops serving as makeshift diners, with loquacious jacks cooking up chorizo hash for any hungry ghost in a fix.

This is the region, along with East New York and Flatlands and Bensonhurst, where Brooklyn’s true soul still shines. It remains insulated from the Williamsburg hipsters oblivious to the high rise monstrosities now being hoisted near the East River or the yuppies who cleave to Park Slope’s gluten-free stroller war zone like children keeping to the shallow end of the pool. But the motley banter rivals the bright babble bubbling five miles east in Ditmas Park and even the chatty ripples that percolate just two miles south in Bay Ridge. In Sunset Park, you can pluck the city’s most enormous plantains from bold bodega bins bulging with promise, talk to the last honest bartender at Brooklyn’s best bowling alley, or walk beneath a Buddhist temple for some of the finest vegetarian Chinese grub in the region. It is a place of repose. It is a place of fun. It is a place to live.

Yet as great and as welcoming and as improbably enduring as this part of Brooklyn is, it could have been bigger. And for a long time, it was. Until Robert Moses came along.

There are many grim tales contained within Robert A. Caro’s The Power Broker — an alarmingly large and exquisitely gripping and undeniably great and insanely obsessive masterpiece of journalism documenting the most ruthless urban planner that New York, and possibly America, has ever known. If you love New York City even one tenth as much as I do, you will find many reasons to shout obscenities out your window after reading about what Robert Moses did to this mighty metropolis. It was Moses who killed off the free aquarium, open to all, that once stood in Battery Park. It was Moses who pitted reliable mass transit lines serving regular Janes and Joes against highways designed solely for those who had the shekels to buy and maintain a car. It was Moses who believed African-Americans to be “dirty” and who, in building Riverside Park, stiffed the Harlem section of playgrounds (seventeen on the West Side; one in Harlem) and football fields (five to one). Moses was so casually racist that most of the parks he built, the parks that secured his popularity, served white middle-class New Yorkers. But working-class families needed these parks more and were often reduced to opening a fire hydrant in the streets and playing in the gutter during a hot summer.

Not a single person in power will ever change the Manhattan skyline in the irreversible way that Moses did. Robert Moses had massive ambition, savvy savagery, limitless arrogance and energy, improbably large coffers that he willed together through a bridge bond ploy, a knack for grabbing and holding onto power, and a sick talent for persuading some of the most powerful figures of the 20th century to sign crooked agreements and/or get steamrolled into deals that screwed them over in quite profound ways.

For me, one of the acts that sums Moses up is the way in which he ripped out a major part of Sunset Park’s soul by erecting the Gowanus Expressway above Third Avenue. This toxic concrete barrier remains as cold and as gray and as unwelcoming as the bleakest rainstorm in December. To this very day, you can still hear the expressway’s thundering traffic as far away as Sixth Avenue. During a recent walk along Third Avenue on a somewhat chilly afternoon, I surveyed Moses’s handiwork and was nearly mowed down by a minivan barreling out of Costco, its back bulging with wasteful mass-produced goods, as a mad staccato honk pierced my ears with a motive that felt vaguely murderous.

Robert Moses wanted to make New York a city for automobiles, even though he never learned how to drive. And in some of the neighborhoods where his blots against natural urban life remain, his dogged legacy against regular people still persists.

Sunset Park’s residents had begged Moses to build the expressway over Second Avenue. This was closer to the water and the industrial din and might have preserved the many small businesses and happy homes that once punctuated Third Avenue. But Moses, citing the recently opened subway that now serves the D, N, and R underneath Fourth Avenue and the available support beams from the soon-to-be-demolished El, was determined to raise a freeway on Third Avenue that he claimed was much cheaper, even though engineers who weren’t on Moses’s payroll had observed that one mere mile of freeway looping back to the shore wouldn’t have substantially increased the cost. But Moses had fought barons before and had made a few curving compromises while constructing the Northern State Parkway. Armed with the power of eminent domain and a formidable administrative apparatus in which bulldozers and blockades could be summoned against opponents almost as fast as a modern day Seamless delivery, Moses was not about to see his vision vitiated. And if that meant calling the good parts of Sunset Park a “slum,” which it wasn’t, or spouting off any number of lies or threats to destroy perfectly respectable working class neighborhoods, then he’d do it.

As documented by Caro, the Gowanus stretched a raised subway line’s harmless Venetian-blind shadow into a dirty expanse that was nearly two and a half times as wide, wider than a football field and twice as dark. The traffic lights were so swiftly timed that one had to be a running back to sprint beneath the smog-choking blackness to the other side of the street. The condensation from the steel pillars created such a relentless dripping that it transformed this once sunny thoroughfare into a dirt-clogged river Styx for cars. The cost was seven movie theaters, dozens of restaurants, endless mom and pop stores, butcher shops that raffled Christmas turkeys, and tidy affordable apartments — all shuttered. Moses did not plan for the increased industrial traffic that spilled into Sunset Park’s streets, just as he hadn’t for his many other freeways and bridges. Garbage and rats accumulated in the surrounding lots. There was violence and drugs and gang wars. The traffic tightened and slowed to a crawl, demanding more roads, more buildings to gut, more neighborhoods to disrupt for the worse.

Who was this man? And why was he so determined to assert his will? He fancied himself New York’s answer to Georges-Eugène Haussmann (even reusing a doughnut-shaped building for the 1964 World’s Fair that the Parisian planner himself had put together in 1867), yet didn’t begin to earn a dime for his tyranny until his forties. (He lived off his family’s money and secured early planning jobs by declining a salary.) He thought himself a poet (not an especially good one), but if he had any potential prose style, it turned sour and hard and technocratic by the time he hit Oxford and received his doctorate at Columbia. He worked seemingly every hour of the day and took endless walks, memorizing the precise points where he would later build big parks and tennis courts. And he loved to swim, taking broad strokes well beyond the shores in his sixties and seventies with an endurance and strength that crushed men who were two decades younger. Small wonder that Moses gave the city so many public pools.

After I finished reading The Power Broker, I wanted to know more. I found myself plunging into the collected works of Jane Jacobs (Jacobs’s successful battle to save Washington Square Park was left out by Caro due to the enormity of The Power Broker’s original manuscript), as well as Anthony Flint’s excellent volume Wrestling with Moses, which documents the battles between Moses and Jacobs. An extremely useful volume edited by Hilary Ballon and Kenneth T. Jackson, Robert Moses and the Modern City, may be the best overview of every Moses project (and attempts, not entirely successfully, to refute some of Caro’s claims). And there is a wonderful graphic novel from Pierre Christin and Olivier Balez, Robert Moses: The Master Builder of New York City, which I recommend for anyone who doesn’t have enough time to read Caro’s 1,200-page biography written in very small print (although you really should read it).

I wanted to know how a man like Moses could operate so long with so few challenging him. His behavior often resembled a spoiled infant braying for his binky. When faced with an authority figure, Moses would often threaten to resign from a position until he got his way. Moses used this tactic so frequently that Mayor La Guardia once sent him a note reading, “Enclosed are your last five or six resignations; I’m starting a new file.” City corporation counsel Paul Windels went further, creating a pad of forms reading “I, Robert Moses, do hereby resign as _______ effective __________,” which further infuriated Moses.

The answer, of course, was through money and influence that Moses had raised through a bridge bond scheme floated through the Triborough Bridge and Tunnel Authority, with Moses as Chairman:

Moses wanted banks to be so anxious to purchase Triborough bonds that they would use all of their immense power to force elected officials to give his public works proposals the approval that would result in their issuance. So although the safety of the banks’ money was already amply assured by Triborough’s current earnings (so great that each year the Authority collected far more money than it spent), by the irrevocable covenants guaranteeing that tolls could never be removed without the bondholders’ consent, and by Triborough’s monopoly, also irrevocable, that guaranteed them that if any future intracity water crossing were built, they would share in its tolls, too, Moses provided them with additional assurances. He maintained huge cash reserves — “Fantastic,” says Jackson Phillips, director of municipal research for Dun and Bradstreet; “the last time I looked they had ten years’ interest on reserve” — and when he floated the Verrazano bonds he agreed to lay aside — in addition to the existing reserves! — 15 percent ($45,000,000) of the cash he received for the new bond issue, and not touch it until the bridge was open and operating five years later. Purchasers of the Verrazano bonds could be all but certain that they could collect their interest every year even if the bridge never collected a single toll. Small wonder that Phillips says, “Triborough’s are just about the best bonds there are.” Wall Streeters may believe that “any investment is a bet,” but Robert Moses was certainly running the safest game in town.

In other words, Moses pulled off one of the most sinister financial games in New York history. The Triborough Authority could not only collect tolls on its bridges and capitalize on these receipts by issuing revenue bonds, which would in turn generate considerable income for Moses to fund his many public works projects, but it was capable of spending more money than the City of New York. Which meant that the city often had to come crawling back to Moses. And if the city or the state wanted to audit the Triborough Authority, this operation was so incredibly complicated that it would require at least fifty accountants working full-time for a year in order to comprehend it. Government did not have this kind of money to place safeguards against Moses. Moreover, it needed Moses’s financial assistance in order to provide for the commonweal.

It wasn’t until 1968 that Governor Nelson Rockefeller and Mayor John Lindsay put an end to these remarkable shenanigans by siphoning tolls into the newly created Metropolitan Transportation Authority. The bondholders might have sued over this. It was, after all, unconstitutional to uproot existing contractual obligations. But Rockefeller’s brother David happened to be the head of Chase Manhattan Bank. And Chase was the largest TBTA bondholder. In a glaring case of “it’s not what you know, it’s who you know,” the Triborough Authority as a puppet organization for Moses was finished. Moses was forced to abandon his role, and the man’s political hold on New York was effectively over after four decades of relentless building and endless resignation threats.

It seemed a fitting end for a man who had maintained such a stranglehold over such a large area. Six years later, Robert Caro’s biography appeared. Moses wrote a 23-page response shortly after the book’s publication. Caro’s rebuttal was five paragraphs, concluding with this one:

It is slightly absurd (but typical of Robert Moses) to label as without documentation a book that has 83 solid pages of single-spaced, small-type notes and that is based on seven years of research, including 522 separate interviews.

Before he became famous for delineating “the paranoid style in American politics” and honing every principled bone against the feverish anti-intellectualism one now sees embodied in everything from long-standing philistine Dan Kois decrying “eating his cultural vegetables” to lunatic presidential candidate Ted Cruz declaring gluten-free meals as a politically correct “social experiment,” historian Richard Hofstadter spent four years on a fiercely independent book that would go on to sell close to a million copies. The American Political Tradition was a Plutarchian overview of illustrious American figures ranging from vivacious abolitionist Wendell Phillips to Woodrow Wilson as closeted conservative. It was aimed at winning over a high-minded American public. Like William Appleman Williams, Hofstadter was very much following in Charles Beard’s footsteps, although this historian hoped to march to his own interpretive drum. Reacting to the toxic McCarthyism at the time, Hofstadter’s cautious defense of old school American liberalism, with the reluctant bulwark hoisted as he poked holes into the foibles of celebrated icons, saddled him with the label of “consensus historian.” With each subsequent volume (most notably The Age of Reform), Hofstadter drifted further away from anything close to a scorching critique of our Founders as hardliners enforcing their economic interests to a more vociferous denouncement of agrarian Populists and numbnuts standing in the way of erudite democratic promise. Yet even as he turned more conservative in later years, Hofstadter insisted that his “assertion of consensus history in 1948 had its sources in the Marxism of the 1930s.”

Such adamantine labels really aren’t fair to Hofstadter’s achievements in The American Political Tradition. The book is by no means perfect, but its Leatherman Wave-like dissection of American history unfolds with some sharp and handy blades. Hofstadter is strangely reluctant to out Andrew Jackson as a demagogue (“He became a favorite of the people, and might easily come to believe that the people chose well.”) and far too forgiving of John C. Calhoun, a rigid bloviator with a harsh voice who was one of slavery’s biggest cheerleaders and whose absolutist stance against tariffs under the guise of moderation would later inspire the South to consider secession as a legitimate nuclear option. (“He talked on the most abstruse subjects with the guileless simplicity of a prattling child,” assessed Thomas Edward Watson in The Life and Times of Andrew Jackson. This juicy quote, served up by Hofstadter, comes from one of Calhoun’s admirers!) Yet Hofstadter at his best slices with a necessary critical force into many hallowed patriarchs. For it is the sum of their variegated and contradictory parts that has caused some to view the American trajectory in Manichean terms.

One of the book’s standout chapters is Hofstadter’s shrewd analysis of Lincoln as an exceptionally formidable man who dialed down his egalitarian ardor to near zero during his very rapid political rise. In just four years, Lincoln advanced from an obscure attorney in Illinois to a prominent party leader in that same state’s House of Representatives. But Hofstadter cogently argues that Lincoln was far from the outspoken abolitionist who would later lay down some very strong words against those who would deny other people freedom. Lincoln not only kept his enemies closer than his friends, but he was exceptionally careful with his rhetoric, even though one eye-popping 1836 declaration proposed extending suffrage to women. (One of Lincoln’s fascinating strategies involved sneaking bold political positions into bills or campaigning that often reflected an altogether different goal, such as an oft cited passage protesting income inequality in a defense of protective tariffs. The quote is cited by Hofstadter.) Much as Franklin D. Roosevelt was very savvy about letting his political opponents make the first move before he acted, Lincoln used the Declaration of Independence’s very text as ammunition and inspiration for his justification of abolition, which came much later — Lincoln’s first public condemnation of slavery arrived when he was forty-five — than Lincoln’s many admirers are often willing to admit.

Hofstadter points out that Lincoln’s seeming contradiction between revolutionary politics and pragmatic interpretation of the law was not especially peculiar, but part of a nuts-and-bolts perpetuation of an ongoing political tradition, one that can be seen in Lincoln’s hard maneuvering with the 1851 conditional loan he issued to his stepbrother John D. Johnston. Lincoln’s famous House Divided speech was masterful rhetoric urging national reconciliation of the slavery issue, but he didn’t exactly go out of his way to out himself as an abolitionist. Hofstadter points out that in 1858, seemingly Honest Abe spoke in two entirely different manners about racial equality in Chicago and in Charleston (see the second paragraph of his first speech). Yet these observations not only illustrate Lincoln’s political genius, but invite parallels to Lyndon Johnson’s brilliant and equally contradictory engineering in passing the 1957 Civil Rights Act (perhaps best chronicled in a gripping 100 page section of Robert A. Caro’s excellent Master of the Senate). The American political tradition, which Hofstadter identifies as a continuity with capitalist democratic principles, is seen today with Hillary Clinton struggling against a young population hungry for progressive change unlikely to happen overnight, despite Bernie Sanders’s valiant plans and the immediate need to rectify corporate America’s viselike hold on the very democratic principles that have sustained this nation for more than two hundred years.

Yet this is the same tradition that has given us long years without a stabilizing central bank, the Trail of Tears, the Civil War, the Credit Mobilier scandal, robber barons, and Hoover’s unshakable faith that “prosperity was just around the corner,” among many other disgraces. Hofstadter is thankfully not above condemning laissez-faire absolutism, such as Grover Cleveland’s unrealistic assumption that “things must work out smoothly without government action, or the whole system, coherent enough in theory, would fall from the weakness of its premises” or the free silver campaign that propelled the bombastic William Jennings Bryan into an improbable presidential candidacy. Hofstadter describes Bryan as “a boy who never left home,” and one can see some of Bryan’s regrettable legacy in the red-faced fulminations of a certain overgrown boy who currently pledges to make America great again. A careless and clumsy figure like Bryan was the very antithesis of Lincoln. Bryan failed to see difficult political tasks through to their necessary end. He would adopt principles that he once decried. His well-meaning efforts amounted to practically nothing. Think of Bryan as Fargo‘s Jerry Lundegaard to Lincoln’s Joe Girard. Hofstadter suggests that “steadfast and self-confident intelligence,” perhaps more important than courage and sincerity, was the very quality that Bryan and this nation so desperately needed. Yet in writing about Teddy Roosevelt and pointing to the frequency of “manly” and “masterful” in his prose, Hofstadter implies that these “more perfect” personal qualities for the political tradition “easily became transformed into the imperial impulse.”

This is, at times, a very grumpy book. One almost bemoans the missed opportunity to enlist the late Andy Rooney to read aloud the audio version. But it is not without its optimism. Hofstadter places most of his faith in abolitionist agitator Wendell Phillips. Even after defending Phillips from numerous historical condemnations and pointing to Phillips’s “higher level of intellectual self-awareness,” Hofstadter sees the agitator as merely “the counterweight to sloth and indifference.” Yet Hofstadter, at this young stage of his career, isn’t quite willing to write off agitators. He does point to why Phillips was a necessary and influential force providing equilibrium:

But when a social crisis or revolutionary period at last matures, the sharp distinctions that govern the logical and doctrinaire mind of the agitator become at one with the realities, and he appears overnight to the people as a plausible and forceful thinker. The man who has maintained that all history is the history of class struggles and has appeared so wide of the mark in times of class collaboration may become a powerful leader when society is seething with unresolved class conflict; the man who has been valiantly demanding the abolition of slavery for thirty years may become a vital figure when emancipation makes its appearance as a burning issue of practical politics. Such was the experience of Wendell Phillips: although he never held office, he became one of the most influential Americans during the few years after the fall of Fort Sumter.

The question of whether you believe Hofstadter to be a consensus historian or not may depend on how much you believe that he viewed the American political tradition much like two Lazaruses forever duking it out for existence in the old Star Trek episode “The Alternative Factor.” He certainly sees a nation of political pragmatists and obdurate agitators caught in an eternal deadlock, which is not too far from the progressive historians who styled their interpretations on class conflict. But his fine eye for ferreting out the Burkean undertow within Woodrow Wilson’s putative liberalism or exposing how Hoover’s faith in unregulated business had him quivering with disbelief after Black Thursday suggests a historian who is interested in countering ideological bromides. Perhaps if Hofstadter had stretched some of his chapters across a massive book, his reputation as a consensus historian wouldn’t have been the subject of so many heated arguments among political wonks.

Fortunately, the next Modern Library essay in this series will investigate how one man shifted his politics to serve his own ends and reshaped a major metropolis through the iron will of his personality. That very long and very great book may be the key that turns the consensus lock. It will certainly tell us a lot more about political power.

Herbert Croly’s 1909 book is a thoughtful and fairly ambitious effort to reconcile Hamiltonian and Jeffersonian thought, but does Croly have any real application in today’s cartoonish political atmosphere? Can any American “belong” to a government in the 21st century?

Before The New Republic devolved under Chris Hughes into a half-worthy husk of knee-jerk platitudes just a few histrionic clickbait headlines shy of wily Slate reductionism, it was a formidable liberal magazine for many decades, courageous enough to take real stands while sustaining vital dialogue about how and when government should intercede in important affairs. The source of this philosophical thrust, as duly documented by Franklin Foer, was the greatly diffident son of a prominent newspaperman, an unlikely progenitor who entered and exited Harvard many times without ever finishing, someone who suffered from severe depression and who, for a time, didn’t know what to do with his life other than play bridge and tennis and write about obscure architecture. But Croly found it in him to spill his views about democracy’s potential, what he called the “New Nationalism,” into a 1909 book called The Promise of American Life, which served as something of a manifesto for the early 20th century Progressives and became a cult hit among political wonks at the time. It partially inspired Theodore Roosevelt, who was proudly name-checked by Croly as “a Hamiltonian with a difference,” to initiate his ill-fated 1912 Bull Moose campaign as an outsider presidential candidate. (Historians have argued over the palpable influence of Croly’s book on Roosevelt, but it’s possible that, had Croly not confirmed what Roosevelt had already been thinking about, Roosevelt might not have entered the 1912 race as ardently as he did. With a more united Republican coalition against Wilson, America might very well have carried on with a second Taft term, with an altogether different involvement in World War I. Taft’s notable rulings as Chief Justice of the Supreme Court, which included extending executive power and broadening the scope of police evidence, may not have been carried out in the 1920s. A book is often more of a Molotov shattering upon history’s turf than we are willing to accept.)

Croly’s book touched a nerve among a small passionate group. One couple ended up reading Croly’s book aloud to each other during their honeymoon (leaving this 21st century reader, comparing Croly’s thick “irremediable”-heavy prose style against now all too common sybaritic options, to imagine other important activities that this nubile pair may have missed out on). The newly married couple was Willard Straight and Dorothy Whitney. They had money. They invited Croly to lunch. The New Republic was formed.

So we are contending with a book that not only created an enduring magazine and possibly altered the course of American history, but one that had a profound impact on the right elite at the right time. It was thus a tremendous surprise to discover a book that greatly infuriated me on both occasions I read it, once causing me to hurl it with high indignant velocity against a wall, largely because this gushing early 20th century idealist failed to foresee the rise of Nazism, the despicable marriage of racism and police brutality, growing income inequality, corporate oligarchy, draconian Common Core educational standards, and dangerous demagogues like George Wallace and Donald Trump.

But it is also important to remember that Croly wrote this book before radio, television, the Internet, women’s suffrage, two world wars, the Great Depression, smartphones, outrage culture, and 9/11. And it is never a good idea to read an older book, especially one of a political nature, without considering the time that it was written. I did my best to curb my instincts to loathe Croly for what he could not anticipate, for his larger questions of how power aligns itself with the democratic will of the people are still very much worth considering. Croly is quite right to identify the strange Frankenstein monster of Alexander Hamilton’s pragmatic central government and Thomas Jefferson’s rights of man — the uniquely American philosophical conflict that has been the basis of nearly every national conflict and problem that has followed — as a “double perversion” of our nation’s potential, even if Croly seems unwilling to consider that some “perversions” are necessary for an evolving democratic republic and he is often too trusting of executive authority and the general public’s obeisance to it. That these inquiries still remain irreconcilable (and are perverted further still by crass politicians who bellow about how to “make America great again” as they eject those who challenge them from the room) some 107 years after the book’s publication speaks to both the necessity and the difficulty of the question.

I’ve juxtaposed Croly’s meek-looking law clerk mien against George Bellows’s famous boxing painting (unveiled two years before Croly’s book) because there really is no better way to visualize the American individual’s relationship to her lumbering, venal, and often futile government. Croly’s solution is to call for all Americans to be actively engaged in a collaborative and faithful relationship with the nation: “to accept a conception of democracy which provides for the substantial integrity of his country, not only as a nation with an exclusively democratic mission, but as a democracy with an essentially national career.” On its face, this seems like a reasonable proposition. We all wish to belong in a democracy, to maintain fidelity to our country, and to believe that the Lockean social contract in which the state provides for the commonweal is a workable and reasonable quid pro quo. But it is also the kind of orgiastic meat-and-potatoes mantra that led both Kennedy and Reagan to evoke mythical American exceptionalism with the infamous “shining city upon a hill” metaphor. Dulcet words may make us feel better about ourselves and our nation, but we have seen again and again how government inaction on guns and a minimum wage that does not reflect contemporary living standards demand a Black Lives Matter movement and a “fight for $15.” And when one begins to unpack just what Croly wants us to give up for this roseate and wholly unrealistic Faustian bargain, we begin to see someone who may be more of a thoughtful and naive grandstander than a vital conceptual pragmatist.

Croly is right to demand that America operate with a larger administrative organ in place, some highly efficient Hamiltonian body that mitigates against “the evil effects of a loose union.” He smartly points out that such evils as slavery resulted from the American contradictions originating in the strange alliance between our poetic Jeffersonian call for Constitutional democracy and individualistic will and the many strains of populism and nationalism that followed. In his insistence on “the transformation of Hamiltonianism into a thoroughly democratic political principle,” Croly is suspicious of reformers, many of whom he singles out in a manner strikingly similar to Norman Mailer’s “Quick and Expensive Comments on the Talent in the Room.” He calls William Jennings Bryan an “ill conceived” reformer, claims the now nearly forgotten William Travers Jerome to be “lulled into repose” by traditional Jeffersonian democracy (never mind Jerome’s successful crusades against Tammany Hall corruption, regrettably overshadowed by his prosecution of Harry K. Thaw during the Stanford White murder trial), interestingly pegs William Randolph Hearst as someone motivated by endless “proclamation[s] of a rigorous interpretation of the principle of equal rights,” and holds up Teddy Roosevelt as “more novel and more radical” in his calls for a Square Deal than “he himself has probably proclaimed.”

But Croly’s position on reform is quite problematic, deeply unsettling, and often contradictory. He believes that citizens “should be permitted every opportunity to protest in the most vigorous and persistent manner,” yet he states that such protests “must conform to certain conditions” enforced by the state. While we are certainly far removed from the 1910 bombing of the Los Angeles Times building that galvanized the labor movement, as we saw with the appalling free speech cages during the 2004 Republican Convention, muzzling protesters not only attenuated their message but allowed the NYPD to set up traps for the activists, which ensured their arrest and detention — a prototype for the heavy-handed enforcement used to diminish and belittle the Occupy Wall Street movement a few years later. Croly believes that the job of sustaining democratic promise should, oddly enough, be left to legislators and executives granted all the power required and sees state and municipal governments as largely unsuccessful:

The interest of individual liberty in relation to the organization of democracy demands simply that the individual officeholder should possess an amount of power and independence adequate to the efficient performance of his work. The work of a justice of the Supreme Court demands a power that is absolute for its own special work, and it demands technically complete independence. An executive should, as a rule, serve for a longer term, and hold a position of greater independence than a legislator, because his work of enforcing the laws and attending to the business details of government demands continuity, complete responsibility within its own sphere, and the necessity occasionally of braving adverse currents of public opinion. The term of service and the technical independence of a legislator might well be more restricted than that of an executive; but even a legislator should be granted as much power and independence as he may need for the official performance of his public duty. The American democracy has shown its enmity to individual political liberty, not because it has required its political favorites constantly to seek reëlection, but because it has since 1800 tended to refuse to its favorites during their official term as much power and independence as is needed for administrative, legislative, and judicial efficiency. It has been jealous of the power it delegated, and has tried to take away with one hand what it gave with the other.

There is no room for “Act locally, think globally” in Croly’s vision. This is especially ungenerous given the many successful progressive movements that flourished decades after Croly’s death, such as the civil rights movement, which began with local sit-ins and developed into a more cogent and less ragged strain of populism than the destructive Jacksonian variety that Croly rightly calls out, especially in relation to the cavalier obliteration of the Second Bank of the United States and the Nullification Crisis of 1832, which required Henry Clay to clean up Jackson’s despotic absolutism with a compromise. On the Nullification point, Croly identifies Daniel Webster, a man who became treacherously committed to holding the Union together, as “the most eloquent and effective expositor of American nationalism,” who “taught American public opinion to consider the Union as the core and crown of the American political system,” even as he offers a beautifully stinging barb on Webster’s abolitionist betrayal with the 1850 speech endorsing the Fugitive Slave Act: “He was as much terrorized by the possible consequences of any candid and courageous dealing with the question as were the prosperous business men of the North; and his luminous intelligence shed no light upon a question, which evaded his Constitutional theories, terrified his will, and clouded the radiance of his patriotic visions.”

But Croly also promulgates a number of loopy schemes, including making representative legislatures at any level beholden to an executive who is armed with a near tyrannical ability to scuttle laws, even as he claims that voters removing representatives through referendum “will obtain and keep a much more complete and direct control over the making of their laws than that which they have exerted hitherto; and the possible desirability of the direct exercise of this function cannot be disputed by any loyal democrat.” Well, this loyal democrat, immediately summoning Lord Acton’s famous quote, calls bullshit on giving any two-bit boss that kind of absolute power. Because Croly’s baffling notion of “democracy” conjures up the terrifying image of a sea of hands raised in a Bellamy salute. On one hand, Croly believes that a democracy must secure and exercise individual rights, even as he rightly recognizes that, when people exercise these rights, they cultivate the “tendency to divide the community into divergent classes.” On the other hand, he believes that individuals should be kept on a restrictive leash:

[T]hey should not, so far as possible, be allowed to outlast their own utility. They must continue to be earned. It is power and opportunity enjoyed without being earned which help to damage the individual — both the individuals who benefit and the individuals who consent — and which tend to loosen the ultimate social bond. A democracy, no less than a monarchy or an aristocracy, must recognize political, economic, and social discriminations, but it must also manage to withdraw its consent whenever these discriminations show any tendency to excessive endurance. The essential wholeness of the community depends absolutely on the ceaseless creation of a political, economic, and social aristocracy and their equally incessant replacement.

There’s certainly something to be said about how many Americans fail to appreciate the rights that they have. Reminding all citizens of their duties to flex their individual rights may be a very sound idea. (Perhaps one solution to American indifference and political disillusion is the implementation of a compulsory voting policy with penalties, similar to what goes on in Australia.) But with a middling door prize like this handed out at the democratic dance party, why on earth would any individual want to subscribe to the American promise? Aristocrats, by their very nature, wish to hold onto their power and privilege and not let go. Croly’s pact is thus equally unappealing for the struggling individual living paycheck to paycheck, the career politician, or the business tycoon.

Moreover, in addition to opposing the Sherman Antitrust Act, Croly nearly succumbs to total Taylorism in his dismissal of labor unions: “They seek by the passage of eight-hour and prevailing rate-of-wages laws to give an official sanction to the claims of the unions, and they do so without making any attempt to promote the parallel public interest in an increasing efficiency of labor. But these eight-hour and other similar laws are frequently being declared unconstitutional by the state courts, and for the supposed benefit of individual liberty.” Granted, Croly’s words came ten years before the passage of the Adamson Act, the first federal law enforcing a mandatory eight-hour day. But Croly’s failure to see the social benefits of well-rested workers better positioned to exercise their individual liberty for a democratic promise is one of his more outrageous and myopic pronouncements, even as he avers that the conditions that create unrestricted economic opportunities also spawn individual bondage. But if Croly wants Americans to “[keep] his flag flying at any personal cost or sacrifice,” then he really needs to have more sympathy for the travails of the working stiff.

Despite all my complaints, I still believe some 21st century thinker should pick up from Croly’s many points and make an equally ambitious attempt to harmonize Hamilton and Jefferson with more recent developments. American politics has transformed into a cartoonish nightmare from which we cannot seem to escape, one that causes tax absolutist lunatics like Grover Norquist to appear remotely sane. That we are seeing a strange replay of the 1912 election with the 2016 presidential race, with Trump stepping in as an unlikely Roosevelt and Bernie Sanders possibly filling in for Eugene Debs, and that so many Americans covet an “outsider” candidate who will fix a government that they perceive as a broken system speaks to a great need for some ambitious mind to reassess our history and the manner in which we belong to our nation, while also observing the many ways in which Americans come together well outside of the political bear trap. For the American individual is no longer boxing George Bellows-style with her government. She is now engaged in a vicious MMA match unfurling inside a steel cage. Whether this ugly pugilism can be tempered with peace and tolerance is anyone’s guess, but, if we really believe in democracy, the least we can do is try to find some workaround in which people feel once again that they’re part of the process.

Truman Capote was a feverish liar and a frenzied opportunist from the first moment his high voice pierced the walls of a literary elite eager to filth up its antimacassars with gossip. He used his looks to present himself as a child prodigy, famously photographed in languorous repose by Harold Halma to incite intrigue and controversy. He claimed to win national awards for his high school writing that no scholar has ever been able to turn up. He escorted the nearly blind James Thurber to his dalliances with secretaries and deliberately put on Thurber’s socks inside out so that his wife would notice, later boasting that one secretary was “the ugliest thing you’ve ever seen.” Biographer Gerald Clarke chronicled how Capote befriended New Yorker office manager Daise Terry, who was feared and disliked by many at the magazine, because he knew she could help him. (Capote’s tactics paid off. Terry gave him the easiest job on staff: copyboy in the art department.) If Capote wanted to know you, he wanted to use you. But the beginnings of a man willing to do just about anything to get ahead can be found in his early childhood.

Capote’s cousin Jennings Faulk Carter once described young Truman coming up with the idea of charging admission for a circus. Capote had heard a story in the local paper about a two-headed chicken. Lacking the creative talent to build a chicken himself, he enlisted Carter and Harper Lee for this faux poultry con. The two accomplices never saw any of the money. Decades later, Capote would reprise this tactic on a grander scale, earning millions of dollars and great renown for hoisting a literary big top over a small Kansas town after reading a 300-word item about a family murder in The New York Times. Harper Lee would be dragged into this carnival as well.

The tale of how two frightening men murdered four members of the Clutter family for a pittance and created a climate of fear in the surrounding rural area (and later the nation) is very familiar to nearly anyone who reads, buttressed by the gritty 1967 film (featuring a pre-The Walking Dead Scott Wilson as Dick Hickock and a pre-Bonnie Lee Bakley murder Robert Blake as Perry Smith) and a deservedly acclaimed 2005 film featuring the late great Philip Seymour Hoffman as Capote. But what is not so often discussed is the rather flimsy foundation on which this “masterpiece” has been built.

Years before “based on a true story” became a risible cliche, Capote and his publicists framed In Cold Blood’s authenticity around Capote’s purported accuracy. Yet the book itself contains many gaping holes in which we have only Smith and Hickock’s words, twisted further by Capote. What are we to make of Bill and Johnny — a boy and his grandfather whom Smith and Hickock pick up for a roadside soda bottle-collecting adventure to make a few bucks? In our modern age, we would demand that the competent journalist track these two side characters down and compare their accounts with those of Smith and Hickock. Capote claims that these two had once lived with the boy’s aunt on a farm near Shreveport, Louisiana, yet no independent party appears to have corroborated their identities. Did Capote (or Hickock and Smith) make them up? Does the episode really contribute to our understanding of the killers’ pathology? One doesn’t need to be aware of a recent DNA test that disproved Hickock and Smith’s involvement with the quadruple murder of the Walker family in Sarasota County, Florida, taking place one month after the Clutter murders, to see that Capote is more interested in holding up the funhouse mirror to impart specious complicity:

Hickock consented to take the [polygraph] test and so did Smith, who told Kansas authorities, “I remarked at the time, I said to Dick, I’ll bet whoever did this must be somebody that read about what happened out here in Kansas. A nut.” The results of the test, to the dismay of Osprey’s sheriff as well as Alvin Dewey, who does not believe in exceptional circumstances, were decisively negative.

Never mind that polygraph tests are inaccurate. It isn’t so much Hickock and Smith’s motivations that Capote was interested in. He was more concerned with stretching out a sense of amorphous terror on a wide canvas. As Hickock and Smith await their executions, they encounter Lowell Lee Andrews in the adjacent cell. He is a fiercely intelligent, corpulent eighteen-year-old boy who fulfilled his dormant dreams of murdering his family, but Capote’s portrait leaves little room for subtlety:

For the secret Lowell Lee, the one concealed inside the shy church going biology student, fancied himself an ice-hearted master criminal: he wanted to wear gangsterish silk shirts and drive scarlet sports cars; he wanted to be recognized as no mere bespectacled, bookish, overweight, virginal schoolboy; and while he did not dislike any member of his family, at least not consciously, murdering them seemed the swiftest, most sensible way of implementing the fantasies that possessed him.

We have modifiers (“shy,” “ice-hearted,” “gangsterish,” “silk,” “scarlet,” “bespectacled,” “bookish,” “virginal,” “swiftest,” and “sensible”) that conjure up a fantasy atop the fantasy, that suggest relativism to the two main heavies, but there is little room for subtlety or for any doubt in the reader’s mind. Capote does bring up the fact that Andrews suffered from schizophrenia, but diminishes this mental illness by calling it “simple” before dredging up the M’Naghten Rule, which was devised in 1843 (predicated upon a 19th century standard that predates modern psychiatry, yet still on the books) to exclude any insanity defense whereby the accused recognizes right from wrong. But he has already tarnished Andrews with testimony from Dr. Joseph Satten: “He considered himself the only important, only significant person in the world. And in his own seclusive world it seemed to him just as right to kill his mother as to kill an animal or a fly.” I certainly don’t want to defend Andrews’s crime (much less the Clutter family murders), but this conveniently pat assessment does ignore more difficult and far more interesting questions, ones that Capote, lacking the coherence, the empathy, or the candor to confess his own contradictions, declines to pursue. Many pages before, in relation to Hickock, Capote calls M’Naghten “a formula quite color-blind to any gradations between black and white.” In other words, Capote is the worst kind of journalist: a cherry-picking sensationalist who applies standards as he sees fit, heavily steering the reader’s opinion even as he feigns objectivity. The ethical reader reads In Cold Blood in the 21st century, wanting Katherine Boo to emerge from the future through a wormhole, if only to open up a can of whoopass on Capote for these egregious and often thoughtless indiscretions.

Capote’s decision to remove himself from the crisp, lurid story was commended by many during In Cold Blood’s immediate reception as a feat of unparalleled objectivity, with the “nonfiction novel” label sticking to the book like a trendy hashtag that hipsters refuse to surrender, but I think Cynthia Ozick described the thorny predicament best in her infamous drive-by on Capote (collected in Art & Ardor): “Essence without existence; to achieve the alp of truth without the risk of the footing.” If we accept any novel — whether “nonfiction” or fully imaginative — as some sinister or benign cousin to the essay, as a reasonably honest attempt to reckon with the human experience through invention, then In Cold Blood is a failure: the work of a man who sat idly in his tony Manhattan spread with cadged notebooks and total recall of aggressively acquired conversations even as his murderous subjects begged their “friend” to help them escape the hangman’s noose.

In 2013, Slate’s Ben Yagoda described numerous factual indiscretions, revealing that editor William Shawn had penciled in “How know?” on the New Yorker galley proofs of Capote’s four-part opus (In Cold Blood first appeared in magazine form). That same year, the Wall Street Journal uncovered new evidence from the Kansas Bureau of Investigation, which revealed that the KBI did not, upon receiving intelligence from informant Floyd Wells, swiftly dispatch agent Harold Nye to the farmhouse where Richard Hickock had lodged. (“It was as though some visitor were expected,” writes Capote. Expected by Hickock’s father or an author conveniently tampering with his narrative like a subway commuter feverishly filling in a sudoku puzzle?) As Jack de Bellis has observed, Capote’s revisions from New Yorker articles to book form revealed his feeble command of time, directions, and even specific places. But de Bellis’s examination exposed more descriptive imprudence, such as Capote shifting a line on how Perry “couldn’t stand” another prisoner to “could have boiled him in oil” (“How know?” we can ask today), along with many efforts to coarsen the language and tweak punctuation for a sensationalist audience.

And then there is the propped-up hero Alvin Dewey, presented by Capote as a tireless investigator who consumes almost nothing but coffee and who loses twenty pounds: a police procedural stereotype if ever there was one. Dewey disputed Capote’s account that he closed his eyes during the execution, and the closing scene of Dewey meeting Nancy Clutter’s best friend, Susan Kidwell, in a cemetery is not only invented, but heavily mimics the belabored ending of Capote’s 1951 novel, The Grass Harp. But then “Foxy” Dewey and Capote were tighter than a pair of frisky lovers holed up for a week in a seedy motel.

Not only was Capote granted unprecedented access to internal documents, but his papers reveal that Dewey provided Capote with stage directions in the police interview transcripts. (One such annotation reads “Perry turns white. Looked at the ceiling. Swallows.”) There is also the highly suspect payola of Columbia Pictures offering Dewey’s wife a job as a consultant on the 1965 film for a fairly substantial fee. Harold Nye, another investigator whose contributions have been smudged out of the history, told Charles J. Shields in a December 30, 2002 interview (quoted in Mockingbird), “I really got upset when I know that Al [Dewey] gave them a full set of the reports. That was like committing the largest sin there was, because the bureau absolutely would not stand for that at all. If it would have been found out, he would have been discharged immediately from the bureau.”

In fact, Harold Nye and other KBI agents did much of the footwork that Capote attributes to Dewey. Nye was so incensed by Capote’s prevarications that he read 115 pages of In Cold Blood before hurling the book across the living room. And in the last few years, the Nye family has been fighting to reveal the details inside two tattered notebooks that contain revelations about the Clutter killings that may drastically challenge Capote’s narrative.

Yet even before this, Capote’s magnum opus was up for debate. In June 1966, Esquire published an article by Phillip K. Tompkins challenging Capote’s alleged objectivity. Tompkins journeyed to Kansas and discovered that Nancy Clutter’s boyfriend was hardly the ace athlete (“And now, after helping clear the dining table of all its holiday dishes, that was what he decided to do — put on a sweatshirt and go for a run.”) that Capote presented him as, that Nancy’s horse was sold for a higher sum to the father of the local postmaster rather than “a Mennonite farmer who said he might use her for plowing,” and that the undersheriff’s wife disputed Capote’s account:

During our telephone conversation, Mrs. Meier repeatedly told me that she never heard Perry cry; that on the day in question she was in her bedroom, not the kitchen; that she did not turn on the radio to drown out the sound of crying; that she did not hold Perry’s hand; that she did not hear Perry say, ‘I’m embraced by shame.’ And finally – that she had never told such things to Capote. Ms. Meier told me repeatedly and firmly, in her gentle way, that these things were not true.

(For more on Capote’s libertine liberties, see Chapter 4 of Ralph F. Voss’s Truman Capote and the Legacy of In Cold Blood.)

Confronted by these many disgraceful distortions, we are left to ignore the “journalist” and assess the execution. On a strictly showboating criterion, In Cold Blood succeeds and captures our imagination, even if one feels compelled to take a cold shower knowing that Capote’s factual indiscretions were committed with a blatant disregard for the truth, not unlike two psychopaths murdering a family because they believed the Clutters possessed a safe brimming with riches. One admires the way that Capote describes newsmen “[slapping] frozen ears with ungloved, freezing hands,” even as one winces at the way Capote plays into patriarchal shorthand when Nye “visits” Barbara Johnson (Perry Smith’s only surviving sister: the other two committed suicide), describing her father as a “real man” who had once “survived a winter alone in the Alaskan wilderness.” The strained metaphor of two gray tomcats — “thin, dirty strays with strange and clever habits” — wandering around Garden City during the Smith-Hickock trial allows Capote to pad out his narrative after he has exhausted his supply of “flat,” “dull,” “dusty,” “austere,” and “stark” to describe Kansas in the manner of some sheltered socialite referencing the “flyover states.” Yet for all these cliches, In Cold Blood contains an inexplicably hypnotic allure, a hold upon our attention even as the book remains aggressively committed to the facile conclusion that the world is populated by people capable of murdering a family over an amount somewhere “between forty and fifty dollars.” As Jimmy Breslin put it (quoted in M. Thomas Inge’s Conversations with Truman Capote), “This Capote steps in with flat, objective, terrible realism. And suddenly there is nothing else you want to read.”

That the book endures — and is even being adapted into a forthcoming “miniseries event” by playwright Kevin Hood — speaks to an incurable gossipy strain in Western culture, one reinforced by the recent success of the podcast Serial and the television series The Jinx. It isn’t so much the facts that sustain our preoccupation with true crime as the sense that we are vicariously aligned with the fallible journalist pursuing the story, whom we can trust to dig up scandalous dirt as we crack open our peanuts waiting for the next act. If the investigator is honest about her inadequacies, as Serial‘s Sarah Koenig most certainly was, the results can provide breathtaking insight into the manner in which we incriminate other people with our emotional assumptions and our fallible memories and superficially examined evidence. But if the “journalist” removes himself from culpability, presenting himself as some demigod beyond question or reproach (Capote’s varying percentages of total recall memory certainly feel like some newly wrangled whiz kid bragging about his chops before the knowledge bowl), then the author is not so much a sensitive artist seeking new ways inside the darkest realm of humanity, but a crude huckster occupying an outsize stage, waiting to grab his lucrative check and attentive accolades while the real victims of devastation weep over concerns that are far more human and far more deserving of our attention. We can remove the elephants from the lineup, but the circus train still rolls on.

How can a bumbling statesman from the 19th century help us cultivate more tolerance for humankind? Lord David Cecil’s biography offers some unexpected possibilities. We examine the curious case of William Lamb Melbourne in the first of our Modern Library Nonfiction essays.

History has produced such a rich pile of devious political figures who spend every spare minute scheming and plotting their rise that today’s aspiring aristocrats, who can be found working every connection to get their kids into bright educational citadels and reliable sinecures, cannot come close to such cutthroat monomania. Yet there are also those who blunder into top office like bumpkins crashing high-class weddings through the simple repetitive act of placing one foot in front of the other. William Lamb, aka Lord Melbourne, Prime Minister of the United Kingdom for eight years (1834, 1835-1841) and the subject of Lord David Cecil’s generous biography, was one such man.

Melbourne was a ponderous speaker, a bookish man who reportedly dozed off in the middle of a conversation, a leader largely blind to the way people beneath his station lived, and a cautionary tale for any soul who avoids conflict. “He pondered, he compared, he memorized,” writes Cecil of Melbourne in middle age, just before his improbable political career begins, “the Elizabethan drama, for instance, he knew so well that he could repeat by heart whole scenes not only of Shakespeare but of Massinger; the margins of his books were black with the markings of his flowing, illegible hand.” Melbourne was so mind-bogglingly passive in his actions that he not only refused to intervene in his wife’s adulterous affairs, but he believed that each infidelity would pass with the fleeting speed of a common cold.

It’s easy to ridicule Melbourne and the people of that time (as the bitterly judgmental Carrie A.A. Frye and Philip Ziegler, another Melbourne biographer, have). Melbourne was cuckolded by Lord Byron. When Byron would no longer plant his flagpole in Lady Caroline Lamb, Melbourne’s wife gradually traded down in her boys on the side until she tapped Edward Bulwer-Lytton (best known today for inspiring a writing contest for wretched writing). Caroline would go on to write Glenarvon, an awful roman à clef which exposed the Byron episode and left the Lambs open to disgrace and derision. When his wife died, the lonely Melbourne sought solace with another Caroline (the remarkable Caroline Norton, a tireless crusader who would go on to campaign successfully for legislative acts rectifying the second-class status of women), and there was an attempt at blackmail. When Queen Victoria ascended to the throne, Melbourne became her unlikely tutor and constant companion, wasting a good chunk of his late years because the young Queen required constant attention (much of it documented in Victoria’s journals, which are now digitized and accessible through arrangement with your library; Cecil is good enough to quote from many of these entries).

When I learned from Paul Douglass’s Lady Caroline Lamb just how abhorrently Melbourne had treated the largely forgotten badass Isaac Nathan, I began to grow less tolerant of Melbourne’s imperturbable nature. Nathan — a Jewish composer who wrote the groundbreaking Hebrew Melodies and suffered from the Jewish exclusion laws which denied him the ability to vote, run for office, or pursue justice in the courts against the scabrous opportunists who stole his lyrics, often without credit or compensation — befriended Caroline and set many of her words to music. Despite Nathan defending Caroline when she was disgraced (to the lively extent of fighting duels and even publishing a defense of her character in Fugitive Pieces after her death), Melbourne refused to pay Nathan for services rendered to the Whigs when Nathan really needed the cash, leaving Nathan humiliated and bankrupt and forced to flee to Australia, where he wrote Don John of Austria (the first opera composed and performed in Australia), became the first to research indigenous music, and became the first settler in this new and exciting country to be killed by a tram. (Don’t worry. Nathan died at 74. It was an accident.)

Melbourne’s clueless cruelty also emerged as organized labor became a more vocal part of British life. In March 1834, a group of laborers in Dorset formed a trade union, and it was discovered that these men had administered secret oaths as part of the membership. Several of these men were arrested and sentenced to seven years of penal transportation. At the time, Melbourne was Home Secretary. Instead of overturning this remarkably harsh punishment, Melbourne asked the local magistrates about the temperament of these men. He was informed by these tendentious adjudicators that they were scoundrels. Melbourne, strictly on this point of secret oaths, confirmed this inhumane sentence. And during the next month, a group of thirty thousand marched to Whitehall to demand redress. He refused to see their leaders. And this austere decision set back the trade-union movement for years. As Cecil writes:

“So far from being criminals and revolutionaries, they were sober, respectable men enough, driven into lawless courses largely by ignorance and hunger and by the struggle to bring up their families on wages lately reduced to seven shillings a week. Melbourne was not to blame for not realizing their true characters. He was not there, and he had to trust to the reports of his subordinates.”

Melbourne also did not intervene when the laborers who revolted in 1830 (during the Swing Riots) were sent to the gallows. Even Cecil is forced to accept the “painful and disturbing” prospect of an ostensibly kind-hearted man who wished to uphold the death sentence even for prisoners who were never intended to be executed. Yet as callous as these consequences were, contributing to great unrest in the immediate years that followed, one has to remember that these terrible measures emerged in response to fears over recent turmoil in France and while parliamentary reform was being hashed out at a frustratingly glacial pace. There was a palpable anxiety that events across the English Channel would be reenacted at home.

How did such a man become Prime Minister? Largely because there was nobody else. In 1834, King William IV needed someone who could keep variegated political factions together and, although the King didn’t care much for Melbourne, he liked him better than the other candidates. After all, this wasn’t exactly a position that you could leave open because you didn’t care for the present spate of résumés. Melbourne almost didn’t take the job. (Indeed, near the end of his stint as Prime Minister, he drifted forward with listlessness and exhaustion. Obligation seemed to be the only quality that kept him going.) An opportunistic little creep named Tom Young, who ensconced himself in Melbourne’s administrative circle through skillful cunning, was the one who played to Melbourne’s vanity and love for the classics, securing Melbourne’s commitment with the following words: “Why, damn it all, such a position was never held by any Greek or Roman: and if it only lasts three months, it will be worth while to have been Prime Minister of England.”

Melbourne was a weird Prime Minister. His strategy involved ridding himself of loud and querulous colleagues and keeping the new Government as calm as possible, even as he remained obdurate in his decision making. This approach did not sit well with some of the more boisterous statesmen. In a moment that could almost be pulled out of a David Lynch movie, Cecil describes Lord Brougham visiting Melbourne’s house just after learning that he would not have a place in the new government. “Do you think I am mad?” shouts Brougham over and over again, his tone and gestures rising with violence as he repeats this question, almost anticipating the charismatic psychosis of Blue Velvet‘s Frank Booth. Yet the highly avoidant Melbourne could not fend off the King, assorted radicals, and any number of people who beseeched him for attention. “Damn it!” he cried to himself. “Why can’t they be quiet?” (In 1836, the aforementioned Norton blackmail episode went down, with Melbourne emerging largely unscathed even while living at Downing Street.)

I can’t entirely pardon Melbourne for some of his asshattery, but Cecil’s careful touch allowed me to understand and even empathize with some of Melbourne’s flaws, for he was also quite idiosyncratic. He stuffed his coats with endless notes. He would emit several strange sounds before beginning to talk. He would shout at random servants and ask them what time it was rather than consult a watch. These quirks allowed him to be liked by the right people, or, perhaps more accurately, tolerated because his actions were so endearingly inexplicable. Perhaps they felt sorry for him because of the Glenarvon episode, although Cecil doesn’t really address how that scandal besmirched his later life, long after Caroline was gone.

What I can say is that Cecil does such a classy job conveying the shenanigans of these often loutish patricians — the rampant adultery, the tolerated insane behavior, the strange manner in which all this infiltrated British poetry and politics — that I was placed in the unusual position of fighting strong desires to throw my mind into the mire of the French Revolution, parliamentary protocol, and numerous other subjects. In the last two years, we have seen wild ideological sentiment (exacerbated by Twitter) and staunch stylistic preference (e.g., any polemical book on the lyrical essay) erode the possibilities of understanding human nuance. Melbourne reminds us that the more receptive we are to factual details that trouble or intrigue us, the more willing we are to commiserate with a person’s embarrassing qualities. Perhaps this was one reason why Melbourne was one of John F. Kennedy’s favorite books.

Cecil found both a subject and a tone that recall John P. Marquand’s Pulitzer Prize-winning 1937 novel, The Late George Apley, published just two years before Cecil’s first half of Melbourne. He tells us baldly at the end that Melbourne’s death caused no great stir during the nineteenth century’s rampantly changing atmosphere. It is a crushing realization. Like Apley, Melbourne outlived his time and took in his personal and professional regrets with a resigned agreeableness. Melbourne’s life is often a sad portrait, yet we are somehow won over by him. Cecil’s book is a welcome reminder that if we’re going to judge someone, maybe we should temper the impulse to castigate them with a smidgen of kindness, reserving our wrath for the real monsters. Without that vital flexibility that allows us to evolve, there may come a point when we outlive our time too. Sooner than Melbourne.

While our faithful reading adventurer remains stalled on FINNEGANS WAKE on the original Modern Library Reading Challenge, he has decided to start a second challenge. Behold! A new Modern Library Reading Challenge! The top 100 nonfiction books of the 20th century!

Just under three years ago, I began the Modern Library Reading Challenge. It was an ambitious alternative to a spate of eccentric reading challenges then making the rounds. These included such gallant reading missions as the Chunkster, the Three Card Monte/Three Sisters Bronte, the Read All of Shakespeare While Blindfolded Challenge, and the Solzhenitsyn Russian Roulette Challenge. It took a fairly eccentric person to place the literary embouchure ever so nobly to one’s lips and fire off a fusillade of euphonic Prince Pless bliss into the trenchant air. But I was game.

In my case, the idea was to write at least 1,000 words on each title after reading it. The hope was to fill in significant reading gaps while also cutting an idiosyncratic course across the great works of 20th century literature, with other intrepid readers walking along with me.

Over the next twenty-three months, I steadily worked my way through twenty-three works of fiction. Some of the books were easy to find. Some required elaborate trips to exotic bookstores in far-off states. When I checked out related critical texts and biographies from the New York Public Library, I was often informed by the good librarians at the Mid-Manhattan branch that I was the first soul to take these tomes home in sixteen years. This surprised me. New York was a city with eight million people. Surely there had to be more curiosity seekers investigating these authors. But I discovered that some of these prominent authors had been severely neglected. When I got to The Old Wives’ Tale, Arnold Bennett was so overlooked that I appeared to be the first person to upload a photo of reasonable resolution (which I had taken from a public domain image published in a biography) onto the Internet.

Yet when I told some people about my project, it was considered strange or sinister. When I mentioned the Modern Library Reading Challenge to a much older writer, she was stunned that anyone my age would go to the trouble of Lawrence Durrell. (And she liked Durrell!) Her quizzical look led me to wonder if she was going to send me to some shady authority to administer a second opinion.

One of the project’s appeals was its methodical approach: I was determined to read all the books from #100 to #1. This not only provided a healthy discipline, but it ensured that I wouldn’t push the least desired books to the end. Much as life provides you with mostly happy and sometimes unpleasant tasks to fulfill as they arrive, I felt that my reading needed to maintain a similar commitment. This strategy also created a vicarious trajectory for others to follow.

Everything was going well. Very well indeed. Henry Green. Jean Rhys. The pleasant surprise of The Ginger Man. With these authors, how could it not go well? I was poised to read to the finish line. I was zooming my Triumph TR6 down a hilly two-lane highway with a full tank of gas. Cranking loud music. Not a care in the world.

And then I hit Finnegans Wake.

To call Finnegans Wake “difficult” is a woefully insufficient description. This is a book that requires developing an ineluctably Talmudic approach. But I am not easily fazed. Finnegans Wake is truly a book of grand lexical riches, even if I remain permanently stalled within the voluble tendrils of its first 50 pages. I have every intention of making my way through Finnegans Wake. I have reread Dubliners, A Portrait of the Artist as a Young Man, and Ulysses. I have consulted numerous reference texts. I have even listened to all 176 episodes of Frank Delaney’s excellent podcast, Re: Joyce. These have all been pleasant experiences, but I am still not sure if any of this significantly contributes to my understanding of Finnegans Wake. However, when you do something difficult, it is often best to remain somewhat clueless. If you become more aware of how “impossible” something may be, then you may not see it through to the end. Joyce remains the Everest of English literature. I am determined to scale the peak, even if I’m not entirely sure how reliable my gear is.

The regrettable Finnegans Wake business has also meant that the Modern Library Reading Challenge has been stuck on pause. It has been eleven months since I published a new Modern Library installment on these pages. And while I have certainly stayed busy during this time (I have made a documentary about Gary Shteyngart’s blurbs, attempted to fund a national walk that I intend to fulfill one day, canceled and brought back The Bat Segundo Show, and created a new thematic radio program, among other endeavors), I have long felt that persistent progress — that is, an efflorescent commitment to a regular fount of new material — is the best way to stay in shape and keep any project alive.

I have also had a growing desire to read more nonfiction, especially as the world revealed itself to be a truly maddening and perilous place as the reading challenge unfolded. Some have responded to all this by keeping their heads planted beneath the ground like quavering ostriches. There are far too many adults I know, now well into their thirties, who remain distressingly committed to the “La la la I can’t hear you!” school of taking in bad news. But I feel that understanding how historical and social cycles (Vico, natch) cause human souls to saunter down dark and treacherous roads also allows us to comprehend certain truths about our present age. To carry on in the world without a sense of history, without even a cursory understanding of ideas and theories that have been attempted or considered before, is to remain a rotten vegetable during the more important moments of life.

It turns out that the Modern Library has another list of one hundred titles devoted to nonfiction. And the nonfiction list is, to my great surprise, more closely aligned to the fiction list than I anticipated.

In 1998, the Washington Post‘s David Streitfeld revealed that the Modern Library fiction list was plagued by modest scandal. The ten august Modern Library board members behind the fiction list had no knowledge of who had voted for what, why the books were ranked the way they were, or how the list had been composed, with many of the rankings more perfunctory than anyone knew. Brave New World, for example, had muscled its way up to #5, but only because many of the judges believed that it needed to be somewhere on the list.

So when the Modern Library gang devoted its attention to a nonfiction list, it was, as Salon‘s Craig Offman reported, determined not to repeat many of the same mistakes. University of Chicago statistics professor Albert Madansky was brought on to ensure a more dutiful ranking process. Younger authors and more women were included among the board. Madansky went to the trouble of creating a computer algorithm so that there could be no ties. But the new iron-fist approach had some drawbacks. There was a new rule that an author could only have one title on the list, which meant that Edmund Wilson’s To the Finland Station didn’t make the cut. And when the top choice was announced — The Education of Henry Adams — the crowd stayed silent. It was rumored that one board member scandalously played with his dentures as the titles were called.

Perhaps the Modern Library’s second great experiment reveals the unavoidable pointlessness behind these lists. As novelist Richard Powers recently observed in a National Book Critics Circle post, “The reading experiences I value most are the ones that shake me out of my easy aesthetic preferences and make the favorites game feel like a talent show in the Iroquois Theater just before the fire. Give me the not-yet likable, the unhousebroken, something that is going to throw my tastes in a tizzy and make my self-protecting Tops of the Pops slink away in shame.”

On the other hand, if it takes anywhere from five to ten years to get through a hundred titles, then the reader is insulated from this problem. Today’s favorites may be tomorrow’s dogs, and yesterday’s lackluster choices may be tomorrow’s crown jewels. As the Modern Library reader grows older, there’s nearly a built-in guarantee that these preordained tastes will become passé at some point. (To wit: Lord David Cecil’s Melbourne, the first book I will be reading for this new challenge, is now out of print.)

So I have decided to take up the second challenge, reading the nonfiction list from #100 to #1. Modern Library Nonfiction Challenge titles shall flow from these pages as I slowly make my way through Finnegans Wake during the first challenge. Hopefully, once the disparity between the two challenges has been worked out, I will eventually make steady progress on the fiction and nonfiction fronts. But the nonfiction challenge won’t be a walk in the park either. It has its own Finnegans Wake at #23. I am certain that Principia Mathematica will come close to destroying my brain. But as I said three years ago, I plan to read forever or die trying.

To prevent confusion for longtime readers, the fiction challenge will be separated from the nonfiction challenge by color. Fiction titles shall carry on in red. Nonfiction titles will be in gold.

I’ve started to read Melbourne and I’m hoping to have the first Modern Library Nonfiction Challenge essay up before the end of the month. This page, much like the fiction list, will serve as an index. I will add the links and the dates as I read the books. I hope that these efforts will inspire more readers to take up the challenge. (And if you do end up reading along, don’t be a stranger!)

* December 15, 2018 Update: While I am striving to read the unabridged versions of all works for this project, upon further reflection, I’ve realized that the cost of obtaining the full 27-volume set of Needham’s opus is well beyond my price range. Each volume ranges from $40 to $200, in part due to the extortionate pricing of Cambridge University Press, a publisher that has proven deaf to my inquiries about obtaining a review copy. This effectively makes the purchase equal to the price of a used car. In addition, it is rather insane for any reader, even one who possesses my ridiculous ambitions, to devote himself to some 8,000 pages by one author. I have reluctantly opted to substitute the five-volume Shorter Science and Civilisation in China when I get to this particular essay. As of now, I do have the unabridged The Golden Bough under my belt. And I will spring for the unabridged Toynbee. I hope readers following along can forgive me for cutting corners on one entry. But I do want to complete this project before I depart this earth. And pragmatically speaking, this is the only way to do it.

Subjects Discussed: Perception of time, Walter Clark, pauses and authenticity, Jon Stewart’s 20 second pause in response to Sarah Palin’s “squirmish,” This American Life, Christian Marclay’s The Clock, “Kristen Schaal is a horse,” Tao Lin’s use of repetition, John Boyd’s OODA loop, whether a military strategist’s ideas are entirely applicable to dating, how delay persuades us in other contexts, the first date as a military tactic, lunch-oriented dating services, making bad snap decisions because of a photo, panic and fast talking, being aware of your audience when talking, the Einstellung effect, Peter McLeod’s experiments with chess players, the three-move checkmate, how even chess masters get stuck in the muck, the dangers of being overconfident, unemployment, Sarkozy’s failed efforts to readjust the GDP to help long-term economic impact, readjusting human attention from the short-term solution, cognitive bias, subliminal messages, how fast food logos speed up reading, Sanford DeVoe’s experiments, racist treatment decisions from doctors, the unanticipated advantages of a spare second, the effects of wealth upon happiness, finding another activity while waiting, viewing time as more scarce and impatience, when scientific developments are at odds with capitalist realities, the downside of success, procrastination, subliminal messages within the film Fight Club, topless women in The Rescuers, when people are vulnerable to subliminal messages, the invention of the Post-It, the advantage of fresh eyes, Archimedes and Newton, Arthur Fry, thin slicing and the Malcolm Gladwell reductionist incarnation of this idea now welcomed by marketing people, Dr. Phil’s incorrect use of thin slicing, and why thin slicing isn’t two seconds according to the studies.

EXCERPT FROM SHOW:

Correspondent: So let’s start off with panic, which seems a very good thing to start off with. Panic, as you say, has much to do with our perception of time. You bring up Walter Clark’s theory — he’s this acting teacher. He says that the best actors are the ones who don’t panic. So how much of our waiting has to do with panic or any other sense of emotional paralysis? How much of our anxieties come from this false comprehension of time? If there’s this correlation between good acting and not panicking, well, I have to ask, Frank, what’s the compromise between being human and being some pretender or some mimic?

Partnoy: Oh, it’s a great question. I’ve learned so much from Walter Clark, who’s one of the best acting coaches I’ve been around. My daughter takes a lot of acting classes. So I’ve learned a lot from him. And I think an acting coach, like somebody who is sophisticated watching a play or a performance, can see through a mimic. You can tell when somebody’s a fake when they’re performing. One of the things that panic does is that it leads people to speed up their performance. So that they run through what the acting coaches call beats. So it’s partly true of acting generally. But it’s especially true of comedy, I think. One of the things that I took away from watching him in action was that a lot of comedy really is about pauses and delays.

Correspondent: Yes.

Partnoy: And understanding the audience and being authentic in your understanding of the audience and figuring out how often to pause. You know, we’re talking right now. We’ve just met each other, right? And we’re sort of watching each other and having this conversation.

Correspondent: And you’re a total phony.

Partnoy: Yeah. Sorry.

Correspondent: Or are you? Maybe I’m the total phony. Who knows? Maybe we’re both being phony. I don’t know.

Partnoy: Hopefully we won’t be as we move along.

Correspondent: I think I can trust you so far.

Partnoy: Alright. Likewise. I’m enjoying it so far.

Correspondent: Okay, good.

Partnoy: I’m grabbing my wallet now. But I do think, just when we start having these conversations in our normal lives, even if we’re not acting that there’s a role of the pause and the delay. That just speeding through something 100 miles an hour is not a very effective communication technique. So one of the things I’ve been interested in for a long time is that. I teach law school classes and my students can’t comprehend me if I’m speaking 100 miles an hour. On the other hand, I can speak pretty quickly and they’ll get content down. They’ll write. So it’s this kind of balance back and forth. And when you panic, you speed up. You speed through the pause. One of the things that I’ve been playing with, as I’ve done three years of research now on the book and wrote it, is how long I can get away with pausing. [short pause] So I talk a little bit about Jon Stewart as an example and this extraordinary moment he had in one of his shows where he had captured Sarah Palin questioning some of the Obama military action in Libya and saying she didn’t know what to call this. “We’re not at war. What’s a word for it? I don’t know the word.” And then Sarah Palin uses this non-word “squirmish.” And for me as a speaker, I would have a hard time waiting, pausing more than a couple of seconds, telling a joke and then delaying. My son actually — I have an eight-year-old son — he’s a lot better at telling a joke and then delaying the punchline. So he’ll make up some joke. “A couple of cantaloupe were married. What did they name their daughter?” And then he’ll do a dramatic pause and say, “Melony.” Which is just made up. But he’ll get a laugh where I’m not sure I can do. But Jon Stewart is able to pause for twenty full seconds. I think that must be some kind of a world record for pauses. And he’s just the opposite of panic. He’s utterly fearless with the audience, feeling them out, understanding and being totally authentic, right? 
I mean, that’s one of the reasons why we love Jon Stewart so much, is that he’s in command of timing and gets us and gets what we want and goes through this kind of time framework, which I think is actually very valuable in all the decisions that we make. Which is a two-step process. The first step is: How long can I wait before taking this action and making this decision? What’s the maximum amount of time that I can wait? And then the second step is delaying until that moment. And so in that example, he decided it was going to be twenty seconds. Probably not consciously. Because he’s a master. And he was able to wait twenty seconds. I could never do that.

Correspondent: Well, since you brought up pauses, I think we should talk about them.

[pause]

Correspondent: You observe that the best radio announcers and interviewers use them.

Correspondent: Oh. Am I sort of interfering with the question? I don’t know.

Partnoy: Beautifully done. Masterful.

Correspondent: Actually though, I do want to bring this up. I could even bring the William Shatner pause into this equation. But I’m wondering if how we react to a pause shares much in common with how we react to, say, a loop. There’s this comedy routine — I’m not sure if you’re familiar with it — “Kristen Schaal is a Horse” — where basically it just goes on and on and repeats and repeats. It’s basically this woman dancing and a man clapping and going, “Kristen Schaal is a horse! Kristen Schaal is a horse!” And it goes on and loops for like fifteen minutes. There’s a Tao Lin poem where he constantly says the line “the next night we ate whale.” And there are all sorts of repetitions throughout art and culture and so forth. Does the manner in which we ascribe authority to a pause have much in common with this loop situation?

Partnoy: Oh, that’s a fascinating question. I think so. I mean, loops come up in all sorts of contexts and they relate to time in a very fundamental way, right? There’s — I’ll forget the artist, but there’s the 24 hour loop exhibit that’s out now.

Correspondent: Oh yeah. Christian Marclay’s The Clock.

Partnoy: It’s incredible, right? The Clock, where you’ve got, from various films, depictions of 12:01 and 1:05 sort of cycling around. And there’s something really powerful about the reinforcement of the story. A lot of jokes get funnier as they’re retold. So much so that even comedians, they might not even laugh at the joke, but they’ll just think, “Wow, that was really funny.” And loops come up also in a completely different context, I found in my research. Which is in the military.

Correspondent: Mr. Boyd.

Partnoy: Mr. Boyd, right. John Boyd, probably the greatest fighter pilot in history, who created something called the OODA loop. O-O-D-A, for Observe, Orient, Decide, and Act. This approach to decision making started in a military context, but now people use it in all areas of life and business. Where you take time and initially you observe. And you orient. You figure out where the enemy is. And then finally you make the decision. And then the decision is the mental part. And the act is the implementation part. And what John Boyd talks about is running through an OODA loop. So going through that cycle of Observe, Orient, Decide, and Act over and over again, watching the jet fighter you’re trying to shoot down to see what that person’s proclivities are — Do they like to feint to the left? Or the right? How fast are they? — to understand and to confuse them too. Which is also interesting. Because I’m not sure whether the art projects or films that we talked about earlier — I’m not sure they’re really meant to confuse. But in the offensive aspects of the OODA loop, part of what John Boyd is suggesting is to get a speed advantage to confuse the enemy. He was the person who basically created the idea of the F-16 and pushed its development. The kind of aircraft that’s like using a switchblade in a knife fight, that you can use very quickly to confuse and disorient your opponent. So these loops show up. Expertise, if you think about it. Where does expertise come from? It comes from a kind of repeated loop, right? Chess players become experts by learning openings and repeating that over and over and over again and seeing certain patterns. What behaviorists call chunking. Being able, because they’ve been through those loops so many times, to recognize patterns consistently. So it’s a really interesting question. And I think to some extent, these really deep insights and expertise come out of repeated loops as well.