The Reivers

The narrator, Lucius Priest, is an old man recounting adventures from when he was an eleven-year-old boy, in 1905, just as automobiles first arrived in Jefferson, Mississippi. His grandfather, Boss Priest, who owns one of the few cars then in existence, goes by train to a distant funeral, leaving Lucius to enter into an unspoken pact with his grandfather’s driver (and distant relative), Boon Hogganbeck. They conspire to steal the car and take it to Memphis, Tennessee, where gambling, scams, and prostitution await. To “reive”, by the way, is to steal, hence the “reivers”.

It remains a powerful narrative and a worthwhile read for several reasons:

It is a brilliant coming-of-age story, a bildungsroman, incisively examining the miracle of maturation and personal growth. The novel produces a man from a boy, like transmuting lead into gold, in a way that gives insight into life. The transformation contains all the fits and starts that a growing conscience exhibits as it encounters life’s complexities, and the lessons of code, conduct, and character that the young boy infers from circumstance somehow always exceed their proximate causes. In this way, it is a morally impressive novel. The mental forces that the narrator refers to as “Virtue” and “Non-virtue” appear early: Lucius knows, from the beginning, that with a single word he could command Boon to take him back home, and that not only would Boon obey, but he would be relieved. And yet he knows that he is too far gone for this, and must redeem himself some other way that lies in moving forward and not in retreat. The circumstances are specific, but the lessons learned are universal, and the clarity by which Faulkner renders the former into the latter contributes no small part to the power of this novel. The education is naturalistic; not by rote and certainly not by example, but by the mere exposure to the challenges of life itself, Lucius enters adulthood.

It is stylistically beautiful. It reads like a yarn, a rye-soaked conversation, meandering through generations. The narrator addresses the reader as a younger generation of the family, and thereby deepens the atmosphere. The device allows the narration to contrast the conditions of 1905 to the 1950s, when the story is being told. Yet to provide background the narrator also travels much further back in time. Especially in the beginning, it snakes through the nineteenth century as a prologue to the action, at a slow pace, filled with intriguing characters, often in such a way that one cannot quite make out where the stories are coming from or going to. The narrator hauls aboard every strand in sight which might later come in handy, yet through this discursiveness such life comes through as to keep the journey extremely enjoyable.

The characters are complex, sometimes enigmatic, and far from typecast. Beyond Lucius’ education and Boon’s mixed motivations, there’s the unexpected combination of folly and street wisdom in the wily Ned, as well as the mixed figures of Boss (the patriarchal grandfather) and Everbe (a partially-reformed prostitute). These are surprising characters, and their personalities are revealed not by their consistency but in moments of unpredictability, as when Everbe breaks down at Lucius’ defense of her, or when the family confronts Lucius’ outrageous disobedience in the conclusion. The possible exception to this is Otis, a deplorable nephew of one of the prostitutes, who acts as a foil to Lucius’ growing moral certitude.

It shows Faulkner’s versatility and narrative skill without resort to the virtuosity of his earlier work. Not that there’s anything wrong with it—As I Lay Dying and The Sound and the Fury are themselves baroque masterpieces. But this novel impresses by showing his virtues in a completely different mode, without resort to his surfeit of stylistic devices, to stream-of-consciousness, or to his voluminous vocabulary. The language remains first-rate, with some astoundingly interesting colloquy; what surprises more is its simplicity, in part attributable to the youthful perspective of the protagonist. Faulkner retains all the emotional force of his earlier novels despite assuming an entirely different style.

It’s funny. I won’t spoil its multitude of entertainments but here’s a passage that I particularly liked:

Even if Ned (or somebody else concerned) asked him point-blank if I was with him Saturday night, it would be at least Monday by then, and I had already decided quick and hard not to think about Monday. You see, if only people didn’t refuse quick and hard to think about next Monday, Virtue wouldn’t have such a hard and thankless time of it.

Kudos
Mon, 06 Aug 2018

After discovering her piece Aftermath a few years ago, and having revisited it several times since, I’ve been an adherent of Rachel Cusk’s. I am tempted to re-read that piece now, but I know that its stark, hypnotic beauty would move me too much and prevent me from writing anything about her latest book. Kudos is the third in a trilogy of what feel like semi-autobiographical novels, featuring a painfully vacant writer (and absent mother) who is going through the motions on the literary festival circuit, rarely speaking, as those around her uncontrollably pour confessions, philosophies, and personal quandaries upon her as if she’s their unpaid therapist.

The narrator’s silence against this barrage of monologues is presumably a response to the vicious backlash that Cusk’s earlier honesty about her divorce had occasioned, and the stream-of-monologue style defies forming a clear vision of exactly what is contained in each volume. Though she gags her protagonist, Cusk’s powerfully idiosyncratic ideas and images fall from the mouths of other authors, interviewers, or sometimes just men sat next to her on the plane. Sometimes these are profound reflections on life, other times outright provocations, and occasionally they stun with their clarity, beauty, and their synaesthetic sense of metaphor. On the occasions when it seems that the protagonist must speak, for example when she’s on a panel, sometimes the narrative skips conveniently over it, omitting her speech. In more egregious cases, as with the many interviewers in dereliction of their duty, she is not given any time to speak; sometimes the interview finishes without a single question having been asked. And yet the insight comes through, as if the characters are created from thin air to safely test out ideas in other voices, some very good, others repugnant, while the protagonist remains receptive, or at least rarely contradictory, throughout.

I have a theory, probably wrong, that this trilogy represents a reflection on the Three Characteristics of Buddhism. According to this thinking, everything we experience in life, every object, experience, thought, emotion, and even consciousness itself, is characterised by three facts: that it cannot be said to represent the true self, that it is impermanent, and that it is unsatisfactory. The titles somewhat align with this theory. It seems plausible to me that Outline, introducing the absent narrator onto whom everyone projects and who herself remains hollow, could represent the no-self doctrine. Transit, with its half-constructed house, could represent the impermanence of all goals and experiences. And Kudos (sarcastically) with its sometimes extreme emotional violence could represent suffering (which, in Buddhism, is caused by desire). Many of the insights voiced by the characters seem to be engaged with these topics: What is the self? How can one account for the ways in which one sometimes changes drastically over time? Does anything last, or is everything constantly in flux? Is there anything worth preserving? What, if anything, is worth striving for? Is all satisfaction transitory? In short they are a search for meaning. The characters are constantly on the verge of, and sometimes walk into, deep insights into these questions. The originality of this torrent of viewpoints is impressive. But its multiplicity sometimes overwhelms, and can leave one with a feeling of vagueness that’s as strong as the narrator’s.

Circe
Wed, 01 Aug 2018

Madeline Miller’s Circe is not a bad book, but it is disappointing in a number of ways. It takes for its first-person hero the witch of Aiaia, Circe, a daughter of the sun-god Helios, turner of men into pigs, and eventual lover of Odysseus. It is a sort of riposte to The Odyssey, so it’s unsurprising that it takes the time to dismantle the old heroes and gods one by one: Odysseus, Achilles, Hermes, Athena, Helios, and most of the other nymphs, demigods, and gods are portrayed as frivolous and vain, while Jason, Herakles, and Ajax are depicted (amusingly) as hulking bores. This is all fair enough; the gods are mercurial and immature at the best of times, and quite pathetically petty at their worst, and the heroes are nothing if not unwise. On this level her treatment is welcome, humanising and critical of the often misogynistic and merely vacuous penchants of the gods. Her accounts of Prometheus, of the halls of Helios, of Daedalus, and of obtaining the Trygon’s tail intermittently rise to imaginatively rich visions. What is surprising is how dull Circe herself is. She rarely leaves the island on which she is imprisoned, remaining preoccupied with age-old resentments, a painful and impotent motherhood, and quite a lot of domestic duties for someone who might be expected to employ witchcraft to relieve herself (and the reader) of such minutiae. This leaves the novel as a self-conscious contrast to the exciting adventures undertaken by Odysseus, in what seems to me a rather anti-feminist approach to the character. It reminded me of nothing so much as Beckett’s Malone Dies, wherein a senile man awakens in an unfamiliar room and never leaves it, spending the entire novel reflecting on (or possibly inventing) grievances from the distant past.
One gets the sense, early on, that Miller regards Homer as akin to Helios and herself as akin to Circe, and the consequent feeling of grievance and victimhood, of the triviality of mortality but also of her part in perpetuating mortal trivialities, never really leaves the novel. The monotonous style, alternating among about three sentence structures (and using “for” as a conjunction far too often), can make Circe’s complaints sound a drumbeat. By subordinating herself to Homer’s immortality, Miller dooms her own book to the position in which it ultimately finds itself: as a derivative hanger-on to the genius of gods. But there’s no reason it need be this way. One could imagine another Circe, less woe-is-me and plagued by doubt, more assertive and subversive, simply more nuanced, in her often courageous dealings with the gods. Instead we’re left with a fairly one-sided embittered exile, who undermines herself even in her rare moments of strength.
12 Rules for Life
Wed, 18 Jul 2018

You probably shouldn’t read Jordan Peterson’s 12 Rules for Life: An Antidote to Chaos. I’m not normally one to discourage reading (or intellectual endeavour), but this is a strange exception. It’s not exactly that Peterson is wrong about anything specific, although he occasionally very much is. It is rather that on many topics he is right, but his extreme confidence in mixing many correct observations with some incorrect ones, combined with his bleak view of humanity, makes the book’s potential for harming your worldview outweigh its potential for improving your life. His unwavering certitude is one reason I recommend against reading it: someone impressionable might not be able to distinguish where his views are mainstream from where they are highly dubious, to say nothing of how strident and repetitive his writing can be. (Most of the best things he says are said more eloquently elsewhere.) But my primary objection is to a sort of self-contradiction that exists in his ideas.

First, let me list its virtues: as I said, he is right about many facets of human nature, he is certain that he is right, and he is often blunt in his delivery; in moderation, this can be refreshing. I do feel that the world, and certainly airplanes, would be a better place if parents read his chapter on child rearing. His ideas about the balance of order and chaos, while somewhat Manichean, are also somewhat interesting. Several chapters in the first half of the book represent a call to action and to personal responsibility, and in places it becomes effective motivational writing which might prompt one to be a slightly better person, though the advice is rarely concrete enough to put directly into practice. Although he often frames things in rather grim zero-sum terms, and fixates rather unhelpfully on social dominance hierarchies, the overall focus is on improving oneself, which is an admirable goal, and an important aspect of achieving any other worthwhile goal. Underlying all this, however, is a deep misanthropy and Social Darwinism, of which there are hints in the first half, and which come to dominate the second half. It is not exactly that he is wrong about human nature; it is actually that he is probably right about many of humanity’s shortcomings, and yet that does not justify miring oneself in them.

Peterson claims that “What you aim at determines what you see,” and this is true. He argues that not only goals but also tools become extensions of the self, and lenses through which one views the world (his views on tools are, as far as I can tell, straight from Heidegger). Yet he came to his own search for meaning first through the threat of nuclear annihilation during the Cold War, then through a thorough investigation of the suffering in the Soviet Gulags and the Nazi concentration camps, eventually throwing a few school shootings into the mix (he has a preoccupation with Columbine). It is not that these topics should be excluded from one’s knowledge; education on these topics is certainly critical to preventing their occurrence in the future, and it is almost a moral obligation to learn about them. However, this is hardly the place to start a search for meaning, much less to form one’s worldview or to find a set of rules by which to live one’s life. He tries to ground his philosophy in the reduction of suffering, and he is right that suffering teaches important lessons. Most of the world religions made this point two and a half thousand years ago. But his emphasis on suffering shows a rather fatalistic lack of hope for its amelioration. Buddhism’s first claim, as Peterson points out, is that life is suffering, but he totally ignores the fact that the three remaining Noble Truths are about how to understand suffering and how to end it. Peterson seems to stop at “Life is suffering, so let’s maybe try not to increase it,” which is not a particularly helpful position.

From a clinical psychologist, you might expect the profound empathy and painfully accurate insights into human relationships offered by such an expert as Esther Perel, but far from showing any understanding of the difficulties people face, Peterson focuses almost exclusively on people’s frailties. His misogynistic musing on the causes of failed marriages is particularly repellent. Peterson’s refusal to accept excuses works well in his clarion call for readers to take responsibility, but comes off as victim-blaming when he looks at why relationships fail. This accusatory style runs throughout the book. Let me quote Peterson both to give an example of this style and to try to point to the tension that I feel damages the book:

But Christ himself, you might object, befriended tax-collectors and prostitutes. How dare I cast aspersions on the motives of those who are trying to help? But Christ was the archetypal perfect man. And you’re you. How do you know that your attempts to pull someone up won’t instead bring them—or you—further down? Imagine the case of someone supervising an exceptional team of workers, all of them striving towards a collectively held goal; imagine them hard-working, brilliant, creative and unified. But the person supervising is also responsible for someone troubled, who is performing poorly, elsewhere. In a fit of inspiration, the well-meaning manager moves that problematic person into the midst of his stellar team, hoping to improve him by example. What happens?—and the psychological literature is clear on this point. Does the errant interloper immediately straighten up and fly right? No. Instead, the entire team degenerates. The newcomer remains cynical, arrogant and neurotic. […] The delinquency spreads, not the stability. Down is a lot easier than up.​

Peterson is something like the newcomer in his own parable. His fixation on life’s suffering, human weakness, violence, and dominance represents a downward gaze that is not particularly helpful for improving one’s life. To put such emphasis on life’s hardships, and especially on social dominance hierarchies, seems to me likely to increase aggression and competition, and generally to make one a worse person. While reading this book I perceived more hostility and threats from others, and I felt more competitive than I normally do, and not in a particularly helpful way. Peterson has clearly dwelt deeply on the darkness of humanity and on his own darkness. This is a side of human nature which any critical thinker, possibly any moral human, will have to grapple with. But if you want to improve your life, your time is better spent elsewhere.

Mrs. Dalloway
Mon, 09 Jul 2018

It is difficult to pinpoint exactly why Mrs. Dalloway is so heartbreaking. Some of it is undoubtedly down to its manner of plumbing the depths of time, the way in which those strong moments of life, of violence and of youth, of youth’s violence, can so resolutely stand time’s test, can so indelibly inscribe one’s present, can last so long into age, can remain more real to us than our everyday existence; more familiar, sometimes, those faces from one’s youth, than the deepening lines in the mirror make us to ourselves. It conveys precisely how certain acquaintances can be cut for years, even decades, without their bonds on us weakening in the slightest. One might be forgiven for assuming, as I had, that this manner of recollection would come across as stream of consciousness. But the novel’s profound consciousness is not streamlike. As a cursory introspection into one’s own mind reveals, there is really nothing continuous about it at all, and it is its discontinuity, and, in sensitive and damaged individuals, its total susceptibility to environment, that is its defining characteristic. The superficial continuity hides an infinite well of stored impressions that might surface unbidden at any moment. A sudden bell, a sudden memory, a sudden recollection of enmity, can throw the most ordered thoughts into chaotic disarray, into unexpected joy or into paroxysms of regret. Or even both, at the same time. The recollection of one’s own ugliness can overwhelm one’s mind to the point that one resolves to concentrate it upon any other object, to affix it to anything just until the next pillar-box; and yet one’s thoughts fly on, masterless, bounding, and one only notices that they have gone once one has long since passed the pillar-box.
It is this susceptibility to mental projections and physical environs, sensitivity to one’s thoughts and to the world, and above all a cataclysmic lack of concentration that characterises not only Clarissa and Peter, Septimus and Rezia, but modernity itself. Each character struggles against not only an ever-quickening torrent of sensory stimulus, but also against an uncontrollable inner monologue, a deluge not only of present bustle but, in times of stillness, an uncontrollable vacillation into the future, to death, and even more frequently back into the recesses of the past, into the knowledge that not taking roads never leaves them fully untaken, and into the ever-present spectre of the previous century, which often looms larger than the more recent war. The pitiability of available defences against this onslaught, like a Franco-Prussian battlement against a First World War howitzer, is of course most agonisingly illustrated in Septimus’ shell shock—six months’ rest at best, total denial of any problem at worst, but the recommendations are wholly incommensurate with the problem. It is also evident in the pathetic fortifications erected by Miss Kilman’s religion against her own vicious and compulsive self-flagellation. Nor are the rich spared; at the party one can see the deleterious lack of mental reserve in Peter’s judgement and impatience, in Clarissa’s wild oscillations of mood, and in Sally’s painfully pertinent question, near the end, in the course of seemingly inconsequential gossip: “Are we not all prisoners?” The lack of useful tools in the face of modernity is as evident here, in 1925, as it would be half a century later, when Pink Floyd would write that “hanging on in quiet desperation is the English way,” or as it is now, in any article about the pernicious effects of today’s distractions. The twentieth century had already stretched human nature to the breaking point, and in many cases past it, to say nothing of the present.
And yet for all this, it is not a depressing book to read. Painful reserve, even, has its painful beauty, as when Richard, bringing flowers, finds himself wholly unable, in the most British scene imaginable, to tell his wife that he loves her. Painful openness too, when Clarissa, amidst the roar, realises that this “was what she loved; life; London; this moment of June.” Although it sometimes errs on the painful side of painfully beautiful, it is nonetheless beautiful. Read it with whatever guard you’ve erected down.
Bullshit Jobs
Fri, 06 Jul 2018

First, David Graeber’s Bullshit Jobs is an extremely pleasurable read, and you should read it, if nothing else for the accounts of the utterly useless things people have been employed to do. The book was born out of the storm that greeted Graeber’s 2013 article “On the Phenomenon of Bullshit Jobs”. The premise is simple: in 1930 John Maynard Keynes predicted that, given the pace of mechanisation and technological advances, by the end of the century the world would enjoy a 15-hour work week. Given the endless, inescapable, invariably tedious discussions of automation and AI, why hasn’t this happened? The short version of this book is: it has. The reason that it doesn’t appear to have happened is that the remaining twenty-five hours (or in more dire new-world cases, sixty-five hours) have been filled with unnecessary admin and bureaucracy, with some of the worst jobs (from the soul’s point of view) concerned exclusively with increasing that burden. Sound fanciful? The argument is premised mostly on empirical data, self-reported by the people actually doing these jobs. (It also lines up well with discussions I’ve had with people in many industries.) Polls in the UK and the Netherlands have shown that 37 to 40% of people do not believe, by their own estimation, that their job contributes anything useful to their company or to society. How can this be? Isn’t this impossible under capitalism?

The book, using an array of amusing, shocking, and tragicomic anecdotes, makes the case that many jobs deemed to be bullshit by the people working them exist for reasons other than economic efficiency or expediency. It must be understood that “bullshit jobs” does not refer to unpleasant jobs, almost all of which are extremely necessary and not bullshit at all: cleaners, plumbers, sewage workers, and so on, virtually never rate their own jobs as “bullshit.” Nor is it a value judgement on whether other people’s jobs ought to exist or not. It uses the workers’ own testimonies, which seems legitimate (who would claim that their job is bullshit if it’s actually useful?). And it turns out that a significant proportion of middle managers, administrators, and IT workers report that their jobs are either entirely meaningless or have large meaningless components to them. The book examines the increase in administration and bureaucracy of jobs like nursing and teaching—which it calls the “bullshitization of real jobs”. As an example, universities now employ something like triple the administrative staff that they did in the 1960s, but far from reducing the administrative burden on professors and students, they have drastically increased it.

Based on hundreds of reports, Graeber divides the jobs into five categories. The first is “flunkies,” who are there to make other people look important. This can happen in tiny operations hiring idle receptionists just to be taken seriously, or it can happen in big corporations whose upper management can hardly seem very senior without big teams working beneath them. It often happens because people are hired before it is precisely determined what they will do. The second category is “goons,” who exist primarily to keep up with competitors who also employ them, like telemarketers or corporate lawyers. Their job involves some level of aggression that would not be needed if everyone just stopped doing it. “Duct tapers” are cheap labour hired into permanent roles to patch, over and over, problems that could be solved for good by addressing their root causes, which happens often in IT—but can also happen in areas like customer service, where people are employed to apologise for things being broken rather than to actually do anything about it. “Box tickers” are responsible for keeping up the appearance that certain regulations are being met, when in fact they are not (think compliance departments). “Taskmasters” apply mainly to middle management. He divides them into two categories, the merely useless ones who are like reverse flunkies (i.e. their teams did their jobs just fine before they were hired), and the more sadistic type who invent bullshit, exact punitive timekeeping or performance metrics, and so on. In my experience the most benevolent people in these positions believe their job is mainly to shield their teams from bullshit coming from above.

The book makes a convincing argument that there is an inverse relationship between how highly one is compensated and how directly useful one’s job is. The remaining chapters are devoted to why people in such positions find it so soul-destroying, what it is like to work them, why they are proliferating, why society has not objected to the situation, and finally to what might be done about the situation. Along the way Graeber discusses manifold other topics, including what happened to the apprentice-journeyman-master system, how global finance might be seen as a new type of feudalism, and how the dominant labour theory of value lost out to a largely silent capitalist coup in the late nineteenth century. There is far more in this book than I’ve summarised here, but it has a lighter tone and is less formidable and more comical than Graeber’s excellent book Debt: The First 5000 Years, which I also recently read. Overall it’s an entertaining explication of a very real problem, and it is well worth reading.

The Enchanted April

“Enchanting” and its close cousin “charming” are apt words for Elizabeth von Arnim’s novel The Enchanted April. It’s an outwardly unassuming meditation on how one’s surroundings can change one’s mind, and gives a fair amount of early (1922) insight into British attitudes towards the rigidity of society, as well as into the ameliorative effects of holidays (it may even give some insight into today’s music festivals). The relaxation of strictures and class stratification endows not just the sunny destination but even the act of leaving England with an enchanting quality that slowly but surely changes its characters. In the book, these qualities actually line up well with the American philosopher William James’ categorisation of mystical experiences. First, he calls them ineffable, and indeed the characters have a difficult time putting into words precisely what is happening to them or what it is about the setting that is quite so transformative; they merely keep repeating the name of the place, “San Salvatore,” which doesn’t really explain anything, though one gradually gathers an empirical understanding of its meaning. Second, he calls them noetic, meaning that they seem to reveal truth. Most of the characters feel that something inside them is awakening which is more true than their previous lives. Third, they are transient, and cannot be sustained for long. Although here the experience lasts a month, the characters worry that the effects will dissipate on their return to London. Fourth, they are passive. Certainly Mrs. Wilkins and the others feel that it is the environment acting upon them rather than vice-versa. In other words their experience is a retreat of sorts, leading to a quasi-religious transformation: Mrs. Wilkins becoming saint-like, a more powerful harmony with nature, raptures of gratitude, the dissolving of old selves.
This is an interesting representation of the effects of a new environment, a kind of primeval British holiday, providing what holidays were invented to provide: not just a break and refreshment, but rediscovery, renewed vigour, and a new love for life, caused by the beauty and unfamiliarity of a new place, which one hopes will persist and bleed over into the everyday. It must also be said that certain scenes in this book are absolutely hilarious. An enchanting read.

How to Change Your Mind
Fri, 29 Jun 2018

A few weeks ago I was fortunate to see Michael Pollan talk about his new book, How to Change Your Mind. He was interviewed by the author Zoe Cormier at a co-working space called Second Home in East London. Pollan is best known for books on food, including the excellent Cooked (2013), the first book of his that I read (and reviewed here). This led me to his earlier books The Omnivore’s Dilemma (2006) and The Botany of Desire (2001). Pollan views himself not strictly as a food writer, but as having written on food out of a broader interest in the ways in which humans interact with nature; it just so happens that agriculture is one of the most consequential ways that we do so. His earlier books were provocative and mind-opening; they changed what I ate and how I cooked. His new book seeks to open vistas of the mind in a different way. Ambitiously subtitled “What the New Science of Psychedelics Teaches Us About Consciousness, Dying, Addiction, Depression, and Transcendence,” the book largely delivers on its wide remit, and I would recommend it to anyone, regardless of prior interest in the topic.

The book is perhaps foremost a history of the so-called “psychedelic” drugs, and specifically of what are known as the “classical psychedelics.” The term “psychedelic” was coined by the psychiatrist Humphry Osmond in 1957, to mean “mind-manifesting”. This term caught on, unlike competing names for these drugs, which included “psychotomimetic” (imitating psychosis) and “psycholytic” (mind-loosening). The best-known examples of classical psychedelics are LSD and magic mushrooms—mushrooms of the genus Psilocybe, also called psilocybin mushrooms after one of their psychoactive molecules. The class also includes potentially less familiar drugs like ayahuasca and DMT. Although some researchers opt to include MDMA (the amphetamine found in ecstasy), Pollan does not, as it operates quite differently from the others, which all act as strong partial agonists at specific serotonin (5-HT2A) receptors. The classical psychedelics have effects which are similar enough to one another to discuss them as close equivalents.

Osmond, who came up with the name “psychedelic”, is just one of a plethora of interesting characters described in this book. Not only was he the man to give Aldous Huxley the dosage of mescaline immortalised in his essay “The Doors of Perception” in 1954—after which the band The Doors was named—but he also gave British politician Christopher Mayhew mescaline in a session recorded with the intention of broadcasting it on the BBC’s Panorama in 1956. (Ultimately it was deemed too controversial to air, but you can read the amusing transcript here.) The drugs’ powerful effects seem to have attracted, inspired, and sometimes permanently altered many brilliant and unusual minds, from the moment they were introduced to the West some seventy years ago to the present day—with a rather long break in the middle due to aggressive legislation.

Few are likely to know the early history of psychedelics, or how promising and successful they had been in the psychiatric practice of the 1950s. A good portion of the book is devoted to covering the many high-quality studies performed with the drugs in that decade, before the exuberance of some (in particular Timothy Leary) led to the drugs being banned. This was a history mostly unknown to me, and apparently to Pollan, who discusses the precipitous rise and ignominious fall of this research at some length. Unlike much of the colourful cast that he depicts, Pollan himself was, when he began writing, inexperienced with psychedelics, and in fact quite hesitant to try them. Having been born in the mid-1950s, Pollan had come of age in the wake of the all-out offensive that characterised the start of the War on Drugs. Nixon called drug abuse “public enemy number one” and went so far as to single Leary out as “the most dangerous man in America.” The message was stark enough that many of Pollan’s generation took the terrifying rhetoric at face value, believing the effects of psychedelics to include madness, blindness, suicide, and other calamitous consequences. Misconceptions fuelled by the early propaganda persist, even among the medical and psychiatric communities, to this day. So whence his interest? It was piqued after reading a 2010 article in the New York Times which described NYU and Johns Hopkins studies using magic mushrooms. These studies sought to determine whether the mushrooms might be able to alleviate “existential distress” in patients who had been given grave or even terminal cancer diagnoses. Given his impressions about the psychological dangers of this class of drugs, Pollan was surprised to learn that psychedelics were being given to patients already in the grip of mortal terror.

As it happens, a man named Patrick Mettes, diagnosed with terminal cancer of the bile ducts, read precisely the same article in 2010, and managed to get himself a place in the NYU study. What the study showed was that a single high dose of mushrooms, in a careful session with two therapists, could reliably alleviate the patients’ terror at facing imminent death. These sessions were not “psycholytic”—an earlier term reflecting the fact that, in low doses, the drugs can make one more amenable to traditional psychoanalysis or psychotherapy. Rather they involved high doses, and the therapists provided instructions and something more akin to moral support than therapy, not intervening during the trip unless the patient was in distress. (This applies strictly to mental distress; the drugs are remarkably non-toxic and, as with cannabis, there is no known lethal dose.) The patients thereafter underwent their own internal journeys, accompanied by music and shielded from the perceptual distractions of the outside world by eyeshades. They ventured into their minds, into their fear, and, often, into their cancer—and emerged, not terrified, but miraculously relieved. Pollan’s account of Patrick’s transformation, from a man in the depths of existential anguish to a radiantly beaming man consoling his own therapists, is incredible. Two months after his treatment, and about a year before his eventual death from his cancer, Patrick reported being the happiest he had been in his life.

As a result of this powerful story, Pollan wrote a piece in 2015 in the New Yorker, which seemed to hit a nerve. Pollan made plans to expand his inquiry into the history and efficacy of these drugs, and along the way, resolved, not without reluctance, to take not only magic mushrooms and LSD, but potentially less familiar drugs in the same class like ayahuasca and the extremely potent 5-MeO-DMT (procured from the dried venom of the Sonora desert toad).

So what do the classical psychedelics do? The question turns out to be difficult to answer concisely. Their popular image, as producing powerful visual hallucinations, while not inaccurate, does not do justice to their extremely varied effects. Stanislav Grof, a Czech psychiatrist who was important in the drugs’ early therapeutic studies during the time when they were legal, called them “unspecific amplifiers”, meaning that they would intensify emotional or mental processes indiscriminately. This is very different from most other psychoactive drugs, which tend to produce relatively predictable effects. More recently, Roland Griffiths showed in 2006 that psilocybin mushrooms can reliably occasion mystical experiences in “healthy normals”, rather fascinatingly producing experiences that rank among the most meaningful in healthy subjects’ lives (comparable to the death of a parent or the birth of a child). His TEDMED talk is well worth a watch. Even newer research by addiction psychiatrist Judson Brewer intriguingly showed that fMRI scans of those on psychedelic drugs look very similar to those of advanced meditators, with both exhibiting reductions of activity in the “default mode network.” The decrease of activity in this network may be central to the drugs’ effects, which is critical, as there is increasing evidence that this network is implicated in mental illnesses like anxiety and depression. The DMN, as it is abbreviated, is strongly involved in producing one’s sense of self, in memories and forecasting, in judging social situations, and in imagining what others might be thinking. This leads on to one of the most fascinating accounts of what these drugs do, in a paper by a researcher at Imperial College London named Robin Carhart-Harris.
This paper makes a strong and utterly fascinating case that the psychedelic state may be similar to childhood consciousness, and even that it may be a sort of polar opposite to the type of consciousness found in depression, addiction, and obsession. I cannot do the paper justice here; Pollan’s summary is tantalising, but the paper itself is one of the most fascinating I have read.

To give an overview of the drugs’ history, LSD was discovered in Switzerland in the 1930s, manufactured in the 1940s, in wide medical use in the 1950s, part of the culture in the 1960s, and banned in the US in 1971. Magic mushrooms were used for hundreds or even thousands of years in Southern Mexico, but had been suppressed (though never wholly eliminated) by the Spanish missionaries. In one of the many surreal stories in this book, these mushrooms were first brought to the attention of the world by J.P. Morgan banker R. Gordon Wasson, following a disagreement with his Russian wife over the safety of wild mushrooms. They noticed that different cultures tend to exhibit either mycophilia or mycophobia (love or fear of mushrooms), and this discussion ultimately led to the couple becoming expert mycologists. After hearing rumours of ancient Mexican mushroom ceremonies still taking place, in 1955 Wasson travelled to Mexico and was allowed to partake. Two years later he published a widely-read account of his experiences in Life magazine. This led to an invasion of the previously secluded community first by beatniks and later by hippies, not to mention Bob Dylan, Mick Jagger, and John Lennon, and later to the identification of other Psilocybe mushrooms (of which there are dozens of varieties growing all over the world). Their active compounds psilocybin and psilocin, interestingly, were identified in 1959 by the same Swiss scientist—Albert Hofmann—who had discovered LSD.

Most will have some impression of the role these drugs went on to play in the heady days of the late 1960s, and Pollan treats this period in illuminating detail. One of his most interesting contentions is that the psychedelic trip, when first embarked upon by the youth of the 1960s, was not just a new rite of passage, but a new type of rite of passage. Normally, he argues, these rituals (like a bar mitzvah) are sanctioned by adults, and represent a transition from adolescence into adulthood. In the 1960s, young users of LSD and magic mushrooms in the West were venturing into territories totally uncharted by the older generation. Moreover they were entering this terra incognita without any moral framework or therapeutic oversight. Guidance and ritual had always accompanied magic mushrooms when taken in Southern Mexico, as with most other pre-modern uses of psychedelics, as seems, for example, to have been the case in the Eleusinian Mysteries of Ancient Greece. Lacking such a cultural context, Pollan argues, the youth of the 1960s were not only entering states of which their elders knew nothing at all, but they were doing so in a way that would make them even more susceptible to the potential dangers and often overpowering effects of these substances. All of this combined to create a great deal of fear.

Nixon’s reaction was cataclysmic, and the effects of the War on Drugs are of course being felt throughout the Americas and the world to this day. However, psychedelics show signs that they may, in the near future, become legally available in the US. Unlike marijuana, whose widespread legalisation was driven by populist movements, psychedelics look more likely to be approved through the traditional FDA route, and quite possibly used to treat depression. Rick Doblin has already been successful in trialling MDMA for the treatment of PTSD, and his organisation MAPS (the Multidisciplinary Association for Psychedelic Studies) is devoted to the legal study of psychedelic drugs. Other studies have shown psilocybin’s efficacy in helping people quit smoking, with a success rate currently much higher than any alternative. All of this new research is happening alongside a re-examination of the old research, for example studies showing evidence that LSD can help alcoholics stop drinking; however, these studies were often not done to today’s standards, in particular failing to be double-blinded. Surprisingly, the powerful and obvious effects of psychedelics still present methodological problems for science today.

Several chapters of the book are devoted to Pollan’s own “trips.” One of the hallmarks of the psychedelic experience, which by William James’ criteria is indistinguishable from mystical experience, is that it is ineffable. Pollan jokes that he has nonetheless “effed” them, and indeed he has done so with verve and clarity, despite the inadequacy of retelling such experiences, which inevitably gives the sense of having done them injustice or even violence, and despite the added danger that powerful experiences can turn to platitudes on the page. I won’t further diminish Pollan’s experiences with any summary of my own, but can highly recommend reading the book. I can also recommend his appearances on NPR, on Sam Harris’ podcast, on Science Friday, and perhaps especially on Tim Ferriss’ podcast, all of which are fascinating. Pollan’s books have the great virtue that all of them are likely to change your mind; this one lives up to its title.

Flourish (2011)
28 June 2018

Psychologist Martin Seligman’s Flourish is a strange book, in that it does not deliver on any of its promises, and yet somehow remains enjoyable. You would be forgiven for assuming, given the book’s rather bold opening that “This book will help you flourish,” that the book will in fact help you flourish—which it does, sort of. The early chapters, after promising but not delivering many practical exercises, seem to imply that the book will instead summarise developments in positive psychology beyond its original scope of “authentic happiness”—which it does, sort of. The rest is part intriguing memoir, part summary of where psychology and philosophy went wrong in the twentieth century, and part discussion of the military and education. If this sounds like a strange mixture, it is. And yet the writing remains engaging, and the book does actually give some practical advice about how to incorporate gratitude, better listening skills, and activities which are orientated towards character strengths and accomplishment, into one’s life. Because of what appears to be a lack of editorial guidance, going into it expecting to learn anything specific is likely to lead to disappointment. But if you pick it up, as I did, with an open mind and no expectations, you may find quite a few provocative facts and perspectives.
The 100-Year Life (2016)
20 June 2018

The basic argument of The 100-Year Life, by psychologist Lynda Gratton and economist Andrew Scott, is that not enough is being done to adapt to increasing longevity. After a quite interesting chapter on how drastically longevity has changed (US life expectancy in 1900 was under 50!), the book sketches out in some detail archetypes from the baby-boomer, gen-X, and millennial generations, imagining how their lives might play out. As is probably obvious, the younger generations face increasingly insurmountable difficulties if they try to stick to the typical education/single-career/retirement path (the three-stage life) that worked very well for the baby boomers, who could pick any career, stick to it, invest in virtually anything, and come out with a house, savings, and an irritating sense that they had somehow been rewarded for their wisdom and moral virtue.

That said, the section on “juvenescence,” whereby youthful features, behaviours, and even looks are persisting longer and longer in adults, was somewhat illuminating. This phenomenon is nothing new—it was also observed (in comparison with fin-de-siècle Vienna) by Stefan Zweig in his 1942 memoir, but here it gets a more scientific treatment. The description of how relationships will change, and how labour, childcare, and earnings may alternate between men and women was also interesting and probably true.

The problem with the book is that it feels as though it were written either to guide public policy or to explain to the older generations why the younger generations are behaving differently and making different choices than they did. It gives little practical advice to the younger generations about what to do, besides platitudes about remaining flexible, gaining new skills, and being prepared for career changes—all of which, as the book observes, the younger generation are already doing. Obviously there’s huge uncertainty about what will happen in the coming decades, so it would be unfair to expect the book to provide particularly concrete guidance, but I think that much of what’s written here is probably already obvious to millennials. It might, however, have some explanatory power for older generations bewildered by the youth’s postponement of traditional milestones, as it argues that, far from being irresponsible idlers, millennials are quite conservative planners, which I think is right.

The book is also quite repetitive—as I was actually told by the person who recommended it to me—and probably just reading a summary would have been enough, but I had hoped that it would provide some inspiration or hope in a time of my own personal uncertainty about life choices. The idea is interesting, and the book does maintain a hopeful tone, but for me it remained more a description of the lack of societal adaptation to changing conditions than a guide for how to navigate the new uncertainties.