Book Review

Current Issue

Dietrich Bonhoeffer: German theologian and resister

“Cheap grace is preaching forgiveness without requiring repentance, baptism without church discipline, Communion without confession. … Cheap grace is grace without discipleship, grace without the cross, grace without Jesus Christ, living and incarnate.”

“The time is fulfilled for the German people of Hitler. It is because of Hitler that Christ, God the helper and redeemer, has become effective among us. … Hitler is the way of the Spirit and the will of God for the German people to enter the Church of Christ.” So spoke German pastor Hermann Gruner. Another pastor put it more succinctly: “Christ has come to us through Adolph Hitler.”

So despondent had been the German people after the defeat of World War I and the subsequent economic depression that the charismatic Hitler appeared to be the nation’s answer to prayer—at least to most Germans. One exception was theologian Dietrich Bonhoeffer, who was determined not only to refute this idea but also to topple Hitler, even if it meant killing him.

The scene may feel familiar: an out-of-touch Republican establishment, bitter debates over gender roles, an angry populist rebellion. In “Divided We Stand,” Marjorie Spruill describes a polarized America that will be recognizable to any consumer of today’s news. Her story is set, though, in the 1970s and depicts, in the words of the book’s subtitle, “the battle over women’s rights and family values.”

Book Review

Past Issue

Death by Government

The line, “I’m from the government and I’m here to help you,” takes on new meaning after reading R.J. Rummel’s devastating Death by Government. This century, estimates the University of Hawaii political scientist, the State has killed almost 170 million people.

“Death by Government is a compelling study of what the author calls democide—the intentional killing by governments through genocide, politicide, massacre, and terror. . . . A product of eight years of research by a distinguished political scientist, this is an unrivaled magnum opus with dozens of tables, figures, copious notes, and a massive bibliography. Essential reading for historians, political scientists, and readers interested in genocide.”

Book Review

Past Issue

Self-Control or State Control? You Decide

“Life is full of difficult choices. It requires courage to take responsibility for your own life and carve your own path. However, that’s also the best way to ensure a successful future and a prosperous, healthy society. If I were a young person pursuing a productive and rewarding life, I’d buy a copy of this book and study it carefully.” —John Mackey, Co-Founder and Co-CEO, Whole Foods Market

Book Review

Past Issue

Hoodwinking the Nation

Most people in the US believe that our environment is getting dirtier, we are running out of natural resources, and population growth in the world is a burden and a threat. Why do the media report so much false bad news? Why do we believe it?

Julian Simon saw this as the work of, and went after, a media-academic complex that he believed purposefully, organically, structurally, and ideologically purveys “false bad news,” specifically about the environment, resources, and population.

Book Review

Past Issue

Single-Payer Socialized Medicine Being Debated

From “Introduction to Socialism”:

What is the difference between socialism and communism?

Socialism and communism are alike in that both are systems of production for use based on public ownership of the means of production and centralized planning. Socialism grows directly out of capitalism; it is the first form of the new society. Communism is a further development or “higher stage” of socialism.

From each according to his ability, to each according to his deeds (socialism). From each according to his ability, to each according to his needs (communism).

The socialist principle of distribution according to deeds—that is, for quality and quantity of work performed—is immediately possible and practical. On the other hand, the communist principle of distribution according to needs is not immediately possible and practical—it is an ultimate goal.

Obviously, before it can be achieved, production must reach undreamed-of heights—to satisfy everyone’s needs there must be the greatest of plenty of everything. In addition, there must have developed a change in the attitude of people toward work—instead of working because they have to, people will work because they want to, both out of a sense of responsibility to society and because work satisfies a felt need in their own lives.

Socialism is the first step in the process of developing the productive forces to achieve abundance and changing the mental and spiritual outlook of the people. It is the necessary transition stage from capitalism to communism.

Shrinks: The Untold Story of Psychiatry

As a history of a major medical specialty and a major public health problem, Dr. Jeffrey Lieberman’s Shrinks: The Untold Story of Psychiatry raises some disturbing questions. The field of psychiatry is based on a patient’s history and behavior, though histories can be creative, and behavior is typically interpreted in the eyes of the beholders, often with their own agendas. Where is the science? In an age of laboratory tests, functional MRIs, genetic markers and even pathology slides, none exist for psychiatric diagnoses. We look, we listen and we try out medications, often based on a “best guess.” The current interventions, such as electroshock and transcranial stimulation, may be effective, but the underlying science remains elusive.

Lieberman chairs the psychiatry department at Columbia University School of Medicine and is eminently qualified to write this story. He has watched and participated in the evolution of the field from a Freudian-based psychoanalytic specialty to an intervention-based practice that can now identify and treat some of the most impaired people in our society.

Shrinks is divided into three parts: one historical, one on therapeutic advances, and one about the battle between the traditional Freudian analysts and the new wave of psychiatrists.

The first part begins with a review of pseudoscience, fraud and greed. People who are aware that “something is not right” will often search for answers wherever they can, and when no good answers are forthcoming, they look for promises from people who are good salesmen with dubious credentials. Mesmer’s animal magnetism, Reich’s orgone accumulator, and Amen’s SPECT scans all offered promises of diagnosis and treatment that proved unfounded. The sole beneficiaries were the inventors themselves.

Sigmund Freud—a classically trained neurologist who studied under Jean-Martin Charcot, the founder of modern neurology—developed a theory of internal conflicts as the basis of mental illness. By slowly identifying these conflicts, much of the disturbed behavior could be treated or at least understood. This, of course, couldn’t apply to people with severe degrees of illness. They were left to the backup system, which consisted of mental institutions, i.e., the madhouses. Through the first half of the 20th century, those were the two major options. . .

A surgical technique to cure mental illness by cutting out a portion of the frontal lobe—the infamous Ice Pick surgery—was performed thousands of times, earning developer Egas Moniz a Nobel Prize in medicine. The result was a placid but hardly normal person.

Violent behavior—felt by some psychiatrists to be a focal neurologic condition related to temporal lobe epilepsy and amenable to localized surgery—was the basis of Michael Crichton’s novel The Terminal Man. Crichton took care of these patients while a medical student in Boston, and I inherited them when I arrived as the neurology resident. The procedure involved an implanted electrode that could be used as a micro-cautery to burn out the irritable focus. The patients became less violent but also passive and paranoid. Surgery certainly hasn’t given us an answer to mental illness. For a review of the thinking at that time, Violence and the Brain, by Vernon Mark and Frank Ervin, is still available.

Societies have always had outliers. The treatment of these people has varied, with some burned at the stake, some exiled, some shunted into religious orders and others merely locked up. The idea of locking up people for a behavioral abnormality, and not because of breaking a specific law, led to a strong antipsychiatry movement in the 1960s and 70s, led by a prominent psychiatrist named Thomas Szasz. His books The Myth of Mental Illness and Law, Liberty and Psychiatry were confrontational to the standard thinking at the time. As one of his students, I had to carefully distance myself from him when taking my oral boards in neurology and psychiatry. During the 1970s, psychiatry still had significant input from the psychoanalysts and traditionalists.

Psychiatry has evolved. With the development of medications for the most severe forms of mental illnesses, the need for locked institutions with up to 20,000 patients has passed. Of course, putting these people on the street under the care of community clinics, where they are largely unsupervised, means that a large portion of them no longer take their medications. Many homeless people would have been residents of the state hospitals in the past. Those hospitals were closed more for economic reasons than because better treatment was available in the community.

The historical section of Shrinks is a fascinating tale, though a little long on the details of Freud’s circle of professional friends and their often petty feuds. As most readers are not psychiatrists, these details seem unnecessary. More information as to what actually transpired in the madhouses during different eras might be more enlightening. “One Flew Over the Cuckoo’s Nest” is the only reference most of us have for those institutions. Having trained at one of the madhouses, Creedmoor State Hospital in New York, I could relate to psychiatric hospitals as alternative jails with no need for a judge or jury. The signature of a junior resident, whose first language was often not English, was enough to have you locked up for seven days. If two staff members signed off on it, you could be held for 30 days!

The second section of Shrinks—on advances in treatment over a relatively short period of time—is the core of the book. The ability to treat psychosis and severe depression has been lifesaving for a huge number of people. It has allowed people with significant mental illness to function independently and productively, which was unheard of in prior eras.

Modern psychiatry has identified post-traumatic stress disorder as a real illness and recognized severe depression as a potentially fatal disease. Diagnosis and treatment is the core of how medicine is practiced. Psychiatry is finally catching up. Modern medicines work more often than not, even if we don’t quite know how. They are safer than their predecessors and cover a wider range of conditions. This review of modern treatments is the most enlightening portion of Shrinks.

The battle between the traditionalists and the new wave of psychiatrists—exemplified in the adoption of the DSM coding system—takes up the last portion of the book. Lieberman was involved in the process, and as the president of the American Psychiatric Association, knew all the fine points being argued. Essentially discarding the entire psychoanalytic school took courage and a lot of political maneuvering. . .

In summary, the road from madhouses to modern treatment of mental illness has often been rocky, but it is well mapped in Shrinks by a psychiatrist who watched it evolve and helped push it forward. Lieberman doesn’t mind stepping on toes when needed, including a few of my old teachers and colleagues. ::

Book Review

Past Issue

CURRENT BOOKS | Lab Girl | By Hope Jahren | Knopf, 2016

Reviewed by Jeff Sugarman, MD

Anyone who has worked in a research laboratory knows the frustration of failure. Experiments go awry for good reasons—such as poor experimental design or an improper hypothesis—and for trivial ones, such as a power outage, measurement error or improper permits. Researchers, be they graduate students or professors, also know the thrill of discovery.

Reading Lab Girl, a memoir by Hope Jahren, PhD, brought me back to the experience of knocking my head against the wall in the lab while preparing my own PhD. (We both finished our PhDs in 1995, hers in soil science at UC Berkeley, mine in molecular biology at UC San Diego.) Her words about the joy of having knowledge that no one else in the world has will resonate with anyone who has done experimental research.

The three sections of Lab Girl—“Roots and Leaves,” “Wood and Knots,” and “Flowers and Fruit”—comprise a memoir of Jahren’s life and scientific career, along with a lovely tour of plant biology. The memoir takes us from her childhood in a small Minnesota town, to her coming of age in graduate school at UC Berkeley, and finally to her academic success and acceptance.

By Jahren’s own account, she is a fanatically hard worker. She rightly acknowledges, but does not dwell on, the tediousness of lab work, observing that “like a lot of lab work that happens in the background, it wasn’t very interesting . . . but had to be done carefully and without error.” Much of the book traces the pain of her academic rejections at the hands of an impermeable good ol’ boys club in the male-dominated world of geobiology research. According to Jahren, the club initially rejected her ideas because of their novelty and her gender. In her view, the academic world can often be myopic and slow to change, and new ideas may be threatening or misunderstood.

Many chapters in Lab Girl start with lovely metaphors relating the plant world to the human realm. “The first real leaf is a new idea,” writes Jahren. “As soon as a seed is anchored, its priorities shift and it reflects all its energy toward stretching up. Its reserves have nearly run out . . . it has to work harder than everything above it.” This last observation could well refer to Jahren’s own struggles as a young scientist. . . .

The plant world for Jahren is full of beauty, and she has a gift for making this world more visible through her writing. Her power with words lies in finding accessible metaphors for the biological systems most of us witness daily but do not fully appreciate. “A vine’s only weakness is its weakness,” she observes. “It desperately wants to grow as tall as a tree, but it doesn’t have the stiffness necessary to do it politely. Vines are not sinister; they are just hopelessly ambitious.” . . .

‘None of us knew then that this was the last war America would cleanly, conclusively win.

We thought it was the last war ever.’ –Theodore H. White

The Imperial Japanese Navy started the Pacific War on Dec. 7, 1941, with airstrikes on Hawaii. But it was the Japanese army that carried the battle forward and won victories as startling as the first attack had been surprising. U.S. and British strategy in the Far East was based on holding citadels and controlling shipping lanes. Within months of Pearl Harbor, Japanese forces seized Singapore and the Philippines in lightning campaigns. In both cases, the Allied commander tried to defend too widely and failed to prepare adequately for a siege. Each was overwhelmed by the demands of modern warfare.

The man who lost Singapore, Gen. Arthur Percival, was resoundingly condemned in Britain—Churchill called the surrender “the worst disaster and largest capitulation in British history”—and spent three years in Japanese camps. By contrast, his American counterpart in the Philippines, Gen. Douglas MacArthur, received a presidential order to leave Corregidor, the besieged fortress that guarded Manila Bay, before its fall and was awarded the Medal of Honor. He is the subject of Walter R. Borneman’s “MacArthur at War: World War II in the Pacific,” which deftly tells the story of a controversial figure whose leadership was often flawed but who throughout the war was popularly regarded as the country’s greatest general.

The contrast between Percival’s and MacArthur’s fates was attributable to one thing. After more than two hard years of war, Britain could point to plenty of men who had served gallantly in the direst circumstances, from the beaches of Dunkirk to the skies above southern England. The U.S. could not. As the country reeled from defeat to defeat in the early days of 1942, the fact that MacArthur (1880-1964) was leading soldiers actually fighting the Japanese made him a hero. America needed a knight and “indulged in the hero worship of Douglas MacArthur,” writes Mr. Borneman: “In the first three months of 1942, Time magazine made mention of Admiral King seven times, General Marshall five times, Admiral Nimitz twice and General Eisenhower not at all, while news of MacArthur’s exploits appeared thirty-two times.”

Yet his leadership in the Philippines had been both incompetent and disgraceful. Having retired as U.S. Army chief of staff in 1935, he took up the baton of a field marshal in the Philippine army, charged by the Filipino president with building a military capable of defending the new commonwealth when it eventually achieved independence from the U.S. This was never more than a token force, despite MacArthur’s inflated claims. In July 1941, with war against Japan expected, he was recalled to duty in the U.S. Army and ordered to defend the Philippines with American troops. Army Chief of Staff George Marshall cabled MacArthur on Nov. 28, 1941: “The Secretary of War and I were highly pleased to receive your report that your command is ready for any eventuality.”

There were nine hours between the news of the attack on Pearl Harbor and the appearance of the first Japanese planes over the Philippines, but MacArthur failed to put his air forces on a war footing and suffered the destruction of his planes on the ground. In the ensuing months, he issued press releases touting his leadership but prepared little for the invasion to come. When it did, in late December 1941, he hid himself in Corregidor’s deep Malinta Tunnel complex, such that a man revered in World War I for almost suicidal battlefield courage became known as “Dugout Doug,” the object of a many-versed satire sung to the tune of “The Battle Hymn of the Republic.” (The refrain: “His troops go starving on.”) But Franklin Roosevelt wanted no martyrs—especially a man who had made plain his desire to be the Republican nominee for president. MacArthur escaped to Australia by PT boat. Even in defeat, he showed his self-absorption, happily accepting the Medal of Honor while squashing one for the man who remained to lead the final defenses of Bataan and Corregidor, Gen. Jonathan M. Wainwright.

Roosevelt sent MacArthur to Australia in the hope that his outsize presence would calm a country suddenly at war in its own backyard while its best divisions were fighting in the Middle East. The president’s decision to wage a “Germany first” war meant that some of the key battles in the Pacific were within the U.S. military, over scarce supplies and control of strategy. . . .

He was a lucky general, especially with his field commanders. The Army Air Corps’ George C. Kenney, in particular, was superb at making do with whatever planes could be scrounged and getting down to the job. He showed MacArthur that troops could be airlifted into battle, and his advocacy of “skipping bombs” across the water at enemy ships led to the victory at the Bismarck Sea, which ended Japan’s hopes of seizing the whole of New Guinea in March 1943. But MacArthur was luckiest of all in facing small and divided Japanese forces in the 21 months of fighting across New Guinea, from the Owen Stanleys to Buna, New Britain and Hollandia. It was a campaign of improvisation, which disguised MacArthur’s weak grasp of jungle warfare and tactics. . . .

MacArthur “wanted an attack,” writes Mr. Borneman, “in such force as would overwhelm the Japanese lines, reminiscent of the trench warfare of the last war.” But MacArthur’s pushing paid dividends: The swift strikes leapfrogging over Japanese positions that we know as “island-hopping” led to strongholds being isolated instead of assaulted and shortened the war. In every case, Kenney set up airfields and brought U.S. bombers that much closer to Japan. . . .

The question was the penultimate battle. Should it be for MacArthur’s Philippines or for Formosa, as Adm. King, overseeing operations from Washington, preferred? Which was the better staging ground for reducing the Japanese home islands? MacArthur rightly won the fight, and the landings at Leyte and Lingayen Gulf in late 1944 and early 1945 were among the largest Allied operations of the war. He was in his element as the conquering hero and lavished resources on liberating every stretch of the vast country. He won a further battle, too, wresting complete control in the Pacific for the invasions of Japan planned for November 1945 and March 1946. They were happily unnecessary after the atomic bombings in August. MacArthur took the Japanese surrender on the deck of the USS Missouri in Tokyo Bay on Sept. 2, 1945, with the gaunt figures of Percival and Wainwright standing directly behind. A starker contrast of fortunes is hard to conjure.

The shelves groan under accounts of the colorful MacArthur’s life. Yet Mr. Borneman’s book is notable for the commendable fairness with which he treats a general who inspires extremes of adulation or antagonism. If there is little new in “MacArthur at War,” it is nonetheless a first-rate account of its subject and an excellent history of the less-known half of the American experience in the Pacific. Even so, it is hard to imagine the reader who comes away from the book thinking Roosevelt did anything but a disservice to his country by not sending MacArthur off to some soon-to-be-forgotten post after he escaped Corregidor.

Book Review

Past Issue

The Inarticulate Society

Notable & Quotable: Florence King


From “Dan Rather and Other Enemies of Civilization” in the July 31, 1995, Journal, a review of Tom Shachtman’s book “The Inarticulate Society: Eloquence and Culture in America” by writer Florence King, who died Jan. 6 at age 80: WSJ | Jan 14, 2016

The book’s pièce de résistance is Mr. Shachtman’s sardonic tracing of the decline and fall of TV news, and how it has destroyed eloquence.

On Aug. 29, 1963, the “CBS News With Walter Cronkite” aired its first half-hour edition. Everyone sounds like an Oxford don, speaking in complete sentences with so many dependent clauses that they have to take a breath before the end. There is almost no action footage or graphics, and the uncreative commercials, mostly written testimonials, always parse.

The edition of Oct. 27, 1972, was Mr. Cronkite’s first lengthy perspective on Watergate. We see more graphics, visual aids and film; one-breath sentences now prevail, but they are still complete, except for the serpent in the garden, Dan Rather, who reports from the White House: “Nine vetoes today, more promised tomorrow.” . . .

On Nov. 9, 1989, “The CBS News With Dan Rather” features the collapse of German communism and English metaphor. “The Berlin Wall is still standing, but it doesn’t stand for much,” Dan begins, explaining against a backdrop of busy visuals that the world is “racing to stay ahead of the curve of history.” Cut to George Bush, who says, “I’m not going to hypothecate that it may—anything that goes too fast . . .” Then back to Dan for the final word: “The Berlin Wall is obsolete tonight.” The commercials are frantic, and the show ends with an invitation to join Dan later on “48 Hours” for a discussion of sex and teenagers. . .

His solutions are impossibly idealistic—hire only well-spoken baby sitters, give networks tax writeoffs for cultural programs that do not get high Nielsen ratings—but one at least filled me with venomous glee: “Among the first orders of business ought to be the abolition of teachers’ colleges and teaching degrees.”

Book Review

Past Issue

When Breath Becomes Air By Paul Kalanithi, MD

Review: Paul Kalanithi’s When Breath Becomes Air is an exquisitely moving exploration of mortality

SANDRA MARTIN

Special to The Globe and Mail

Paul Kalanithi’s memoir, When Breath Becomes Air, begins with a wallop. A neurosurgical resident in his final year of training, he’s examining the CT scans of a patient with Stage 4 lung cancer. He’s seen many such scans. This time, though, he is the patient, not the doctor. Wearing a hospital gown, tethered to an IV pole, Kalanithi reads his own death sentence by scrolling through the images on the computer screen a nurse has left in his room. . .

An insider in the mysterious and bewildering realm of hope and fear that represents modern health care, Kalanithi is in a position to enlighten us about how doctors die. Do they know something we don’t? Do they get access to earlier treatment, experimental drugs, better pain medication than the rest of us?

Spoiler alert: Kalanithi expires the way most people do: reluctantly, after several debilitating rounds of failed treatment, in a hospital bed, monitors turned off, drugged into unconsciousness until the morphine mercifully suppresses his compromised breathing . . .

I first learned of Kalanithi and his fate in January, 2014, when he wrote an opinion piece in The New York Times under the provocative headline, “How Long Have I Got Left?”

Here’s how he described his dilemma:

“The path forward would seem obvious, if only I knew how many months or years I had left. Tell me three months, I’d just spend time with family. Tell me one year, I’d have a plan (write that book). Give me 10 years, I’d get back to treating diseases. The pedestrian truth that you live one day at a time didn’t help: What was I supposed to do with that day? My oncologist would say only: ‘I can’t tell you a time. You’ve got to find what matters most to you.’”

When Breath Becomes Air is the story of how Kalanithi learned to stop planning his future and to live in the present until he died, in March, 2015, surrounded by his family. He was 37 years old. Afterward, his wife, Lucy Kalanithi, picked up the keyboard and wrote an epilogue about her husband’s illness, the birth of their daughter Cady and her own grief as she reassembled the shattered kaleidoscopic pieces of their life together and attempted to move on by herself. All of this is exquisitely moving and you are a more stoic person than I if you can read it without splashing a few tears as you turn the pages. But that is not why I am urging you to read this book.

Kalanithi was born into a South Asian immigrant family. Many of them were doctors – his father, his mother, an uncle – but he never intended to be one himself. He knew medicine only by its deficits. He writes about “the absence of a father growing up, one who went to work before dawn and returned in the dark to a plate of reheated dinner.”

When Kalanithi was 10, his father, who was anxious about the cost of living and the price of educating his three sons at elite colleges, moved the family from an affluent suburb north of Manhattan, N.Y., to Kingman, Ariz. Kalanithi describes it as a small town “in a desert valley ringed by two mountain ranges, known primarily to the outside world as a place to get gas en route to somewhere else.”. . .

An excellent student, Paul Kalanithi, the middle son, intended to be a writer. Along the way, he became fascinated by human biology, after reading a junk novel given to him by an older girlfriend. It impressed upon him the notion that the brain was a biological organ that enabled the mind to make sense of the world and, among other things, appreciate the meaning of literature, his first love.

Consequently, he studied both literature and biology at Stanford University, before earning a graduate degree in history and philosophy of science and medicine at Cambridge and then going to medical school at Yale. He graduated cum laude with a stack of prizes, went back across the country to do a residency in neurological surgery at Stanford, published a series of heralded research papers as a postdoctoral fellow and was weighing exciting and lucrative job offers when the fickle finger of fate picked him out as its next victim. . . .

When Breath Becomes Air is Kalanithi’s first and last book. As readers, we have been deprived of the chance to add his name to the ranks of doctor writers, including Sherwin Nuland, Oliver Sacks, Abraham Verghese (who contributes an eloquent foreword) and Atul Gawande. These physicians – surprisingly all male – have the ambition, the scientific knowledge and the literary talent to invite us into their clinical world, reveal its secrets and, in so doing, enlarge our understanding and enhance our perceptions of what it is to be human.

Book Review

Past Issue

House of God: San Mateo County Physician

Reviewed by Michael Norris, MD, President

I recently re-read The House of God, a satirical novel published in 1978. It details the intern experience of a young doctor at the House of God, patterned after Boston’s Beth Israel Hospital. The author is Samuel Shem, pen name of psychiatrist Dr. Stephen Bergman. Like the novel’s protagonist, Roy Basch, the author spent his internship at the House of God.

Roy is a graduate of the Best Medical School (BMS=Harvard) and has landed a prized slot at the prestigious institution of healing. There he meets his mentor, the Fat Man, a medical resident who preaches the Rules of the House of God. There are 13 rules, including number 6: “There is no body cavity that cannot be reached with a #14 needle and a strong right arm.”

As Roy proceeds though his year, he also learns Rule number 13: “The delivery of medical care is to do as much nothing as possible,” for he discovers that many of his elderly patients actually do better without numerous tests and interventions, while his younger patients will often die despite all his efforts.

The novel depicts the sleep deprivation, the housestaff camaraderie, the relationships with the nursing staff (often sexual). There are passages that are laugh-out-loud funny, yet others portraying housestaff isolation, depression, and even suicide.

I first read this book in the late 1970s, as a general surgical resident. All of my fellow residents read it, and we would often quote the wisdom of the Fat Man. The broad satire seemed, at the time, to miss the truth by only a small amount. Bergman wrote this book in part to protest the inhumanity of the graduate medical education process of the time. As Chuck, another intern, wonders, “How can we care for patients, man, if’n nobody cares for us?”

The interns see the flaws in the system, in the delivery of medical care, and try to change it, ultimately by opting out. Protagonist Roy Basch, like Bergman, goes into psychiatry.

Many young doctors loved The House of God because it resonated with their own experiences during their internship training, but many senior physicians found it offensive. A movie was made of the novel in 1984, starring Tim Matheson as Roy Basch, but it was considered too dark and never released in theaters, though it was shown on HBO a few times, mostly as filler in non-peak hours.

Re-reading the book after 36 years made me a little nostalgic for the young me, those days and nights spent in the hospital. Yet I can now see, as an “older” doc, how it can also be accused of casting the medical profession in a negative light.

Book Review

Past Issue

Tough Medicine

In the fall of 1963, not long after Vincent T. DeVita, Jr., joined the National Cancer Institute as a clinical associate, he and his wife were invited to a co-worker’s party. At the door, one of the institute’s most brilliant researchers, Emil Freireich, presented them with overflowing Martinis. The head of the medical branch, Tom Frei, strode across the room with a lab technician flung over his shoulder, legs kicking and her skirt over her head. DeVita, shocked, tried to hide in a corner. But some time later the N.C.I.’s clinical director, Nathaniel Berlin, frantically waved him over. Freireich, six feet four and built like a lineman, had passed out in the bathtub. Berlin needed help moving him. “Together, we pulled him up, threw his arms over our shoulders, and dragged him out through the party,” DeVita writes, in his memoir, “The Death of Cancer” (Sarah Crichton Books). “Out front, Freireich’s wife, Deanie, sat behind the wheel of their car. We tossed Freireich in the backseat and slammed the door.”

Half a century ago, the N.C.I. was a very different place. It was dingy and underfunded—a fraction of its current size—and home to a raw and unruly medical staff. The orthodoxy of the time was that cancer was a death sentence: the tumor could be treated with surgery or radiation, in order to buy some time, and the patient’s inevitable decline could be eased through medicine, and that was it. At the N.C.I., however, an insurgent group led by Frei and Freireich believed that if cancer drugs were used in extremely large doses, and in multiple combinations and repeated cycles, the cancer could be beaten. “I wasn’t sure if these scientists were maniacs or geniuses,” DeVita writes. But, as he worked with Freireich on the N.C.I.’s childhood-leukemia ward—and saw the fruits of the first experiments using combination chemotherapy—he became a convert.

DeVita decided to try the same strategy on another seemingly hopeless cause, Hodgkin’s lymphoma, a cancer that begins as a solid tumor in the lymph nodes and steadily spreads throughout the body. He teamed up with a fellow-associate named Jack Moxley. Over a few beers one night, at Au Pied de Cochon in Georgetown, the two sketched out a protocol, based loosely on what Frei and Freireich were doing with leukemia. Given the ability of cancer cells to adapt and mutate in the face of threats, they figured they needed four drugs, each effective against Hodgkin’s in its own way, so that whatever cells survived one wave had a chance of being killed by the next. They also had to be careful how frequently they gave the drugs: doses needed to be high enough to wipe out the cancer cells but not so high that they killed the patient. After several months, they settled on a regimen called MOMP: three eleven-day rounds of nitrogen mustard, Oncovin (a brand of vincristine), methotrexate, and prednisone, interspersed with ten-day recovery cycles.

“The side effects were almost immediate,” DeVita writes:

The sound of vomiting could be heard along the hallway. Night after night, Moxley and I paced outside the rooms of our patients, fearful of what might happen. Over the weeks that followed, they lost weight and grew listless, and their platelet counts sank lower and lower to dangerous levels.

Then came the surprise. Twelve of the fourteen patients in the initial trial went into remission—and nine stayed there as the months passed. In most cases, the tumors disappeared entirely, something that had never before been seen in the treatment of solid tumors. In the spring of 1965, DeVita went to Philadelphia to present the results to the annual meeting of the American Association for Cancer Research. He stood up before the crowd and ran triumphantly through the data: “‘Our patients were, therefore,’ I said, savoring the dramatic conclusion, ‘in complete remission.’”

What happened? An illustrious cancer expert named David Karnofsky made a narrow point about the appropriateness of the term “complete remission.” After that, nothing: “There were a few perfunctory questions about the severity of the side effects. But that was it.” History had been made in the world of cancer treatment, and no one seemed to care.

Vince DeVita served as the head of the National Cancer Institute from 1980 to 1988. He went on to serve as the physician-in-chief of the Memorial Sloan Kettering Cancer Center, in New York, and then ran the Yale Cancer Center, in New Haven. For the past half century, he has been at the forefront of the fight against one of the world’s most feared diseases, and in “The Death of Cancer” he has written an extraordinary chronicle. DeVita’s book is nothing like Siddhartha Mukherjee’s magisterial “The Emperor of All Maladies.” Mukherjee wrote a social and scientific biography of the disease. DeVita, as befits someone who spent a career at the helm of various medical bureaucracies, has written an institutional history of the war on cancer. His interest is in how the various factions and constituencies involved in that effort work together—and his conclusions are deeply unsettling.

When his first go-round as a clinical associate at the N.C.I. was up, DeVita took a post as a resident at Yale. At what was supposed to be a world-class hospital, he discovered that the standard of care for many cancers was woefully backward. Freireich had taught DeVita to treat Pseudomonas meningitis in leukemia patients by injecting an antibiotic directly into the spinal column—even though the drug’s label warned against that method of administration. That was the only way, Freireich believed, to get the drug past the blood-brain barrier. At Yale, DeVita writes, “you just didn’t do that kind of thing. As a result, I watched leukemic patients die.” Leukemia patients also sometimes came down with lobar pneumonia. Conventional wisdom held that that ought to be treated with antibiotics. But N.C.I. researchers had figured out that the disease was actually a fungal infection, and had to be treated with a different class of drug. “When I saw this condition in patients with leukemia and pointed it out to the chief of infectious diseases at Yale, he didn’t believe me—even when the lab tests proved my point,” DeVita continues. More patients died. Leukemia patients on chemotherapy needed platelets for blood transfusions. But DeVita’s superiors at Yale insisted there was no evidence that transfusions made a difference, despite the fact that Freireich had already proved that they did. “Ergo, at Yale,” DeVita says, “I watched patients bleed to death.”

Later, when DeVita and his fellow N.C.I. researcher George Canellos wanted to test a promising combination-chemotherapy treatment for advanced breast cancer, they had to do their trial overseas, because they couldn’t win the coöperation of surgeons at either of the major American cancer centers, Memorial Sloan Kettering or M. D. Anderson. When the cancer researcher Bernard Fisher did a study showing that there was no difference in outcome between radical mastectomies and the far less invasive lumpectomies, he called DeVita in distress. He couldn’t get the study published. “Breast surgeons made their living doing radical or total mastectomies, and they did not want to hear that that was no longer necessary,” DeVita writes. “Fisher had found it difficult to get patients referred to his study, in fact, because of this resistance.” The surgeons at Memorial Sloan Kettering Cancer Center were so stubborn that they went on disfiguring their patients with radical mastectomies for years after Fisher’s data had shown the procedure to be unnecessary. “The Death of Cancer” is an angry book, in which one of the critical figures in twentieth-century oncology unloads a lifetime of frustration with the obduracy and closed-mindedness of his profession. DeVita concludes, “There are incredibly promising therapies out there. If used to their fullest potential for all patients, I believe we could cure an additional 100,000 patients a year.” He is not the first to point out the shortcomings of clinical practice, of course. What sets “The Death of Cancer” apart is what he proposes to do about it. . .

Combination chemotherapy is a delicate balancing act. Cancer drugs are typically so toxic that they can be given only in short bursts, so that patients can regain their strength. If the breaks are too long, though, the cancer comes roaring back. In the first trial, they had simply followed the schedule that Freireich used in treating leukemia. Hodgkin’s cells, however, were different. They divided more slowly—and, since cancer cells are most vulnerable when they are dividing, that suggested that the Hodgkin’s schedule needed to be a lot longer.

So MOMP became MOPP: two full doses of nitrogen mustard and vincristine on the first and the eighth days, and daily doses of procarbazine and prednisone for fourteen days, followed by two weeks of rest. Since only twenty per cent of Hodgkin’s cells would divide during the course of that cycle, the regimen would have to be repeated at least six times. A second trial was launched, and the outcome was unequivocal: the regimen had beaten the disease. . .

DeVita was told that his data must be wrong.

Baffled, he asked one of the hospital’s leading oncologists, Barney Clarkson, to explain exactly how he was administering the MOPP protocol. Clarkson answered that he and his colleagues had decided to swap the nitrogen mustard in DeVita’s formula for a drug called thiotepa. This was a compound they had developed in-house at Memorial Sloan Kettering and felt partial to. So MOPP was now TOPP. DeVita writes:

They’d also cut the dose of procarbazine in half, because it made patients nauseous. And they’d reduced the dose of vincristine drastically because of the risk of nerve damage. They’d also added, at a minimum, an extra two weeks between cycles so that patients would have fully recovered from the toxic effects of the prior dose before they got the next. They gave no thought to the fact that the tumor would have been back on its feet by then, too, apparently.

These alterations had not been tested or formally compared with DeVita’s original formula. They were simply what the oncologists at Memorial Sloan Kettering felt made more sense. After an hour, DeVita had had enough:

“Why in God’s name have you done this?” he asked.

A voice piped up from the audience. “Well, Vince, most of our patients come to us on the subway, and we don’t want them to vomit on the way home.”

Here were physicians at one of the world’s greatest cancer hospitals denying their patients a potentially life-saving treatment because their way felt better. . .

The best innovations are sometimes slow to make their way into everyday medical practice. Hence the sustained push, in recent years, toward standardizing treatments. If doctors aren’t following “best practices,” it seems logical that we should write up a script describing what those best practices are and compel them to follow it.

But here “The Death of Cancer” takes an unexpected turn. DeVita doesn’t think his experience with the stubborn physicians at Memorial Sloan Kettering or at Yale justifies greater standardization. He is wary of too many scripts and guidelines. What made the extraordinary progress against cancer at the N.C.I. during the nineteen-sixties and seventies possible, in his view, was the absence of rules. . .

Clinical progress against a disease as wily and dimly understood as cancer, DeVita argues, happens when doctors have the freedom to try unorthodox things—and he worries that we have lost sight of that fact. . .

“Over the years, we’ve gained more tools for treating cancer, but the old ability to be flexible and adapt has disappeared,” DeVita writes:

Guidelines are backwards looking. With cancer, things change too rapidly for doctors to be able to rely on yesterday’s guidelines for long. These guidelines need to be updated frequently, and they rarely are, because this takes time and money. . . . Reliance on such standards inhibits doctors from trying something new. . .

Here we have a paradox. The breakthroughs made at the N.C.I. in the nineteen-sixties and seventies were the product of a freewheeling intellectual climate. But that same freewheeling climate is what made it possible for the stubborn doctors at Memorial Sloan Kettering to concoct their non-cure. The social conditions that birthed a new idea in one place impeded the spread of that same idea in another. People who push for greater innovation in the marketplace often naïvely assume that what is good for the innovator is also, down the line, good for the diffusion of their ideas. And people worried about diffusion often position themselves as the friends of innovation, as if a system that does well at spreading good ideas necessarily makes it easier to come up with good ideas. The implication of “The Death of Cancer” is, on the contrary, that innovation and diffusion can sometimes conflict. . .

The angriest chapter of “The Death of Cancer” is devoted to the Food and Drug Administration, because DeVita believes that it has fundamentally misunderstood the trade-off between diffusion and innovation. The agency wants all new drugs to be shown to be safe and efficacious, to be as good as or better than existing therapies (or a placebo) in a randomized experiment involving the largest possible number of patients. For example, the F.D.A. might ask that patients getting an experimental treatment have better long-term survival rates than those receiving drug treatments already in use. The F.D.A. is the country’s diffusion gatekeeper: its primary goal is to make sure that good drugs get a gold star and bad drugs never make it to market.

DeVita reminds us, though, that this gatekeeping can hinder progress. A given tumor, for instance, can rarely be stopped with a single drug. Cancer is like a door with three locks, each of which requires a different key. Suppose you came up with a drug that painlessly opened the first of those three locks. That drug would be a breakthrough. But it can’t cure anything on its own. So how do you get it through a trial that requires proof of efficacy—especially if you don’t yet know what the right keys for the two remaining locks are? Since cancer comes in a dizzying variety of types and subtypes, each with its own molecular profile, we want researchers to be free to experiment with different combinations of keys. Instead, DeVita argues, the F.D.A. has spent the past two decades pushing cancer medicine in the opposite direction. He continues:

Drugs are now approved not for a specific cancer or for general use in a variety of cancers but for a specific stage of a specific cancer and specifically after and only after patients have had all current treatments, which are listed drug by drug, and the treatments have all failed. Doctors risk F.D.A. censure if they use an approved drug under any other circumstances, and patients are penalized because insurance companies won’t pay for treatments not approved by the F.D.A.

The vital insight gained by using an approved drug in a different way for a different tumor has been lost. . .

When DeVita faced the naysayers at Memorial Sloan Kettering, who worried about their Hodgkin’s patients on the subway ride home, he informed them curtly, “If you told those patients that the choice was between being cured and vomiting, or not vomiting and dying, don’t you think they might have opted to take a cab?” This is how diffusion happens in a world without a diffusion gatekeeper. But how many doctors are capable of that kind of hand-to-hand combat? Life on the innovation end of the continuum is volatile, fractious, and personal—less a genteel cocktail party, governed benignly by bureaucratic fiat, than the raucous bender where your boss passes out in a bathtub. When DeVita returned to Memorial Sloan Kettering years later, as the physician-in-chief, the hospital got better. But DeVita didn’t last, which will scarcely come as a surprise to anyone who has read his book. “The problem with Vince,” the hospital’s president reportedly said, in announcing his departure, “is that he wants to cure cancer.”

The Theater

Past Issue

Let There Be Love

Let There Be Love is an intimate and interesting family drama by Kwame Kwei-Armah, one of Britain’s most distinguished contemporary playwrights. Alfred, a cantankerous and aging West Indian immigrant living in London, has managed to alienate all those around him—including his equally headstrong lesbian daughter, with whom he rarely sees eye to eye. When an idealistic young Polish caregiver, new to the country, is assigned to look after him, he experiences a powerful reckoning with his past. Filled with the sumptuous jazz standards that pour forth from Alfred’s beloved gramophone, and featuring a tour-de-force performance from stage and screen star Carl Lumbly, Let There Be Love explores the unrelenting grip of memory and regret, and the forgiveness that can follow, as the Polish caregiver arranges for Alfred to visit his former wife, together with their daughter, who feels estranged from them both.

Alfred is feeling good, rather vigorous and on his feet, dancing with his caregiver after drinking the suicide potion, welcoming death and visualizing the life beyond. The playwright treats this as Alfred experiencing new possibilities as he collapses in his chair, having died.

This endorsement of the Hemlock Society is quite incongruous. One does not enter eternal life by killing oneself; for believers, that is a direct highway to the furnace. Alfred was quite functional, in body and brain, fully alert and without pain. A man of God would have sung praises as he ascended to the Pearly Gates to meet his Maker.

Unfortunately, Alfred is sliding down the slippery slope to a much warmer place.

Book & Cinematic Reviews

Past Issue

Extreme Medicine, by Dr. Kevin Fong

Miraculous Medical Tales by Brien A. Seeley, MD, Sonoma Medicine

My father-in-law, the late Dr. Lyle Powell Jr., taught me the axiom that “to best understand something complex, one should examine its extremes.” This examination not only clarifies the thing’s limits; it reveals what inner workings impose those limits. That revelation is precisely the result when this axiom is applied to human physiology by Dr. Kevin Fong in his engaging new book, Extreme Medicine.

Fong’s book provides the reader with a fascinating tour of the emergence of modern medicine from a brutish past in which war, disaster, epidemics and the perils of exploration all pressed pioneering physicians to try new things. Fong is a master storyteller, and his prose delivers a captivating and punchy mix that is part Discovery Channel and part Rod Serling. His writing style bespeaks British formality, impeccably correct and at times reminiscent of a dialogue in Downton Abbey. His medical training in London gives much of the book the tone of a lecture by an emeritus professor. Yet his tales have a dramatic and personalized intimacy, both from his own experiences and those of real historical figures. These stories are thrilling. They put the reader right there, as if personally confronting the acute, life-threatening medical problem and having to make the daring decisions about what should be done.

Extreme Medicine will reward both lay readers and those in the medical profession. For physicians, this book will summon again the awe and “aha” that we felt at the new insights into human physiology that were bestowed upon us during our medical training. When Fong recounts the frantic, midnight rush to respond to a code blue crash-cart experience, he instantly transports the physician reader back to those breathtaking codes that we attended as interns, where some of those insights were etched into us.

Time and again, Fong removes the reader from the immediate crisis to the comfort of a crystal-clear, academic retelling of the underlying physiology that pertains. This technique is quite effective, and it intensifies our appreciation for the importance of the miraculous cellular and molecular workings of human physiology. As such, this book will improve every physician who reads it. . .

Book & Cinematic Reviews

Past Issue

Being Mortal

Being Mortal: Medicine and What Matters in the End, by Atul Gawande, MD, Metropolitan Books, 2014.

Rick Flinders, MD, and Jessica Flinders, FNP

Sonoma Medicine | The Magazine of the Sonoma County Medical Association

Atul Gawande has done it again. With his writer’s craft, he has directed a surgeon’s precision at yet another of the great maladies of his profession. Writing chiefly for The New Yorker since 1998, he has dissected, among other topics, the systemic malady of medical errors (Complications, 2002) and the high cost of medical care (“The Cost Conundrum,” 2009). Now, in Being Mortal, he tackles our culture’s often delusional stubbornness in denying the reality of death. Prolonging, at any cost, life of often dubious and miserable quality has become the modus operandi of a largely undirected and incoherent medical system. At the heart of the discussion is our own mortality. As the contemporary American poet Mary Oliver concludes in her memorable poem “In Blackwater Woods”:

To live in this world
you must be able
to do three things:
to love what is mortal;
to hold it
against your bones knowing
your own life depends on it;
and, when the time comes to let it go,
to let it go.

Both of us (Rick and Jessica) practice in settings with considerable experience of the elderly. In addition, we are both old enough to take mortality personally: our patients’, our own, our spouses’, our parents’, even our children’s. Gawande writes: “The waning days of our lives are given over to treatments that addle our brains and sap our bodies for a sliver’s chance of benefit. They are spent in institutions—nursing homes and ICUs—where regimented and anonymous routines cut us off from all the things that matter to us in life. Our reluctance to honestly examine the experience of aging has increased the harm we inflict on people and denied them the basic comforts they most need. Lacking a coherent view of how people might live successfully to the very end, we have allowed our fates to be controlled by the imperatives of medicine, technology and strangers.”

In Being Mortal, Gawande carefully and compellingly lays out the case for a more enlightened and compassionate approach to the care of our frail elderly. Like many in our profession, he is appalled by the increasingly excessive treatment of the terminally ill. Who among us has not witnessed our own best efforts to prolong life often succeed only in prolonging dying? We’ve all heard, from more than one patient: “I ain’t afraid of death, Doc. It’s the dying that scares me.”

Gawande is acutely aware of the danger of a physician, sworn to preserve life, writing about the inevitability of decline and death. “Mortality can be a treacherous subject,” he writes, “No matter how carefully you frame it, people are going to accuse you of fostering a society prepared to sacrifice its sick and aging.”

We all remember how the legitimate debate for incorporating the benefits of hospice care into the Affordable Care Act degenerated into the specter of “death panels.” But what if the sick and the aged are already being sacrificed, Gawande asks, “as victims of our collusive refusal to accept the inexorability of our own life cycle?”

A significant portion of Being Mortal examines evidence for better approaches to dying, with better outcomes, that Gawande says are “largely ignored, waiting to be recognized.” As in all his written work, Gawande’s timing and methodology are impeccable. He blends the facts of our aging epidemiology with stories of our elders’ fear of dying and their experience of getting old and sick. In the foreground is the story of his own father, also a physician, written in a direct and delicate voice. Gawande never stoops to melodrama, yet he tells the tale we are all living through as a people and a nation. According to him, the institutional care of our elderly is not only Dickensian in tone, it occurs in settings and inside walls that would fit into the chapters of the coldest and darkest Dickens.

While Gawande’s narrative is informative on the demographics and experience of aging, he also directs us to look past the external circumstances of illness. Listen and learn from the patient. In perhaps the best chapter of Being Mortal, Gawande writes about the “hard conversations” we have with patients. One of these he has come to call the “breakpoint discussion” and urges us to ask these four questions:

• What is your life like right now?
• What are your fears?
• What are your goals?
• What trade-offs are you willing to make in order to achieve your goals?

Gawande’s father, for example, was willing to live as a quadriplegic as long as he could eat chocolate ice cream and watch TV. Closer to home, an acquaintance of ours recently said that he would accept almost any quality of life as long as he was free of severe pain, could sustain his own nutrition, and both recognize and communicate with his loved ones. The answers for each person are unique, personal and unpredictable—and they change with time and experience. While the answers are important in themselves, they also provide a framework for the conversation and process to continue. If death is the last illness we still don’t speak about, the “breakpoint discussion” is a way of opening the conversation.

Rick recently attended his 50th high school reunion. It was sweet and sobering. Sweet to see old friends, but in a sidebar on the invitation was a somber catalog of nearly 80 others who would not be attending the reunion—not because of travel distance, but because of the inconvenient truth of death. Among the old friends who did attend, the topic of conversation was not the prospect of dying, but rather the fear of going bankrupt from medical expenses trying to prevent death—as if the spiritual and existential issues surrounding our mortality weren’t enough.

Just as modern medicine medicalized childbirth a half century ago, it has now medicalized death and dying. These processes have been abducted from the cultural and social context that centuries of civilization developed for family and community.

Dr. Gayle Stephens, the recently deceased founding father of family medicine, feared becoming a patient more than dying. He asked:

“Must death continue to be a tawdry, privatized, sanitized farce played out in institutional settings like so many crucifixions? Cannot we as physicians, who collectively have contributed so much to the horror that makes everybody yearn for a quick and painless death, give some leadership in restoring death to the dignity of its communal roots, and help make it mean something again, not only for the dying but also for the living?”

This is strong language from a medical elder who earned, over a lifetime of practice and teaching, the right and credibility to be heard. At the age of 85, he had to negotiate, vigorously and contentiously, with his university doctors to be released to hospice care and go home to the care of his family.

The Declaration of Independence speaks of the “unalienable rights” of life, liberty and the pursuit of happiness. Given our current circumstances, as so compellingly depicted by Gawande, we may need to add to these the right to be allowed a natural death. ::

Dr. Flinders, who teaches hospital medicine at the Santa Rosa Family Medicine Residency, serves on the SCMA Editorial Board. Ms. Flinders is a family nurse practitioner at Northern California Medical Associates in Petaluma.

Book & Cinematic Reviews

Past Issue

Stanley Fish’s Postmodern Take On Academic Freedom

Whatever their ostensible subjects, Stanley Fish’s books tend to be about Stanley Fish. His new one, Versions of Academic Freedom, extends the conceit.

Which is not to say that the book is only a “Version of Stanley Fish.” It is also a succinct, well-informed, and often elegant essay. Fish’s great talent is compression. In this case he reduces the overgrown jungle of debate about academic freedom in America’s colleges and universities to a lucid list of five alternative positions:

1. The “It’s just a job” school

2. The “For the common good” school

3. The “Academic exceptionalism or uncommon beings” school

4. The “Academic freedom as critique” school

5. The “Academic freedom as revolution” school

These are “ideal types” in Max Weber’s phrase. Fish no sooner names them than he admits that in the real world the lines blur and people are inconsistent. Nonetheless, the five-fold typology provides both a map of the larger territory and a path to specific destinations.

It would take the better part of a review just to explain the five alternatives, so at the risk of further compressing Fish’s compressions, I will leave it at this. “It’s just a job” treats academic scholars as professional workers who, because they are hired to advance knowledge, need a certain amount of workplace latitude to do their jobs. This is the form of “academic freedom” that Fish says he upholds. His position on this is consistent with his 2008 book, Save the World on Your Own Time, which I reviewed on my own time as “Night Makes Right.”

“For the Common Good” refers to arguments that granting academic freedom to professors within their disciplines contributes to self-government by militating against facile enthusiasms that can lead to the tyranny of public opinion. “Academic exceptionalism” extols academic freedom by treating professors as people set apart from everyone else by their unusual talents and therefore deserving of privileges that are denied to ordinary people. “Academic freedom as critique” projects the freedom of the professors beyond their disciplines to the rest of the social order. “Academic freedom” in this view is almost synonymous with dissent. “Academic freedom as revolution” holds that the whole purpose of education is to advance radical reform of society.

Fish’s typology is impressive, but the moment he applies it, it breaks down. That moment comes straightaway with the case of U.C. Santa Barbara sociology professor William Robinson, who got into hot water by sending an email to his students “comparing Israelis to Nazis and asserting that Martin Luther King would have stood with the Palestinians had he been alive.” Fish proceeds by showing that the proponents of each rationale for academic freedom other than his own (“It’s just a job”) would have sided (or actually did side) with Robinson. Fish’s position, by contrast, is that Robinson erred because his explicit goal was to advance a political cause rather than to stimulate “vigorous discussion” of an academic issue. For Fish, the key question is motive. He quotes Robinson speaking to the Seventh Annual International Al-Awda Convention in 2009, explaining that he acted out of “growing horror” at the “siege of Gaza,” and he intended, plainly, to advance his own judgments to his students. In Fish’s view, this crosses the line. If the conclusion is “ordained” before the inquiry begins, the inquiry is “not academic” and does not enjoy the protection of academic freedom.

In this case, Fish has set himself up rather nicely. Every other “version” of academic freedom called to the witness stand has given the anti-Semite professor an alibi. Only Fish can provide unambiguous and principled grounds to say why Robinson abused his privileges as a professor.

But surely something is missing from Fish’s account. Might it be this? Academic freedom concerns the pursuit of the truth. There are numerous situations in which the truth is not well-established, or established views are open to reasonable objections, or there are well-argued but mutually incompatible views. A deep reason why we want academic freedom is to create a context in which reason and evidence on all sides of a contentious issue can be brought forward for thoughtful consideration and debate. Without such debate, knowledge settles into stultifying orthodoxy. Because “settled opinion” or orthodoxy is the natural state of opinion on most things, we have to take care to create some special conditions where people are encouraged to look further, question assumptions, and seek evidence that might not otherwise come into focus. Academic freedom is what we call that special condition. It is undermined when someone purloins its name not to seek the truth but to propound an opinion or enforce an orthodoxy. . .

Moral Tribes: Emotion, Reason, and the Gap Between Us and Them, by Joshua Greene

Think about your mom and dad. Remember their unconditional loving-kindness, their memorable storytelling and their exemplariness. Tenderly whisper your thanks to them for your precious morality, your compass for navigating life. Your morality runs deep, etched into your heart. It is the cornerstone of who you are and is used in decision-making every day. It evolved from the sum of your life experiences as a perfect fit for your culture, your tribe. Though it is not infallible, it provides your first, fastest and usually best assessment of what to do. Even though it was formed from sayings as simple as “because Mommy said so,” your morality is your surest tool for achieving happiness.

The above paragraph states the basic premise on which Harvard brain scientist and sociobiologist Joshua Greene builds a more expansive view of morality in his new book, Moral Tribes. Greene views morality as our human capacity for solving five basic types of social conflict: “me vs. you,” “me vs. us,” “us vs. it,” “us vs. us” and “us vs. them.” If we look at some memorable examples of these conflicts, we can better appreciate Greene’s prescription for happiness.

Me vs. you. In a 1960s television skit, Red Skelton plays a hungry hobo on a park bench (Freddie the Freeloader) who sees a cake fall out of a passerby’s grocery bag. As Freddie gets ready to cut the cake, the other hobo on the bench insists that Freddie share the cake with him, saying, “If we are to split the cake fairly, you should let me cut the cake.” Freddie shrugs and hands him the cake, which he cuts into a big piece and a little piece, keeping the big piece for himself. Freddie objects, saying, “That’s not fair. Why, if I had cut that cake I would have given you the bigger piece.” The sly hobo indignantly retorts, “Well, what are you whining for? That’s exactly what I did.” The issue here is clearly individual selfishness.

Me vs. us. After the famous mutiny on the Bounty in 1789, the mutineers set Captain Bligh and 18 loyal crew adrift in the Pacific Ocean in a 23-foot boat with meager provisions. Bligh and his crew soon faced a classic potential “tragedy of the commons” in which selfishness by some can lead to disastrous results for all. Miraculously, aside from one crewman killed by natives of Tofua, all of the Bligh loyalists cooperated well enough to survive an arduous 47-day voyage to the island of Timor in the Dutch East Indies.

Us vs. it. In the movie All Is Lost, Robert Redford plays a single-handing sailor who awakes to find that the hull of his sleek ocean-going sailboat has been fatally gashed by a large metal shipping container floating in the middle of the Indian Ocean. As the bobbing sailboat’s cabin floods and ruins his high-tech equipment and water supply, the solitary Redford realizes his desperate plight against a relentless, amoral opponent, the sea. A violent storm sinks the sailboat, leaving him in a flimsy raft. His only real hope is to raise the awareness of strangers to his plight, but crew on the containerized cargo ships in the area pass unaware of him. This allegory about the peril of disregard for the environment presents a cautionary tale about us vs. it.

Us vs. us. George Washington’s farewell address is devoted to warning about the divisive perils of factions and party politics. It is read ceremonially each year in the Senate and bears repeating here:

[Parties] serve to organize faction, to give it an artificial and extraordinary force; to put, in the place of the delegated will of the nation, the will of a party, often a small but artful and enterprising minority of the community; … [parties] are likely, in the course of time and things, to become potent engines, by which cunning, ambitious, and unprincipled men will be enabled to subvert the power of the people, and to usurp for themselves the reins of government; destroying afterwards the very engines, which have lifted them to unjust dominion.

Amazingly, Washington’s prescient words foretell such momentous events as the Civil War and Hitler’s rise to power. Even today, our staunchly partisan Congress fiddles myopically while major problems continue to burn. Greene points out that, unlike the world’s major religions, the diverse tribes that make up the American public lack a single unifying moral code, an “us.” Instead, our polarizing two-party system, recurring re-election campaigns and the Citizens United decision provide a perfect nest for amoral corporate lobbyists who fulfill the role of the “enterprising minority” described by Washington.

Us vs. them. These are the titans of conflicts, occurring usually between nations and religions. Examples abound, and include all wars, genocide and terrorist group activity. Greene gives these conflicts the most attention, reminding us that political and religious strife tragically killed 230 million people in the last century.

Of the five types of conflicts, Greene admits that the lower-stakes “me” types are usually solved by our innate moral compass, our gut feelings, our heart. But the larger “us” types, he says, urgently demand a different moral compass, one with a mindset toward utilitarian solutions–solutions that impartially seek the greatest happiness for all. This dual-process approach to morality is the main message of Moral Tribes. Greene presents extensive results from psychological thought experiments and brain imaging to support the dual-process approach as the best one available.

Greene wants us all to become moral thinkers. Like Damasio, Kahneman and Claxton before him, Greene recognizes that we need both fast and slow thinking to sift our choices from our enormous decision trees. The fast thinking is our knee-jerk parental morality. The slow, deliberate thinking is how we cooperate on more complex issues. It is the tool one uses in chess to plan moves ahead, to foresee outcomes and consequences.

Greene gives cooperation the highest respect: “From simple cells to supersocial animals like us, the story of life on Earth is the story of cooperation. Cooperation is why we’re here, and yet, at the same time, maintaining cooperation is our greatest challenge.” Morality, writes Greene, evolved because “cooperation by individuals conferred a survival advantage to their group” or tribe. The problem came when such tribes grew large and began to interact and selfishly compete. (Oddly, Greene makes no mention of Harvard’s E.O. Wilson, the pioneering authority on the evolution of social cooperation.)

Selfishness–that abomination to Aristotle, yet the engine of progress to Adam Smith and Ayn Rand–underlies each of the five types of social conflict. It is how each of us inherently perceives the world. The Golden Rule, for most of us the touchstone of morality, works because selfishness can be seen in others better than in oneself. If we had a selfishness thermometer whose spectrum ran from greedy to supremely generous, most of us could place people we know somewhere along that spectrum. Greene presents research to show that one’s place on that spectrum is closely related to which “tribe” we are from. . .

Greene enumerates 16 human factors that guide our moral impulses, ranging from empathy and love to embarrassment and righteous indignation. These factors collectively determine our sense of decency and our innate reactions to cheaters in our tribe. The human tendency toward gossip and rumor, says Greene, extends these reactions widely across a tribe or community and ensures effective accountability for everyone in the tribe.

Our sense of decency makes us care that people have rights to things that were unjustly taken from them. We do what we can to atone for past sins; but Greene would have us shorten the scorecard for past wrongs by employing another moral principle: forgiveness. Our tendencies for forgiveness, he writes, “are adaptive strategies in a world where mistakes happen.” He suggests that all legacy rights must be tempered with an overriding rule to seek the greatest future happiness.

Greene devotes a good deal of space to discussing the goodness of utilitarianism and defining its goal of happiness as being far more generic than smiles and plenty. Greene wants utilitarianism to be embraced as everyone’s common moral currency. He makes this seem reasonable and acceptable at first blush. However, philosophers like John Rawls point out that utilitarianism can violate the rights of the individual, rights that many philosophers and religions consider sacrosanct. Greene dismisses the sanctity of individual rights as an annoying impediment to negotiating best-case solutions to titanic problems. When an opponent objects that a solution violates some right(s), Greene says, that opponent is seeking to effectively end the argument by inserting an ingredient that cannot be decided by evidence. To many, this will be a key weakness in Greene’s approach. Again, the innate moral compass runs deep. . .

Greene communicates well, with a just-right mix of formality and jargon. He personalizes in places and confronts current events with his own frank views. This approach makes the book flow with understandable meaning.

He concludes with six rules for solving moral problems. The rules can be paraphrased as follows:

• For “me vs. us” disputes, use your fast, innate moral compass–what your parents taught you as right and wrong.

• Do not use “rights” in arguments or disputes. Though they feel like a trump card, rights are abstract and not amenable to reason by evidence.

• Focus on facts and evidence regarding the actual consequences of proposed policies. Include both primary and secondary consequences.

• Beware of insidious and obvious “biased fairness” in all positions taken.

• In “us vs. them” disputes, use the common moral currency of maximizing happiness by employing the common factual currency of science.

• Give. The affluent need to make sacrifices to help the less fortunate.

Greene’s comprehensive treatise to develop a scientifically supported, universally accepted set of rules for negotiating maximum happiness is heroic, if naive. Such rules, like Robert’s Rules of Order, would be a great help if widely ratified by diplomats, elected officials, school boards and church councils. It remains to be seen if these rules can win the kind of commitment necessary for resolving deep conflicts about abortion, evolution, climate change or the distribution of wealth. One hopes that Greene’s next book will map a way to win such commitment.

Book & Cinematic Reviews

Past Issue

In Search Of A New Health Care Model: The Healing Of America

The Healing of America: A Global Quest for Better, Cheaper, and Fairer Health Care

Marin Medicine – The magazine of the Marin Medical Society

TR Reid, 304 pages, Penguin (2010).

CURRENT BOOKS

By Peter Bretan, MD

The United States spends a larger proportion of GNP (17%) on health care than any other country in the world, yet it does not provide universal health care. The percentages of GNP for countries that do provide universal care include France (11%), Switzerland (10.8%), Germany (10.4%), Canada (10.1%), UK (8.4%), and Japan (8.1%). Japan spends $3,400 per capita on health care vs. $7,400 for the U.S. . . .

Why is our health care so inefficient and fragmented? More importantly, will we fix it before it collapses? In his book The Healing of America, TR Reid proposes that we cannot begin to answer any of these vital questions unless we know what kind of health care system we have. . . In an attempt to do so, Reid explores health care systems throughout the world to compare government oversight (degree of regulation and support), payer mix, cost efficiencies and many other factors that may elucidate possible solutions to our dysfunctional health care system.

Reid’s contemporary classic is a must-read for all Americans, as health care is now at the forefront of American politics. An inevitable collapse from the growing cost of care will affect everyone. As health care professionals, our patients depend on us for health care policy guidance. Unless all physicians understand basic health care structures, we cannot begin to save our health care system.

In The Healing of America, Reid—a well-known reporter, lecturer and documentary filmmaker—describes four distinct health care systems: the German Bismarck model, the UK’s Beveridge model, Canada’s national health insurance model, and the out-of-pocket model.

The Bismarck model was started by Otto von Bismarck, the first chancellor of Germany, in 1883. Today health insurance in Germany is provided through employers for 82 million citizens and millions of guest workers, both legal and illegal. There are 180 insurance companies, whose income is supplemented by a 15% income tax specifically for health care. Medical education is free. Medical malpractice premiums are less than $1,400 per year, but physician specialists make less than half as much as their American counterparts. Costs are rising and sustainability is questionable. The Japanese and Taiwanese health care systems are similar but less costly because they have more rigid cost controls via a single national fee schedule, at the expense of physicians and hospitals. All these countries have an individual mandate, and their citizens report being happy with their health care.

The UK’s Beveridge model covers 90% of the population via a national 17.5% sales tax and is administered via their National Health Service. The model is single payer and truly socialized. Cuba has a similar system. Primary care physicians are the gatekeepers for most care; pay-for-performance bonuses can double their income. Medical education is free, but certain procedures and tests are only used for high-risk patients. Many cancer drugs are just not covered.

Canada’s national health insurance model began in Saskatchewan in 1946 and spread nationally, culminating in the Medical Care Act, unofficially known as Medicare. The United States later copied this system and name for American citizens 65 and older. The man most responsible for the Canadian system was their national hero, Tommy Douglas, who waited years to get a common orthopedic procedure performed under the old system. The new system covers basic care for everyone and is equally available to both rich and poor. This egalitarian pride sustains the system, but long wait times for elective procedures are common.

In the out-of-pocket model, the rich get medical care, while the poor stay sick and/or die. Most poor countries use this model, including China, which is moving back to out-of-pocket care. In the United States, 20,000 out-of-pocket poor people die annually from easily preventable or treatable diseases.

The United States uses all four of these models. We have Medicare like the Canadians for our elderly population. We have government-run, UK-type care for our veterans and for certain diseases, such as end-stage renal disease. We have multiple insurers, but these companies are not regulated to the same extent as in countries with a Bismarck model. Finally, we have an out-of-pocket model for 23 to 40 million Americans, all of whom suffer from a lack of access to care. This uncoordinated and fragmented health care system has led to inefficient, overlapping care for those with health care coverage, and to no care for millions without any coverage.

The United States is the only developed country in the world with for-profit health care insurers. In single-payer countries, such as Canada and the UK, these companies simply do not exist. In multipayer countries—such as France, Germany, Japan and Taiwan—the insurance companies are all nonprofit, charitable organizations, and are all government regulated and highly efficient, without the enormous bureaucracy found in the United States. Reid makes a compelling argument that “You can’t allow a profit to be made on the basic package of health insurance.”

Reid acknowledges that for those with money to pay for basic health insurance, American medical treatment is the best on the planet. Many Americans, however, go without insurance. Prior to the Affordable Care Act, 45 million Americans (15% of the population) were without any medical insurance. After the ACA is fully implemented, an estimated 23 million Americans (8% of the population) will still be without insurance. These patients constitute a high proportion of the 700,000 Americans who go bankrupt each year from out-of-pocket costs.

Despite our country’s enormous spending on health care, our infant mortality rate is one of the highest among developed countries at 6.37 per 1,000 births, compared to 2.76 for Japan and 5.0 for the UK. The World Health Organization ranks the U.S. 37th in the world in overall health indexes. Much of the low rating is caused by lack of access to care.

Reid dispels many myths about foreign health care. He refutes untrue statements, made by former New York City mayor Rudy Giuliani and others, that European health care could not work in the U.S. because the Europeans severely ration all care. He also debunks claims that the World Health Organization data is too liberal and that countries listed as more efficient by WHO criteria are all “socialized” in their health care. Reid explains that this is simply not true, pointing out that Japan has more for-profit hospitals than the U.S. . . .

ED: We had to do a cross check to make sure this wasn’t Harry Reid. This book is outdated and the reviewer is biased. When we go to international meetings, we do not find physicians from the aforementioned countries happy with their government-regulated or socialized programs. Bretan states that Reid dispels many myths about the WHO being too liberal, but we have not found anyone who would even consider going to Colombia, as I recall, for heart surgery, even though WHO ranks it above the United States. I’m sure this reviewer wouldn’t either. Bretan does get the wait times essentially correct, but he highlights the wrong orthopedic procedure. The orthopedic wait-time case that went to the Canadian Supreme Court produced the ruling that Canadians did not have access to health care; they only had access to a waiting list. That certainly turns on its head the claim that Canadian Medicare delivers good health care. Prejudices hardly ever die. They don’t even fade away after all these years. Some of these don’t even fade away before the history books are written.

Canadian Medicare does not give timely access to health care; it only gives access to a waiting list.

My father-in-law, the late Dr. Lyle Powell Jr., taught me the axiom that “to best understand something complex, one should examine its extremes.” This examination not only clarifies the thing’s limits; it reveals what inner workings impose those limits. That revelation is precisely the result when this axiom is applied to human physiology by Dr. Kevin Fong in his engaging new book, Extreme Medicine.

Fong’s book provides the reader with a fascinating tour of the emergence of modern medicine from a brutish past in which war, disaster, epidemics and the perils of exploration all pressed pioneering physicians to try new things. Fong is a master storyteller, and his prose delivers a captivating and punchy mix that is part Discovery Channel and part Rod Serling. His writing style bespeaks British formality, impeccably correct and at times reminding one of a dialogue in Downton Abbey. His medical training in London gives much of the book the tone of a lecture by an emeritus professor. Yet his tales have a dramatic and personalized intimacy, both from his own experiences and those of real historical figures. These stories are thrilling. They put the reader right there, as if personally confronting the acute, life-threatening medical problem and having to make the daring decisions about what should be done.

Extreme Medicine will reward both lay readers and those in the medical profession. For physicians, this book will summon again the awe and “aha” that we felt at the new insights into human physiology that were bestowed upon us during our medical training. When Fong recounts the frantic, midnight rush to respond to a code blue crash-cart experience, he instantly transports the physician reader back to those breathtaking codes that we attended as interns, where some of those insights were etched into us.

Time and again, Fong removes the reader from the immediate crisis to the comfort of a crystal-clear, academic retelling of the underlying physiology that pertains. This technique is quite effective, and it intensifies our appreciation for the importance of the miraculous cellular and molecular workings of human physiology. As such, this book will improve every physician who reads it.

Fong was born and raised in the UK. He has an ideal résumé for writing this book, with degrees in both astrophysics and medicine from University College London (UCL) as well as a degree in astronautics and space engineering from Cranfield University. He worked with NASA at the Johnson Space Center, as well as serving as medical officer for deep-sea diving expeditions. He completed training as an anesthesiologist and now teaches as honorary senior lecturer in physiology at UCL.

Fong begins Extreme Medicine with a thoughtful description of what life really is, deconstructing it as an active, organized storing of potential energy as charged ions are corralled behind a cell membrane, and then the periodic release of that energy when those ions cross the membrane with purposeful effect. This struggle against entropy, this straining to postpone collapse into equilibrium, is what makes us us, he says. Such a fundamental definition of life exemplifies the depth that Fong applies to his other explanations in the book.

Fong vividly sets the scene in each of a series of cases in which the intrepid doctors of yore are forced to solve terrible medical problems by trial and error. What emerges from the ways in which pioneering doctors deal with freezing, drowning, burning, dismemberment, shock and weightlessness are discoveries that make modern medicine the miraculous thing it is today. Each case is told with dramatic detail, such as that of Tom Gleave, an RAF fighter pilot in World War II whose Hawker Hurricane, struck by machine-gun fire from a German bomber, catches fire and inflicts severe burns on him before he can bail out. The following excerpt conveys Fong’s writing talents and the intensity of his subject:

Gleave glanced down. Flames were pushing into the right side of the cockpit from below; the fuel tank buried in the root of his starboard wing was alight. He rocked the Hurricane hard and slipped it sideways in the vain hope that this would somehow quell the fire. But the flames only grew fiercer, wrapping around his feet and climbing to reach his shoulders. Plywood and fabric burst rapidly into flames around him, accelerated by fuel from the breached tanks. In a few short seconds, the center of Gleave’s cockpit had become the head of a blowtorch. The aluminum sheet in which the dials of his control panel were set began to melt. But he was far too high to ditch the aircraft; there was nothing he could do but attempt to bail out.

Fong goes on to describe Gleave’s miraculous survival and arduous recovery after extensive, pioneering plastic surgery, including breakthrough techniques in preserving graft vascularity. He includes other examples of burn therapy and reconstruction that exemplify the enormous medical progress driven by injuries in war and industrial accidents. . .

Extreme Medicine is also available as an audio book, a 6-CD set elegantly narrated by the very British Jonathan Cowley. The audio book can be enjoyed by busy physicians while driving a car.

I highly recommend this book to anyone who wants to better understand the body in which they live.

Book & Cinematic Reviews

Past Issue

WHO OWNS YOUR BODY?

In 1993, Dr. Madeleine Pelner Cosman, a health care attorney, reviewed Medicare and Medicaid litigation and legislation from their beginnings. She was startled to discover that the law most of us accepted as primarily gentle civil law had altered incrementally into brutal criminal law. Physicians are held to vague, arbitrary standards that provide accused doctors fewer rights and defenses than accused murderers, rapists and arsonists. If convicted, physicians are punished more harshly than the vilest criminals.

Cosman’s thesis for this volume implies that medical criminal law now poses a clear and present danger both to physicians and to patients. This new type of law aims to eliminate fraud in the government medical system, but lurches wildly into the personal rights of each American doctor and patient. It also collectivizes patients under “capitation” formulas (paying health care providers a certain amount per patient without regard to how many or how few services are provided) and bureaucratic decisions of “medical necessity,” placing at risk patients’ privacy, confidentiality of medical records, individual medical choices, personal liberty, and bodily integrity.

She warns that members of Congress and public health experts who favor a single standard of medical care for all, regardless of individual patient requirements, use the criminalization of the physicians and collectivization of patients to propel America to a government-controlled, single-payer national health care system.

Cosman contends that American medicine is manacled with so many regulations that she calls Americans to alarmed alert. Physicians must obey 132,720 pages of government medical directives, laws, rules and regulations, including 111,000 pages of rules specifically controlling Medicare. The law governing medicine has shifted since 1965 from civil law to criminal law. The law was temperate civil law when Medicare and Medicaid were implemented. It was not coercive. It did not meddle in physicians’ decisions. It did not intrude in medical offices or hospitals. Section 1872 of the Social Security Act was thought to be strong enough to prevent, curb and catch potential frauds and abuses. Fraud required intent. No doctor could be prosecuted for medical fraud unless he knew a particular act was wrong and he did it willingly and intentionally.

But legal intentions and social suppositions changed as medical costs rose meteorically. Incrementally, major laws have become more restrictive, oppressive and punitive. They incorporate increasingly stern reporting standards and more vindictive criminal punishments. In chapter one, Attorney Cosman lists eight such laws and discusses what is forbidden, showing that the definitions are progressively more arbitrary and, therefore, more likely to entrap the innocent. The ordinary street thug must intend his vicious crime and actually do it. A doctor need not intend to defraud and need not perform the criminal act to be liable and jailed. Medicare fraud can be accidental. Fewer and less certain criminal procedures protect the constitutional rights of physicians than those of the professional thug, arsonist, rapist or murderer.

Cosman discusses the six white-coat crime hazards and the dubious nature of Medicare fraud. The Office of the Inspector General had previously estimated Medicaid fraud at 2 percent. The Health Care Financing Administration had estimated medical crime even lower, at 0.44 percent. An Assistant Inspector General admitted that he got a call from the Health and Human Services Secretary’s office saying that the Secretary would be giving a speech in nine days and wanted an estimate of waste, fraud and abuse. A number of people sent in figures because they had to. He guessed the Secretary’s people just added up all the guesses and came up with an illusory 10 percent fraud figure. So the 10 percent is only a political proportion, not a statistical certainty. Some of the estimates were 1 percent, 0.1 percent and 0.01 percent. But it was an effective political statistic with no empirical foundation. It was so effective that 72 percent of retirees in an opinion poll of the American Association of Retired Persons (AARP) believed that if medical fraud were eliminated, Medicare would not go broke.

Cosman proceeds to give us a well-referenced report on a large number of actual instances of what passed as Medicare fraud, such as inadequate or improper recording of information or codes, or providing uncovered services even when they were appropriate and helped patients. What Medicare considers lack of necessity often means only that Medicare does not want to pay for that medical or surgical procedure. Under current medical law, fraud can be unintentional, trivial, and immaterial; it can even help rather than harm the patient, so long as the alleged crime hurts the medical program by charging for the patient’s care.

Medical fraud holds triple terror for a medical professional. First is the ease of conviction for alleged frauds that are not intentional. Second, many honest acts can be misinterpreted as medical fraud. Third, medical fraud under certain laws such as the False Claims Act carries high, pre-established, statutory penalties “per incident” plus triple damages. Therefore, a small alleged fraud is punished harshly. A doctor who accidentally uses the wrong reimbursement code 100 times for a simple medical procedure costing $100 suddenly is worth a lot to prosecutors: triple damages plus the $10,000 penalty per incident ($300 + $10,000 = $10,300 per patient, times 100 patients) escalate the alleged fraud to $1,030,000. The more money involved in a fraud, the more severe the punishment after conviction. The more money involved in an alleged medical fraud, the more the press elevates the stature of the prosecutor. The more money involved, the more newsworthy the crime.
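Readers who want to check this arithmetic can sketch it in a few lines of Python. This is only an illustration of the review’s example: the function name is invented here, and the $10,000 per-incident figure and triple-damages multiplier are the numbers the review cites, not necessarily current statutory amounts.

```python
def claims_exposure(claim_amount: float, num_claims: int,
                    per_claim_penalty: float = 10_000.0,
                    damage_multiplier: int = 3) -> float:
    """Total exposure under the review's example: treble damages
    plus a fixed statutory penalty, assessed per claim."""
    per_claim = claim_amount * damage_multiplier + per_claim_penalty
    return per_claim * num_claims

# The review's example: a $100 procedure miscoded 100 times.
total = claims_exposure(claim_amount=100, num_claims=100)
print(f"${total:,.0f}")  # $1,030,000
```

The striking feature is that the fixed per-incident penalty, not the underlying billing amount, dominates the total: here $1,000,000 of the $1,030,000 comes from the penalties alone.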

This volume is a veritable repository of essentially all the challenges in health care we face today and why we face them. There is a chapter in which Cosman answers the frequently asked question, “Is Health Care a Basic Right?” Another important chapter, on how health insurance shackles your employer, examines the colossal national deception that employers, rather than employees, actually pay for health insurance. And in her ninth and final chapter, Cosman gives numerous examples of how the return to choice in health care has unshackled American medicine with patient-centered, affordable health care.

Who Owns Your Body?: The Shackling of American Medicine is dedicated to helping the excellent, ethical physicians and surgeons who are caught by laws that are vague, arbitrary, illogical, capricious and vicious. Also at risk for accidentally violating medical criminal law are psychologists, pharmacists, chiropractors, podiatrists, audiologists, and physical, speech and occupational therapists.

Cosman asks: if a physician does not own his or her own medical mind, who owns it? If a medical patient does not own his or her body, who owns it? Who should decide whether, or how much, money should be spent to save a patient’s life? Should a patient have the right to spend personal cash to protect the body he or she owns? These are not irrelevant questions. Medicare patients in both the United States and Canada do not have this option.

Americans merit the best medicine of a free society. It charts a path to medical abundance, medical integrity, and medical excellence. We have the courage, the intelligence and the rights to buy it. Who Owns Your Body?: Doctors and Patients Behind Bars reveals deceptions and misconceptions in modern American medicine. It should be read by every physician and health care provider. Unawareness of the issues in this volume, which offers actual examples of physicians jailed despite never having given improper care, could cost you your license, your home, and your family, and could even place you behind bars.

We have come to a crossroads in America, both in education and in medicine. Baylor sold its religious heritage to acquire federal funds. Hillsdale College, which avoided all federal funding, found that even accepting veterans on the GI Bill constituted government largess; it raised enough donations to replace any government loan, scholarship, or other government program for every student who wanted to enroll, keeping the federal government from meddling in its educational affairs. Now health care has found that federal involvement can be a lethal arrangement. It is critical that all professions extricate themselves from government control. Physicians and all health care workers will want to consider seriously opting out of all federal health care programs as soon as they can safely do so without abandoning or harming their patients. Keeping 10 percent of their Medicare patients as charity cases may be less expensive in the long run than staying in Medicare.

Book & Cinematic Reviews

Past Issue

The WSJ Bookshelf By Barton Swaim

The typical public apology purports to be an expression of regret and self-reproach, but in fact is meant to defend and justify

Public apologies might not be so nauseating if there weren’t so many of them: Corporations apologize for real and imagined misdeeds; celebrities apologize for drunken tirades; and politicians apologize for nearly everything. Their aim, you feel, isn’t to express genuine remorse or accept blame but to make the offense go away as quickly as possible. In 10 short chapters that examine scores of public apologies, Edwin Battistella’s readable and incisive “Sorry About That” explains why some apologies succeed, or at least avoid exacerbating the original offense, and why most of them fail.

The problem begins with the word itself: What is an apology, anyway? The classical meaning was a defense or explanation; the modern definition emphasizes the mental disposition of sorrow, regret and self-reproach. The typical public apology purports to be an apology in the modern sense—assigning blame to oneself, pleading for forgiveness—but in fact is meant to defend and justify. . .

Sometimes it’s not clear what is and isn’t an apology. After Hillary Clinton’s health-care initiative failed in Congress in 1994, the first lady said in an interview: “I regret very much that the efforts on health care were badly misunderstood, taken out of context and used politically against the Administration. I take full responsibility for that, and I’m very sorry for that.” Clearly she meant to blame others for the bill’s demise, but was she apologizing for letting them do it? . . .

Book & Cinematic Reviews

Past Issue

Gulp: Adventures On The Alimentary Canal

By Mary Roach

CURRENT BOOKS: Chewing the Fat

By Jeff Sugarman, MD

Gulp, the new book by science writer/humorist Mary Roach, offers an entertaining if somewhat meandering and tangential tour of the alimentary canal. From top to bottom Roach takes us to places we never knew existed, and she digs down deeply into the often odd and esoteric research of those committed to exploring where no one has wanted to go before.

Roach reminds us that taste is all about smell, especially if you are a dog. As a dog owner, I found her exposé of the Palatability Assessment Resource Center (PARC), a pet-food tasting research center, fun and interesting. She opens a strange and charming window into the life of a professional pet food taster. Who knew that if a pet food manufacturer wants to claim that dogs prefer brand X of kibble, the manufacturer must actually gather data to support the claim at a lab like PARC?

Roach’s exploration of the science of saliva is quite entertaining as well. I was startled at the strangeness of some of the scientific projects she describes. For example, how does salivary breakdown of starch enhance flavor? Subjects in one study had to rate the taste of custard samples. Sounds like a great study to volunteer for, right? What the subjects did not know was that a drop of saliva was secretly added to their meal. Roach does not go into detail as to how the saliva samples were actually obtained.

The indefatigable Roach amasses so many fun facts that weaving them together into any kind of coherent story at times proves too difficult. The transitions from fact to factoid are often forced and create a zigzagging story line that distracts from the theme she is attempting to illustrate. Her extensive footnoting, which in places seems to take up nearly as much text as the main body of the book, allows her to weave even more tangents into her story. The footnotes are often more entertaining than the stories in the main text.

The historical vignettes that provide the backdrop to our knowledge of certain digestive processes are quite compelling. In pursuing the digestive properties of the stomach, Roach explores the relationship between Alexis St. Martin, a trapper who was accidentally shot, leaving him with a fistula between his stomach and his skin, and William Beaumont, a researcher who experimented on St. Martin for several decades. The reader may forgive Roach for describing their relationship as “acid” because the interplay between these two and the resulting digestion experiments are so entertaining.

Starting with Moby Dick, Roach spends many pages on historical pseudoscientific studies of the survivability of being eaten alive. “Would a man in a whale forestomach be crushed or merely tumbled?” she asks. The whole discussion seems ridiculous, but she does include some interesting research, including the work of 18th-century French naturalist René Réaumur, who studied raptor gizzard pressures using a small tube carrying meat.

My 15-year-old son would probably enjoy the sections describing the curiosities of the rectum. Here, Roach footnotes the work on gastrointestinal gas by Drs. Terdiman and Fardy. Similarly, the section on flatulence research would provide Daniel Tosh plenty of fuel for his stand-up jokes on “Tosh.0.” Of course Roach also can’t resist the stories of objects found by emergency department physicians that are stuck or lost in the anus.

Roach is also strangely attracted to macabre events, which she describes and embellishes with pithy details. In her section on the compliance of the stomach, she describes a woman whose stomach ruptured from overeating and the man who ate 18 pounds of cow brains. There is much to be learned here, although most of it may be utterly useless.

On the back cover, the book carries the label “science,” which is true in the loosest sense only. Gulp is not a serious book, notwithstanding the extensive referencing. However, how interesting could the alimentary canal really be to the lay reader? Roach cleverly solves this problem with humor, effectively holding our interest in areas that are often off limits. She approaches the very end of the alimentary canal, for example, by relaying interviews with prison inmates about the details of smuggling cell phones in their rectums. I can imagine Roach laughing to herself as she wrote this, and for that matter every page in this book, from the origins of fire-breathing dragons to the Bristol stool chart (complete with diagram).

Book & Cinematic Reviews

Past Issue

The Bookshelf By Barton Swaim, WSJ

Book Review: ‘Poems That Make Grown Men Cry,’ edited by Anthony and Ben Holden

You don’t need a degree in creative writing to be brought to tears by verse.

Terry George, the Irish screenwriter and director, chokes up whenever he reads Seamus Heaney’s “Requiem for the Croppies.” The sonnet is an acutely condensed retelling of the 1798 Irish rebellion, a series of battles in which an army of mostly peasants—”the pockets of our greatcoats full of barley”—tried to throw off British rule. He’s right; the last three lines, recalling the rebellion’s final battle on June 21, catch in the throat:

The hillside blushed, soaked in our broken wave,

They buried us without shroud or coffin

And in August . . . the barley grew up out of our grave.

Mr. George is one of the 100 men Anthony and Ben Holden queried for their anthology of “Poems That Make Grown Men Cry.” The editors aren’t trying to make the case for poetry—perhaps a hopeless task in our time—but the book does it anyway. Poetry, so easily assumed to be merely weird self-expression since the death of rhyme and meter, isn’t that at all: It’s the arrangement of language into rhythmical structures to make it say what it can’t say otherwise. The Holdens remind us that you don’t have to be an academic or a postgraduate in creative writing to be moved by verse. Or, indeed, brought to tears by it.

The editor Harold Evans couldn’t fight them back while reading Wordsworth’s “Character of the Happy Warrior” at a colleague’s funeral. The critic Clive James sheds them for his parents at “Canoe” by Keith Douglas. The novelist Sebastian Faulks cries over Samuel Taylor Coleridge’s “Frost at Midnight” (a marvelous poem—though not, I would have thought, one likely to induce tears). Despite the slight hokeyness of the whole idea, the overall effect is to make excellent poetry seem like what it is: a wholly accessible language with its own range of expression and its own pleasures. . .

Book & Cinematic Reviews

Past Issue

The Lost Cause: The Trials Of Frank And Jesse James

A lawyer’s successful search for a missing court case sets straight the crooked tale of Frank and Jesse James

By Steven Hill

“The Ballad of Jesse James”

Jesse James we understand
Has killed many a man
He robbed the Union trains
He stole from the rich
And gave to the poor
He’d a hand and a heart and a brain.

James Muehlberger was down to the final day of his three-month sabbatical, and the county clerk’s office in Gallatin, Mo., was due to close in 5 minutes. He had spent the past week hunkered down in the dusty office, rifling through drawer after drawer of legal files. Now it was 4:25 on a Friday, and he still hadn’t found the document he was searching for. In fact, he’d been told he wouldn’t find it.

“The clerk told me I was crazy, that it didn’t exist,” says Muehlberger, c’78, l’82, “and if it had existed it had been stolen or preserved [elsewhere] because anything related to Frank or Jesse was long gone from their files.”

But Muehlberger—a former Johnson County prosecutor who now defends corporate clients as a partner at Shook, Hardy & Bacon (SHB) in Kansas City—looked anyway. As Theresa Hamilton, deputy for the Circuit Court of Daviess County, began buttoning up the office for the weekend, he raced to finish one last file drawer. There, at the very back of the drawer, he recognized the prize he sought: a dusty, barely legible folder that he is convinced no one had seen since 1870, the lawsuit file for Daniel Smoote v. Frank and Jesse James.

“Finding that was probably the most exciting thing I’ve done as a lawyer,” he says. “Part of what I do is spend months or years looking for the smoking gun document that’s going to make my case, or trying to find witnesses who don’t want to be found. Basically I used the same sort of skills I developed over 30 years of being a lawyer and applied them here.”

The find confirmed a story Muehlberger had heard around SHB’s Kansas City headquarters, that a lawyer named Henry McDougal, associated with a founding partner of the high-profile firm, had once sued the notorious Missouri outlaws.

The case and the crime that spurred it—the murder of a former Union officer and Gallatin bank clerk named John Sheets—marked the first time the James brothers gained notoriety for their crimes, and the media attention was the beginning of the enduring Wild West legend of Jesse James as a “noble robber,” a chivalrous farm boy who fought for Southern honor during the Civil War and after was driven to crime to battle corrupt pro-Union politicians.

The discovery of the lost lawsuit was one in a series that led to Muehlberger’s book, The Lost Cause: The Trials of Frank and Jesse James, a thoroughly researched and carefully argued chronicle of the decade-long quest to bring to justice one of the most feared—and revered—outlaw gangs in the West. The Kansas City Star named it one of the best 100 books of 2013, and the New York Times Book Review credited Muehlberger for creating a story that is “equal parts violent melodrama and meticulous procedural, wrapped in vivid packages with enough bloody action to engage readers enthralled by tales of good versus evil.”

Don’t be fooled by the book’s cover: The jacket features a sepia-toned photograph of a fierce, pistol-brandishing Jesse James, but the true heroes are the lawyers who took on the infamous Missouri outlaw and his brother Frank. . .

Jesse James had a wife
To mourn for his life
Three children, they were brave
But history does record
That coward Robert Ford
Has laid poor Jesse in his grave

What Doctors Feel: How Emotions Affect The Practice Of Medicine

It seems the American public is yearning to figure out what makes doctors tick. First came How Doctors Think (2008) by Dr. Jerome Groopman, followed by What Doctors Feel (2013) by Dr. Danielle Ofri. According to Amazon.com, these two books are “frequently bought together.” They represent the yin and the yang of the physician psyche, one a guide to how our minds work, and the other a road map to our innermost feelings. From a patient’s perspective, there should be some powerful insights offered here. Based on the coordinating titles, one wonders if Drs. Groopman and Ofri got together over coffee one morning to decide who should publish first. His quote graces the cover of her book, endorsing it as the place “where science and the soul meet.”

Dr. Ofri has an MD and a PhD, and she completed a residency in internal medicine. She is the mother of three children, a working physician and writer, and an associate professor at New York University School of Medicine. Her inspiration for What Doctors Feel comes from patients she has cared for as a faculty member at NYU’s internal medicine residency at Bellevue Hospital. From her writing, it is clear that she has charged herself with teaching the psychosocial side of medicine to her students and residents. Rather than treating a patient with alcohol and drug withdrawal as just another admission, she probes to discover the exact moment in the past when the patient knew he was an addict, and she gets a moving response. Her underlings treat the patient with more concern and compassion as a result.

To a primary care physician in the trenches, Dr. Ofri’s book has enormous potential and appeal. How do we feel, anyway? Every 15 to 20 minutes, we walk into the next patient’s exam room. Each one has a chief complaint, or more likely, many complaints. It is our job to elicit information, show compassion, cure, heal, fix. And in family medicine, which many of us practice and teach in Sonoma County, there is always more than one patient in the room. The accompanying child, parent or partner also has a complaint, but not an appointment. How do we feel? Rushed, overwhelmed, concerned, altruistic, and often fortunate to be doing such challenging and beautiful work. Surely this book can offer us a road map for how to get in touch with our emotions, avoid burnout, remember the psychosocial perspective in caring for patients, and carry on . . .

What Doctors Feel seems to be written more for the lay public than for a physician readership. There is a lot of detail about the process of medical school and residency training. We physicians remember those days like they were yesterday, and the memories are visceral. But residency, as intense and exhausting as it was, had a finite aspect that made it survivable. The practice of medicine over decades is something else entirely.

Here are four examples of what I hoped to get out of reading What Doctors Feel, but didn’t. First, when I see the name of my most challenging patient on my schedule for the day, or on a telephone message, I have an unpleasant internal reaction. However, I still need to provide the best care possible for this person and to put my feelings about them aside. Is this possible? Second, my clinic has just adopted a new patient portal, through which all my patients can contact me via email. What if a patient emails me with an urgent concern when I am not close to the computer? Also, do I wish to spend my leisure time, already limited, responding to emails from my patients? Third, there are work-hour restrictions for residents, but not for attending physicians. When one has been up all night working in the hospital, it is nearly impossible to show empathy to patients by the next afternoon. Fourth, our healthcare system has incentives in all the wrong places, leading to poor outcomes, poor care and poor morale among physicians. What will it take to turn this around? . . .

Book & Cinematic Reviews

Past Issue

A Tale Of Two Steves

Perhaps the first clue to how much Steve Jobs thought of himself is his choice of biographer: Walter Isaacson, the same man who wrote biographies of Albert Einstein and Benjamin Franklin. Did Jobs consider himself in the same league, innovatively and historically, as these two? Yes, he almost certainly did. Perhaps the ultimate measure of his grandiose audacity is that he was probably right.

Few individuals have altered and shaped the fabric of our daily lives more than Steve Jobs. We’re talking here of an impact on the scale of Thomas Edison and Henry Ford. Look around and you’ll see, within arm’s reach, products of his creation that touch nearly everything we do. The laptop. The cell phone. How we listen to music. How we communicate.

Jobs didn’t do this alone. But as he stood at the convergence of information technology and the creative arts, during an historic moment as transformative as the industrial revolution, he more than any other gave expression to the products of information technology that have become embedded into our daily lives.

How did he do this? And who was the man who did it? The answers, as revealed in Isaacson’s biography through hundreds of hours of interviews with the people who knew him, are predictably complicated.

Jobs had a ferocious, even obsessive, will that could drive him to the exclusion of everything else, including reality. Those who worked with him all speak familiarly of what came to be called “Steve’s reality distortion field.” Often if a fact or situation interfered with his vision, he simply wouldn’t acknowledge its existence. This became a double-edged sword. On the one hand it led him to achievements others in the industry considered “impossible,” such as wresting control of recorded music from Sony and repackaging it as iTunes. On the other, it allowed him to deny and virtually abandon his daughter born in 1978, curious behavior for a father who resented his own biological parents for putting him up for adoption at birth.

As with most genius, there comes idiosyncrasy. Jobs’ creations all bore the same signature characteristics. They were elegant, durable and extraordinarily functional. But he went beyond that. They had to be aesthetically pleasing inside the locked compartments that were never visible to consumers. Even the machinery with which they were manufactured had to be of a certain color and decor.

. . . He once defended his disdain of focus groups by saying they were irrelevant: “People don’t really know what they want until I show them.”

The mark of Jobs’ personality persists in all his creations. Every Apple product is a “closed system.” No hardware may be added. No screwdriver can take it apart. . .

To work for Jobs was a mixed blessing. At meetings he could rant, cry, berate and belittle employees publicly, sometimes all at once. His intensity was legendary. He would sometimes hold a person in an unnerving gaze, without blinking, for several minutes at a time. To employees he was often not merely rude or dismissive, but cruel. Curiously, he carried Yogananda’s Autobiography of a Yogi with him most of the time, and he reread the book once a year. It is said, by those who knew Yogananda, that he could enter a room and fill it with calm. It is also said, by those who knew Jobs, that he could enter a room and fill it with ego.

And yet, those who did work for Jobs are in almost unanimous agreement: “Without Steve we could never have risen to our best work, and would have never accomplished what we did.” For me, a baby boomer, the book is not just the story of a fascinating contemporary, but a fascinating story of our contemporary history.

My favorite parts are of the early Steve Jobs. While seniors in high school, Jobs and his wonk friend Steve Wozniak posted computer-generated banners all over campus one afternoon saying, “Remember: Tomorrow is Bring-Your-Pet-to-School Day.” The following day such a menagerie of diverse and squabbling creatures descended on the unsuspecting campus that classes were cancelled, students sent home, and Jobs and Wozniak were suspended.

The two became inseparable. As Jobs recalls, Wozniak was “the first guy I ever met who knew more electronics than me.” Their first collaboration was the Blue Box, a device that replicated the tones that routed signals on the entire AT&T network, and allowed users to make long-distance calls anywhere in the world for free. The two friends once called the Vatican from a phone booth. Wozniak pretended to be Henry Kissinger and asked to speak to the pope. What began as pranks, however, became the template for an enduring partnership. Wozniak was the gentle wizard, coming up with inventions he was happy to give away. Jobs would figure out how to make them into a user-friendly package, market them, and make millions.

Jobs attended Reed College in Portland. He dropped out during the first semester, but he remained there for the next 18 months, auditing courses in Japanese calligraphy and Zen meditation. It was there that he acquired the aesthetic style that shaped all his future creations. The multiple fonts that were part of the graphic interface for the very first Apple computers, for example, Jobs attributes directly to his studies and experiences in Portland. The fonts became standard in the industry.

I find an inherent irony in Jobs’ life and legacy. He was a Zen Buddhist, dedicated to the philosophy and practice of being fully focused in the ever-present moment of here and now. The irony is that he created a technology that virtually guarantees nearly constant distraction in the hands and ears and lives of an entire generation. The average 20-year-old checks his or her handheld device for new messages every 27 seconds. Watch a group of high school students at a table in Starbucks “engaged” in conversation, for example, and see how often their eyes and attention are diverted from the one who is speaking to the palms of their hands.

How many young people are attuned to the sounds of their immediate environs or the world around them? Compare this group to the number sealed off from the world by earphones, and carried by sound to anywhere but the here and now. I was recently blind-sided by a young cyclist who turned, not in front of my car, but into my car. As he bounced off my passenger side door and sped away, I noticed the signature white earplugs that rendered him oblivious to surrounding traffic and the rest of the world. . .

Here’s a guy who goes to Reed College, drops out, takes LSD, studies calligraphy, travels to India, practices Zen, and then returns to the Bay Area to found a company that revolutionizes the practical use of information technology and becomes the richest company in the world. Ultimate poster child of the sixties?

To Steve the genius, I say, “Kudos. You were a master at putting together ideas, art and technology in ways that invented the future. You were living proof of your own motto: The people who are crazy enough to think they can change the world are the ones who do.”

To Steve the jerk, I say, “Why’d you have to be so mean?”

Dr. Flinders, a hospitalist who teaches in the Santa Rosa Family Medicine Residency, serves on the SCMA Editorial Board.

Book & Cinematic Reviews

Past Issue

Handbook On State Health Care Reform

by John C. Goodman, PhD

Chapter 2: PRINCIPLES OF REFORM

What are the principles of health reform? One might suppose they are fairly easy to enumerate and command widespread support. As it turns out, that is not the case. Here are five recommended principles. If they are followed, the odds of successful health policy reform will be greatly enhanced.

Principle No. 1: No One Should Be Denied Basic Care because of a Lack of Ability to Pay.

A good society does not withhold basic health care from people because they lack the resources to pay for it at the time of delivery. This does not imply that people have a “right” to free care. If that were the case, everyone would have a perverse incentive to become “free riders,” wastefully overconsuming care at everyone else’s expense. Instead, most people should be expected to pay their own way most of the time. But no one should have to forgo basic care because they can’t pay for it at the time of delivery.

Principle No. 2: Health Care Should Be Provided in a Competitive Marketplace.

The economic definition of efficiency is: whatever is produced should be produced at minimum cost. Some studies lend credence to the idea that one out of every three dollars of health care spending is wasted. This implies that, in principle, the same health care could be provided for two-thirds the cost. Alternatively, there could be 50 percent more care for the same amount of money. In other markets, entrepreneurs spur efficient production by repackaging, repricing and taking advantage of new products and innovations. Principle No. 2 is not being followed whenever entrepreneurs are arbitrarily prevented from serving this function.

Principle No. 3: The Appropriate Level of Insurance Depends on the Assets to Be Protected.

If Principle No. 1 is followed, people will not need insurance to receive care. Instead, they will need insurance in order to protect their earning power and other assets from unexpected health care costs. Other forms of insurance serve as a useful guide. The purpose of life insurance is primarily to protect earning capacity against the consequences of premature death.

Accordingly, the appropriate level of insurance depends on current assets and expected income. The purpose of casualty insurance is to protect the value of, say, a home or automobile. The appropriate level of insurance depends on the anticipated risk and the replacement value of the home or car. Similarly, the purpose of health insurance should be to protect assets against unexpected medical costs.

It is a mistake to have a system in which a change of health plans is virtually mandated whenever people change employers. Instead, health insurance should be portable, traveling with the employee from job to job. It also defeats the whole purpose of insurance if premiums can rise in response to an adverse health event. Life insurers do not get to charge more to the insured who contract AIDS or cancer. Insurance exists to transfer risk from the individual to an insurance pool; the price of that transfer is the periodic premium payment. Once the insurance contract is set, increasing premiums after an adverse event occurs would be like changing the odds on a horse race after the race is underway. Accordingly, people should be able to buy health insurance that is renewable at rates that are independent of adverse health events. In most states, this is required under the laws governing individual insurance. However, such insurance is generally unavailable in the small-group market.

Notwithstanding all of the above, from time to time people may wish to change their insurance coverage. At that point they should be able to buy real insurance in a real market. It is to everyone’s advantage to face real prices for risk when making changes in insurance coverage. Otherwise, people who are undercharged will overinsure, and people who are overcharged will underinsure.

Principle No. 5: Private Insurance Should Be at Least as Attractive as Health Care Provided at Taxpayer Expense.

For many people, the implicit alternative to private insurance is to rely on charity care paid for by others. For those who qualify, Medicaid and S-CHIP programs are alternatives to private insurance. Perversely, these alternatives encourage people to forgo private coverage paid from their own pockets in order to take advantage of care provided at taxpayer expense. Rational public policy would create the opposite incentives. At a minimum, government should be neutral, giving people just as much incentive to be in the private sector as in the public sector.

Book & Cinematic Reviews

Past Issue

Movie Review: By James J. Murtagh, M.D.

Warning: spoiler alert. If you have not seen the final episode of Breaking Bad, do not read further. The episode contains a major plot twist that is discussed in this op-ed.

It is fiendishly appropriate that the modern Greek tragedy Breaking Bad ends almost exactly 2,400 years after Sophocles wrote Oedipus Rex. Breaking Bad depicts the most intense hell on earth, forcing its worst characters to kill the people and things they love best. But unlike any other modern drama, the main character finds at least partial redemption in admitting, “I did it because I wanted to,” a completely novel idea in modern times!

For five years Breaking Bad, like The Shield, The Sopranos, and The Wire, showed evil in all its seductive guises. Of these, Breaking Bad was the most shocking, even moving its audience to cheer for the central character, Walter White, the average man in this morality play: the chemistry teacher dying of lung cancer who decided that he had no way out and no choice but to turn to crime and cook meth. His almost-innocent beginning led to worse crimes, and eventually he ended up a drug kingpin. Even White’s murder of innocents, including an innocent child, evoked a morbid fascination. How much could one man get away with?

But then the twist. Tonight, in the finale, Walt made no excuses. This may be a first since the Greeks and Shakespeare: Walt actually took responsibility and admitted he has no one to blame but himself. He had been telling himself that he turned to crime to save his family. Tonight he admits, “I did it for myself. I liked cooking meth. I was good at it.”

Whoa! No one in the Inferno, the Sopranos, the Wire or the Shield admitted to having free will. Most, like Michael Corleone, justified themselves: “I had to do it for my family.”

The average Shakespearean villain, from Richard III to Macbeth, blamed the stars or the weather or the witches. Rarely did a villain admit, “I am the author of my own suffering.” It was the highest form of Shakespearean art when characters transcended and admitted what they had done: Hamlet, King Lear, Othello.

Oedipus was perhaps the first to realize his own free will brought him to his fate: “Apollo – he ordained my agonies – these, my pains on pains! But the hand that struck my eyes was mine, mine alone – no one else – I did it all myself!”

In modern times, criminals blame a series of dominoes in their lives: variations on the “Twinkie defense.” Lesser Greek characters also tried to blame crime on micro-events: “We started the Trojan War because of an argument, a woman, an apple.”

Fate reserves circles in hell for treacherous murderers even below simple murderers. Not being caught can appear infinitely crueler than being fried by 2,400 volts in an electric chair. There is a deep freeze as cold as the great lake Cocytus that Dante described at the bottom of the ninth circle of hell, reserved for the great traitors of all time.

Dostoevsky also believed that punishment was essential to the redemption of the human soul. Hell’s best-kept secret is that we create it for ourselves. Walt connived, threatened, hoodwinked and betrayed his way to a bad end. But Walt, at the last minute, realizes this, makes sincere contrition and achieves a measure of redemption.

Robert Frost wrote that torment by ice can be much more painful than torment by fire, metaphorically contrasting the torments of passion with death by hatred. Walt’s enemy’s fate is death by ice: frozen into a bland cubicle, with no hope of redemption.

In a larger sense, society also had a hand in Walt’s demise. Had Obamacare been in place, and Walt had affordable health care, Walt would have had no reason to turn to crime. Is it worse for a hungry man to steal a loaf of bread, or a dying man to ask for medicine? Perhaps worst of all is the society that creates the criminal by making him steal the bread or the drug.

Congress should take note. Can Congress claim that it has no free will? I hope our lawmakers watched the program and decided to end the abomination of gridlock and the lack of medical care.

Shakespeare granted the release of death as the greatest boon to both homicidal heroes and villains. Hamlet, Oedipus, and Walt all lived in worlds that were “rotten.” The deserts of New Mexico have much in common with Hamlet’s Denmark.

“Never to have been born may be the greatest boon of all.” Walt had few options at the end; he asks his adversaries to end his life.

Not all villains could be punished by the absence of punishment. The Iagos and Richard IIIs delight in escape. Could fitting punishment depend more on the nature of the criminal than on the crime? For some criminals, capital punishment is a consummation devoutly to be wished. For Dante, divine punishment was necessary for the operation of a divine universe.

Do we, in the modern world, including our leaders, suffer even more because the possibility of punishment often seems remote?

Sophocles heard it long ago upon the Aegean: the turbid ebb and flow of human misery.

Walt, your end puts you in the company of the greats. We will miss you.

Book & Cinematic Reviews

Past Issue

Love In The Time Of Algorithms

Online dating is awash with deviance. There are perverts, scammers, and misanthropic entrepreneurs all hell-bent on profiting from loneliness. But then there are women like Laura Brashier, a 37-year-old hairdresser from California and a survivor of cervical cancer. Her treatment left Ms Brashier unable to have sex. Rather than endure the anxiety of conventional dating, she decided to set up a dating site for people like her. 2Date4Love describes itself as the site for “people who cannot engage in sexual intercourse to meet and experience love, companionship and intimacy at its deepest level.” Since its creation in 2011, it has enrolled thousands of members who might otherwise have struggled to find romance.

Book & Cinematic Reviews

Past Issue

The American Conservatory Theatre: Dead Metaphor

Directed by Irene Lewis, this dark comedy, from one of Canada’s most acclaimed playwrights, satirizes the hypocrisies and politics of postwar living.

A soldier returns from the Middle East to find work in this audacious and hilarious dark comedy.

SAN FRANCISCO (January 15, 2013)—American Conservatory Theater (A.C.T.) welcomes the . . . world premiere of Dead Metaphor —George F. Walker’s dark comedy that satirizes the hypocrisies and politics of postwar living. When Dean returns home from the war in the Middle East and hits the job market, he discovers that his superior military skills don’t get him very far in the business world. His readjustment to non-bunker life begins by moving in with his aging parents and pregnant ex-(and soon-to-be current) wife. When he is offered a job as poster boy for a crusading politician on her own mission for “truth and justice,” his military ethics collide with the unscrupulous world of national political campaigns—and he discovers that his unique skill set may be his best asset after all.

Says A.C.T. Artistic Director Carey Perloff: “I read Dead Metaphor all in one sitting—the first scene made me laugh out loud, the second scene was a shocker, and by the third scene I was totally hooked. In the spirit of Bruce Norris’s Clybourne Park, George Walker has an incredible knack for mining dark humor out of impossible circumstances, deploying a kind of vivid satire to make us listen to our own clichés and become aware of our own hypocrisy. And I can think of no one better than Irene Lewis, who staged a brilliant production of David Mamet’s Race for us last season, to bring to life this world premiere by a major Canadian writer. A.C.T. audiences are in for an outrageous ride and a vivid glimpse at the underbelly of modern life and contemporary politics.”

Dan Rubin writes in the program about his conversation with the playwright, George F. Walker:

In 1971, George F. Walker was a 23-year-old taxi driver from Toronto’s working-class East End. While carting fares around the city, he saw a Factory Theatre Lab poster calling for play submissions by Canadian playwrights—part of founding artistic director Ken Gass’s visionary “Canadian Only” policy, one of the sparks of Toronto’s theater movement in the 1970s.

Walker had been scribbling poems and short stories since high school. Friends from the neighborhood had always said he would become a writer. Local writing groups were closed to a working-class kid, however; they were reserved for University of Toronto graduates. And Walker had no idea how to approach publishers. Theater in Toronto, on the other hand, “was just getting started,” he remembers, “and they’d take anyone.”

So Walker wrote his first play, The Prince of Naples, and submitted it. A week later, he learned that it would receive a production. On the first day of rehearsal, Walker saw director Paul Bettis’s copy of the script. On it, dramaturg John Palmer had written a note: “This guy is a genuine subversive. We’ve got to produce him.”

Walker explains where the title of Dead Metaphor came from: “There used to be a time when we didn’t send soldiers off to fight wars and then forget entirely about them, like they weren’t even part of our society. Less than one percent of both our populations has anything to do with them. So something that used to mean something—soldiers fighting for their country—is now irrelevant. It is a dead thing. We don’t even know where they are. Off they go and then they come back into our world, many of them in trouble, messed up and with nowhere to go. They come back and they only get noticed when they’re in trouble. And we’re in trouble too.”

A.C.T.’s 2012–13 season also features the world premiere music theater event Stuck Elevator (April 4–28), the Bay Area premiere of The National Theatre of Scotland’s internationally acclaimed production of Black Watch (May 9–June 9), and a new production of Tom Stoppard’s ravishing masterwork Arcadia (May 16–June 9).