French women supposedly don’t get fat, and in the minds of many Americans, they also don’t get stuck with très gros medical bills. There’s long been a dream among some American progressives to truly live as the “Europeans” do and have single-payer health care.

Republicans’ failure—so far—to repeal and replace Obamacare has breathed new life into the single-payer dream. In June, a majority of Americans told Pew that the government has a responsibility to ensure health coverage for everyone, and 33 percent said this should take the form of a single government program. A majority of Democrats in that poll supported single payer. A June poll from the Kaiser Family Foundation even found that a slim majority of all Americans favor single payer.

Liberal politicians are hearing them loud and clear. Vermont Senator Bernie Sanders reportedly plans to introduce a single-payer bill once Congress comes back from recess—even though no Senate Democrats voted for a single-payer amendment last month. Massachusetts Senator Elizabeth Warren has also said “the next step is single payer” when it comes to the Democrats’ health-care ambitions.

But should it be? It’s true that the current American health-care system suffers from serious problems. It’s too expensive, millions are still uninsured, and even insured people sometimes can’t afford to go to the doctor.

Single payer might be one way to fix that. But it could also bring with it some potentially big downsides—especially in the early years—that Americans who support the idea might not be fully aware of.

First, it’s important to define what we mean by “single payer.” It could mean total socialized medicine, in which medical care is financed by—and doctors work for—the federal government. But there are also shades of gray, like a “Medicaid for all” system, in which a single, national insurance program is available to all Americans, but care is rationed somewhat—not every drug and device is covered, and you have to jump through hoops to get experimental or pricier treatments. Or it could be “Medicare for all,” in which there’s still a single, national plan, but it’s more like an all-you-can-eat buffet. Like Medicare, this type of single-payer system would strain the federal budget, but it wouldn’t restrict the treatments people can get. Because “Medicare for all” is the term most often used in single-payer discussions, that’s the version I’ll focus on here.

The biggest problem with Medicare for all, according to Bob Laszewski, an insurance-industry analyst, is that Medicare pays doctors and hospitals substantially less than employer-based plans do.

“Now, call a hospital administrator and tell him that his reimbursement for all the employer-based insurance he gets now is going to be cut by 50 percent, and ask him what’s going to happen,” he said. “I think you can imagine—he’d go broke.” (As it happens, the American Hospital Association did not return a request for comment.)

The reason other countries have functional single-payer systems and we don’t, he says, is that they created them decades ago. Strict government controls have kept their health-care costs low ever since, while we’ve allowed generous private insurance plans to drive up our health-care costs. The United Kingdom can insure everyone relatively cheaply because British providers just don’t charge as much for drugs and procedures.

Laszewski compares dramatically cutting payment rates to rein in health-care costs to a truck going 75 miles an hour suddenly slamming on the brakes. The first 10 to 20 years after a switch to single payer, he predicts, “would be ugly as hell.” Hospitals would shut down, and waits for major procedures would extend from a few weeks to several months.

Craig Garthwaite, a professor at the Kellogg School of Management at Northwestern University, says “we would see a degradation in the customer-service side of health care.” People might have to wait longer to see a specialist, for example. He describes the luxurious-sounding hospital where his kids were born, a beautiful place with art in the lobby and private rooms. “That’s not what a single-payer hospital is going to look like,” he said. “But I think my kid could have been just as healthily born without wood paneling, probably.”

He cautions people to think about both the costs and benefits of single payer; it’s not a panacea. “There aren’t going to be free $100 bills on the sidewalk if we move to single payer,” he said.

He also predicts that, if single payer did bring drug costs down, there might be less venture-capital money chasing drug development, which might mean fewer blockbuster cures down the line. And yes, he added, “you would lose some hospitals for sure.”

Amitabh Chandra, the director of health-policy research at Harvard University, doesn’t think it would be so bad if hospitals shut down—as long as they’re little-used, underperforming hospitals. Telemedicine or ambulatory surgical centers might replace hospital stays, he suspects. And longer waits might not, from an economist’s perspective, be the worst thing either: they would be a way of rationing care, and some sort of rationing will be desperately needed. Otherwise, “Medicare for all” would be very expensive and would probably necessitate a large tax increase. (A few years ago, Vermont’s plan for single payer fell apart because it was too costly.)

If the United States decided not to go that route, Chandra says, we would be looking at something more like “Medicaid for all.” Medicaid, the health-insurance program for the poor, is a much leaner program than Medicare. Not all doctors take it, and it limits the drugs and treatments its beneficiaries can get. This could work, in Chandra’s view, but many Americans would find it stingy compared to their employers’ ultra-luxe PPO plans. “Americans would say, ‘I like my super-generous, employer-provided insurance. Why did you take it away from me?’” he said.

Indeed, that’s the real hurdle to setting up single payer, says Tim Jost, an emeritus professor at the Washington and Lee University School of Law. Roughly “80 to 85 percent of Americans are already covered by health insurance, and most of them are happy with what they’ve got.” It’s true that single payer would help extend coverage to those who are currently uninsured. But policy makers could accomplish that more simply by expanding Medicaid or providing larger subsidies to low-income Americans.

Under single payer, employers would stop covering part of their employees’ insurance premiums, as they do now, and people would likely see their taxes rise. “As people started to see it, they would get scared,” Jost said. And that’s before you factor in how negatively Republican groups would likely paint single payer in TV ads and congressional hearings. (Remember death panels?) It would just be a very hard sell to the American public.

“As someone who is very supportive of the Democratic party,” Jost said, “I hope the Democrats don’t decide to jump off the cliff of embracing single payer.”
