In every rich country, it seems, people expect too much of their health-care systems. That is why, in their different ways, they are disappointed—and why they always will be. Citizens everywhere desire unrestricted access to state-of-the-art technologies. Increasingly, they insist on choice and control, too. Yet they are unwilling to pay what those things cost. People demand as a right the best health care money can buy, delivered in the way that best suits them, expense be damned. All that, and the price must be affordable.

Nowhere can this self-contradictory demand be satisfied. Everywhere, therefore, health care presents itself to governments as their most difficult nonsecurity challenge. In the United States, the costs are already staggering, and unless something changes, they will only get worse. Such is the sensitivity, though, that only the bravest or most reckless policy makers stride up to the issue with a genuine intention to act. Health care is a political death trap.

Consider this: the government increases its spending on Medicare by tens of billions of dollars a year (as the administration did with its recent prescription-drug reform), and the beneficiaries are up in arms about it. Yes, the execution of the scheme was botched. Still, where else could generosity on such a scale actually arouse hostility, to say nothing of its apparent failure to buy votes? When Americans are asked what they think about health care, most say they like the quality of service (the government must do nothing to compromise those high standards); they also complain that health care is far too costly (the government must act).

Wherever you look, you find no plainly superior system. Countless variants—from the mainly government-run, single-provider, single-payer model at one extreme to America's semi-private, multi-provider, multi-payer approach at the other—have been tried. None is widely popular. Canada, did somebody say? You must be joking. Rationing and gaps in coverage, necessary instruments of cost control in that system, are at the limit of what people will accept: they were an issue in the recent election, and helped get the previous government thrown out. Britain's National Health Service, once the country's pride, is today renowned for dirty hospitals that make you sicker than you were to begin with. (Private health insurance, with its higher costs and standards, is popular there; a two-tier system, one for haves and another for have-nots, quite at odds with the founding principles of the NHS, is firmly in place.)

Of course, the world's national health systems have more in common than you might think. Almost all of them are hybrids, mixing public and private. In statist Canada, some 30 percent of health spending is privately financed. In supposedly free-market America, well over 40 percent is taxpayer financed, and the privately financed part is intensely regulated—hardly a case of "leaving it to market forces." But the fact remains: many blueprints have been tried; all have drawbacks, and all leave users complaining about standards or costs, or both.

Here is a basic principle: if costs are to be better controlled, some medical services must be either forgone or denied. The key question: Who decides? Top-down rationing—as in Britain's health service—is one approach. Consumer-driven health care—where patients decide for themselves what they can afford—is another.

In the United States, where almost all privately financed health care is provided by employers through tax-sheltered insurance plans, neither of these curbing mechanisms is in place. There is no government monopoly, and, with employers paying for their insurance, most patients need not concern themselves much with the cost of their treatments. The result is that this country spends more per person on health care than any other—15 percent of the national income, compared with a rich-country average of 9 percent.

This enormous extra cost doubtless saves or significantly improves many lives. It supports a remarkable pace of innovation. And nothing is wrong in principle with a country spending proportionally more of its income on health care as it gets richer. But the system includes plenty of waste. It delivers services that cost more to provide than they would be worth to the patient if the patient were paying. And millions of Americans with low-paying jobs have no insurance: their employer does not provide it, they cannot afford to buy their own, and they are not poor enough to qualify for Medicaid. So America's health outcomes, in the aggregate, are only fair by international standards, and are not nearly as good as they should be, given what the country spends.

The administration and the economists it listens to want to control costs by mobilizing consumers. If the tax advantage for employer-provided insurance were removed or offset, and if more people bought their own policies, Americans would lean more toward plans with low premiums and plenty of cost-sharing (high deductibles and high co-payments). In this way, health insurance would be more like real insurance—a protection against serious financial risk—and less akin to a utility payment plan. Patient-consumers would have the incentive they lack at present to force costs down. The administration has proposed some reforms with this notion in mind.

This kind of approach draws two objections—one largely false (though widely advanced), the other valid. The false objection is that patients are too ignorant to be intelligent consumers of medical services. It is all too complicated, this argument goes. Necessary health expenditures would be cut as well as unnecessary ones. Some even question whether there is any such thing as an unnecessary health expenditure. It is not as though people go to the doctor for fun, they point out; people do it only when they have to. If you restrict access by directly confronting people with the costs, their health will suffer.

Well, such evidence as there is says that when patients have to pay a direct share of health-care costs, they do buy fewer medical services—but also that the effect of this on health outcomes is small. This was the principal finding of the RAND Health Insurance Experiment of the 1970s and early 1980s, still the largest health-policy study ever conducted in the United States. (Its findings are often quoted, but not always accurately.) One of the researchers, in a summary of the results on the RAND Web site, put it like this: "The additional care with free care may have had little marginal value besides relief of temporary anxiety and symptoms. In fact, free care led to more self-reported diseases and worry, especially among the initially well and rich ... There seems to be little [health] cost to increasing cost sharing within the range studied by the experiment and enormous potential savings."

Patients, given a reason, buy wisely. Is that so surprising? The truth is, buyers are at an informational disadvantage to sellers almost every time a deal—any deal—is struck. But they understand this, do their homework (if the transaction justifies the effort), and find ways of mitigating the problem, whether they are buying a car, a home, a college education, or an operation to remove a mole. Health-care professionals have a vested interest (15 percent of the gross domestic product) in insisting that health is special. In this respect it is not, or at any rate not as different as the argument implies. In all likelihood, making America's health-care market work better—more like other markets, in other words—would succeed in restraining costs, maybe quite significantly, without making health outcomes much worse. And people value the ability to make choices for themselves as a good thing in its own right.

People who can afford to, that is. The valid objection to consumer-guided health care—the issue that its advocates have to address, and that they are reluctant to—is not the "informational asymmetry" so emphasized by health-care economists but basic equity. Taxpayer-funded universal health-care systems are hugely redistributive. Many treatments, especially for chronic conditions, are simply unaffordable for people of modest means (but not poor enough, yet, to qualify for Medicaid). Low-cost private insurance is only a partial answer to this. Costs will continue to push premiums up, most likely faster than low incomes will rise. Also, many illnesses make such patients uninsurable—and insurers are getting better at identifying propensities to such illnesses before they strike.

In the United States, only the wealthy can be sure that an expensive health emergency or chronic illness will not ruin them financially. For people of ordinary means, "insurance" against that kind of catastrophe can be provided only by the state, with health care seen as part of a wider, more ambitious, and correspondingly more expensive social-welfare system. Under such a system, the need to contain health-care outlays—by much-resented government order, express or implied, rather than through individual ability to pay—would of course remain. So here is the un-American alternative: much higher public spending; much higher taxes; lower standards of health care, most likely, for the insured majority; far more generous protection, on the other hand, for the uninsured and the unlucky.

The whole thing is political poison, of course, but as costs keep rising, something will eventually have to give. The present system is on course to be unaffordable and to let too many people down, and the closer you get to the single-payer socialized alternative, the less appealing it looks. Consumer-driven health care, supplemented with generous subsidies for those with low incomes, is at least worth a try.
