In an unabashed endorsement of government action to alleviate the plight of the poor, this week President Obama commemorated the 50th anniversary of the War on Poverty with his own call for new policies to address the continued struggles of tens of millions of Americans.

In his official statement, Obama remarked that “In the richest nation on earth, far too many children are still born into poverty, far too few have a fair shot to escape it, and Americans of all races and backgrounds experience wages and incomes that aren’t rising… That does not mean… abandoning the War on Poverty. In fact, if we hadn’t declared ‘unconditional war on poverty in America,’ millions more Americans would be living in poverty today. Instead, it means we must redouble our efforts to make sure our economy works for every working American.”

It would seem hard to argue with such sentiments, yet some have done so. Fox News published a piece saying “despite trillions spent, poverty won.” Many others react by shaking their heads sadly, acknowledging the noble effort and concluding that it was an abject failure. The implication is clear: government spent a mint and did not end poverty, and now Obama is calling for more of the same.

This raises two crucial questions: did the first “war” really fail? And what should we do today?

As for the first, when Lyndon Johnson called for an end to poverty on January 8, 1964, he continued the tradition of the New Deal and decades of American policy designed to provide all Americans with basic standards of living—housing, education, healthcare and jobs. Americans believed that an activist government could achieve those goals, hence the trillions of dollars directed at the War on Poverty.

Those trillions have over time reduced the official “poverty rate” from 19 percent to 15 percent. Many have concluded that such a minor shift wasn’t worth the massive expense. Johnson’s legacy was tarnished by the chaos unleashed by opposition to the Vietnam War and by the morass of the 1970s, and the Reagan revolution of the 1980s was predicated in part on a conviction that the government’s attempt to alleviate the plight of the poor was not only social engineering, but badly done social engineering.

Yet poverty today is of a different order than poverty 50 or 100 years ago. During the Great Depression, millions of Americans were still without electricity or running water. By the 1960s that had changed, but many people still lacked basic healthcare, and the elderly were often at the mercy of their families. Today, there is still widespread poverty as defined by official income statistics, but the conditions of poverty are materially different, as Jordan Weissmann at The Atlantic has shown.

In part, that is because of the safety net we have since created. Many conservatives believe that we were better off in a world where private charity groups and religious organizations provided assistance, rather than government programs such as food stamps, welfare, unemployment benefits, Social Security and disability payments. But while that world did place much greater stock in self-reliance, it also left far more people at a huge disadvantage, struggling for life’s basic necessities. You could—and some do—argue that such a world produced hardier souls more able to cope with life’s vicissitudes. You could also argue—and should—that such a world was harsh and destructive to many in ways that humans for centuries have striven to ameliorate.

Today we have a massive social safety net, thanks to both the New Deal and the substantial expansion of federal and state programs beginning in the 1960s. These programs soon included housing as well. Many of these programs have produced more waste than results, and housing programs in particular did not fare well, as the scarred urban landscape of housing projects demonstrates.

But that safety net—much of which is not well-captured in the per capita income statistics that are used to assess the poverty rate—did create a set of expectations about the minimum level of necessities that all Americans deserve. That minimum—consisting of adequate shelter, food, heat and air conditioning, public education, and access to healthcare for the elderly—is a reality today.

The real critique, however, and the area we should focus on in the years ahead, is that because Americans are divided about this safety net, we accomplish two things, neither of which is optimal. We spend trillions on programs designed to provide some level of basic security, and yet these programs remain controversial. Significant opposition to these programs, and the constant threat that they could be cut, means that instead of providing security, they create insecurity. And because of that opposition, it becomes almost impossible to discuss how they could be improved, rather than simply maintained or terminated.

The result is something of a worst of all possible worlds: We maintain a vast safety net while pretending that we do not, and many of us act as if safety nets are at best ineffective and at worst immoral. The net result is that as a society, we find ourselves unable to enact needed reforms.

The answer, then, is to recognize that in securing many basic necessities, the War on Poverty succeeded, either in actually ensuring that those necessities exist, or in establishing that having them is a fundamental right. Even the most virulent opponents of social safety net programs accept that right, which would not have been the case well into the 20th century. The programs may not have altered the poverty rate much, but in part that’s because we have constantly reset and raised the bar about what we consider to be the most basic resources that every American deserves. Our “enough” today is considerably greater than it was fifty years ago.

The next solutions to the challenges of today’s poverty, therefore, are not better public housing and Medicaid. We do not need the same approach that various administrations have been advocating for the past 50 years. We need instead a consensus about what we believe is the next level of basic material rights of every citizen—beyond food, clothing and shelter. Many of those—such as self-esteem, the tools to build careers, the ability to navigate a world defined by information rather than manufacturing—are within the ability of government to provide.

State and local governments have been laboratories of new initiatives—from work and training programs, to partnerships between local businesses and community colleges, to food banks. Thankfully, such initiatives at all levels of government require less money than more traditional social services. They also demand more flexibility. Government programs defined not by ideology but by flexibility and the ability to help private and local institutions act—not by giving them grants as the War on Poverty did, but via tax incentives that help run programs—would be a welcome innovation, and the best way to continue the legacy of the War on Poverty. And with the federal government unlikely to spend more in today’s climate, it may also be the only way.
