Forty years after Kermit the Frog first sang his blues, is it finally easy bein’ green? Hybrid vehicles like the Prius offer better gas mileage without sacrificing size or comfort, while electric vehicles promise to transport us with no gas at all. Every type of company is jockeying to get the most-stringent green certification for their plants. Windmills are no longer the stuff of quaint Dutch paintings or environmentalist fantasies; they’re sprouting on farms, on mountain ridges, even on the ocean. President Obama seems to launch a new energy initiative every week, always promising more green jobs to offset any temporary pain in the pocketbook. To hear him tell it, pretty soon we’ll all be so darn green that Kermit will blend right in.

But while these green alternatives may now appear ubiquitous, they’re not actually as common as we think. Take electricity. In 2010, one-tenth of our electricity came from renewable sources. But most of that was hydroelectric power, not wind or solar—and hydroelectric output has actually dropped by almost a third since 1997. That fall has more than offset the rise of wind power, meaning we now generate less electricity from renewables than we did in 1997.

Nuclear generation has risen, making our electricity output slightly less carbon-intensive than back then. But whether it will continue to rise in the wake of Japan’s nuclear disaster remains to be seen.

Green technology, especially in automobiles, may get a big boost from higher fossil-fuel prices. That’s the good news. The bad news is that those higher prices result from higher demand in the developing world. When we consume less oil, we may not be slowing the rate of fossil-fuel consumption; we may simply be transferring that consumption somewhere else.

Unless we somehow stop burning fossil fuels, all the carbon currently under the Earth’s surface will end up in the atmosphere in the next few hundred years. And as the physicist Robert B. Laughlin recently pointed out in The American Scholar, from the Earth’s point of view, a few hundred years is less than the blink of an eye. Even if we burn fossil fuels at a slower pace, temperatures will still rise, the oceans will still acidify, and human lives will be much altered.

Unfortunately, although we have better and better technologies that enable us to use less fossil fuel, we have no scalable way to use none, or anything close to none. Even rapidly maturing technologies like wind power require carbon-intensive backup-generation capacity, for those times when the wind doesn’t blow. And no one has yet designed a hybrid commercial airplane. Being really green, we’re finding out, is even harder than it sounds.

13. The Maniac Will Be Televised

Walter Kirn, Author of Up in the Air and Lost in the Meritocracy

In a news year dominated by manic ranters, from Charlie Sheen to Donald Trump to the Rent Is Too Damn High guy (and even, on the extreme end, Colonel Qaddafi), we are quickly learning that agitation pays when it comes to maintaining a high profile in our seething media environment. If the old advice to electronic communicators was to speak in sound bites and keep things simple, to cut through the noise by being straightforward and countering confusion with consistency, the new winning strategy is the opposite: embrace incoherence and become the noise. The cool self-control that was once considered the soul of telegenic behavior has been turned inside out, and the traits that people used to suppress when they appeared on television—the contortions and tics—are now the best way to engage an audience. Attention-deficit disorder, remember, responds to stimulants, not sedatives.

Sheen was the spilled beaker in the laboratory who proved that in an age of racing connectivity, a cokehead can be a calming presence. His branching, dopamine-flooded neural pathways mirrored those of the Internet itself, and his lips moved at the speed of a Cisco router, creating a perfect merger of form and function. Trump, though his affect is slower and less sloppy, also showed mastery of the Networked Now by speaking chiefly in paranoid innuendo. The Web, after all, is not a web of truths; its very infrastructure is gossip-shaped. The genius of Sheen and Trump and other mediapaths (Michele Bachmann belongs on this list too) is that they seem to understand, intuitively, that the electronic brain of the new media has an affinity for suspicious minds.

12. The Players Own the Game

Will Leitch, Columnist, New York magazine

When LeBron James was drafted by the Cleveland Cavaliers in June 2003, he was wearing a blindingly white suit and an awkward, bewildered smile. Talking to ESPN’s Michele Tafoya, seconds after being drafted, LeBron said, “[This] shows the hard work has finally paid off for me.” He was 18 years old.

Seven years later, LeBron was back on ESPN, announcing to the sports world that he would be leaving the Cleveland Cavaliers and taking his “talents to South Beach.” In those seven years, LeBron had won two MVP awards, four first-team all-NBA awards, and, famously, zero titles. But perhaps the most important thing he won was his freedom. LeBron joined the Miami Heat not because the move would bring him the most money, or the best chance at a championship. He went to Miami because his friends Dwyane Wade and Chris Bosh would be there. He went because it’s sunny, and the women are very attractive. He went because he wanted to.

LeBron has been derided less for the choice he made than for the way he announced it—on television, surrounded by children, hawking a flavored water. But what really scared the world of sports was that LeBron’s hubris was justified. No one cared about the Cavs or the Heat; they were interested in LeBron.

The world of sports is able to exist because it treats its labor unlike any other business on Earth. If you are an accountant, a librarian, a car salesman, whatever, when you receive an offer from anyone in the world for your services, you are able to take it. You can work anywhere, for whatever wage you’re able to grab. If this happened in sports, the result would be chaos: every team’s roster would turn over every year, and all the talent would be concentrated on two or three teams (even more than it already is). So much of a sport’s appeal is in the illusion of team history and continuity; unbridled free agency would destroy that illusion. For all the talk of supposed “rich and spoiled athletes,” few other industries can get away with labor practices that essentially amount to high-paid indentured servitude for the players.

LeBron’s example marks an evolution in athlete culture, one in which players realize their power. You’re seeing this everywhere now, from the NFL and NBA labor battles to the better understanding of concussions and athlete safety. For their part, fans are better educated than they’ve ever been (thanks to the Web) and are starting to side with the players in kerfuffles like labor disputes. Fans used to feel that owners somehow “earned” their money, while pro athletes were just fortunate winners of a genetic lottery. This is the exact opposite of the truth. (Holding on to your job is about 95 million times harder for a player than for an owner.) Sure, guys like LeBron and Carmelo Anthony are seen as mercenaries, but from a business standpoint, we understand their leverage ... and even appreciate and envy it.

Owners might not realize where this is headed, but as the players make more money than ever in outside endorsement deals, their dependence on the leagues is waning. The athletes control these businesses—it’s the players’ jerseys we’re wearing, not the owners’.

Then again, that could change. LeBron just bought a stake in Liverpool FC, one of the more popular English soccer teams. In this way, he can, at last, be like Mike: in 2010, Michael Jordan bought a majority share of the Charlotte Bobcats. The players are becoming the owners now. This is just the beginning.

11. Gay Is the New Normal

Jonathan Rauch, Contributing editor, The Atlantic

Perhaps this had to happen: the straight-rights movement is here. No, it does not call itself that. (Yet.) But opponents of same-sex marriage, and others who are unfriendly to the gay-rights movement, have adopted the posture of a victim group. They are, it seems ... an oppressed majority.

The backstory is this: Until recently, and for as long as pollsters at Gallup have thought to ask, a clear majority of Americans regarded homosexual relations as morally wrong. The entire superstructure of anti-gay sentiment and policy stood upon that foundation of opprobrium. In 2008, however, the lines crossed, with as many Americans (48 percent) telling Gallup that gay and lesbian relations are “morally acceptable” as said they are “morally wrong.” And in 2010, for the first time, an outright majority, 52 percent, called homosexuality morally acceptable, with only 43 percent condemning it. From here, the level of opprobrium is likely only to shrink.

This change is a watershed in gay-straight relations, and it brings a disorienting political role reversal. It is the condemnation of homosexuality, rather than homosexuality itself, that will be increasingly stigmatized as morally deviant. And it is the opponents of gay equality who will insist they are the oppressed group, the true victims of civil-rights violations. Indeed, they have already developed, and are vigorously marketing, a “gay bullies” narrative:

Confronted with a poll showing that a majority of Americans support gay marriage, Maggie Gallagher, a leading gay-marriage opponent, responds that the poll reflects not true public sentiment, but gay activists’ success at “intimidating and silencing.”

Describing a dispute involving a bakery that refused to make rainbow-colored cupcakes for a pro-gay student group, David E. Smith, the executive director of the Illinois Family Institute, writes: “Homosexual bullies and their heterosexual accomplices are now speciously attempting to turn this into an issue of ‘discrimination.’”

Tony Perkins, the president of the Family Research Council, says that if California’s ban on gay marriage is not upheld, “we’ll have gone, in one generation, from 1962, when the Bible was banned in public schools, to religious beliefs being banned in America.”

In a country where evangelicals outnumber self-identified gays by at least 10 to 1, and where anti-gay bullying is endemic in schools, and where same-sex couples cannot marry in 45 states, and where countless gay Americans cannot even get their foreign partners into the country, much less into a hospital room—here, we’re supposed to believe that gays are the bullies? Get used to it. This is the script of culture wars to come.


10. Bonds Are Dead (Long Live Bonds)

Clive Crook, Senior editor, The Atlantic

Investors in U.S. government bonds have had a fabulous run, and it’s over. For more than a decade before the Great Recession began, a surge of global saving increased the demand for Treasury bonds and raised their prices, delivering handsome capital gains. When the economy tanked, the government had to sell its bonds even faster to pay for its stimulus—and the price of its debt kept rising anyway. Investors saw U.S. bonds as a safe asset and demanded them all the more; the Federal Reserve started buying them too, in massive quantities, to keep interest rates low. So Treasury bonds delivered income and capital appreciation rivaling the historic return on equities—a much riskier asset.

It couldn’t last, and it hasn’t. By this spring, long-term interest rates had fallen so far that they had only one way to go: up. Bond prices, which move inversely to rates, had only one way to go: down. Whether you call this “the end of bonds,” as some market-watchers do, depends on your tolerance for hyperbole. Bonds aren’t going away. Balancing the budget will take time. Even the most zealous deficit-cutters foresee heavy borrowing for years to come. The government will have to keep selling its debt. The question is, how cheap will bonds have to be to persuade private lenders to buy—and will the Fed be willing, if necessary, to remain an investor itself?

For now, capital has no other safe haven, so private investors will think hard before shunning the market, and will likely settle for a moderate increase in yields. Certainly, the Fed would love to get out. But the U.S. is in the nice position of borrowing in its own currency—Greece and Portugal should be so lucky—so its central bank can always fund the government by “printing money” and buying the bonds itself. The more it does this, the greater the risk of high inflation later. Interest rates could soar in the meantime too, and that would depress physical investment—that is, spending on things like factories, machines, and roads. But the option of buying its own bonds is there, and having it is better than not having it. One way or another, this is not “the end of bonds.”


9. The Next War Will Be Digitized

James Fallows, National Correspondent, The Atlantic

Sometimes America has worried primarily about external threats. Sometimes, about the enemy within. The attempts to detect and suppress internal dangers generally look bad in retrospect, because they so often come at the cost of the liberties, absorbency, and flexibility that are America’s distinctive strengths. The Alien and Sedition Acts in the new republic’s first decades, the “Red scares” after both World Wars, the propaganda office Woodrow Wilson set up during the First World War, and the Japanese American internment program FDR approved in the Second—these illustrate how much more complicated it is for a democracy to deal with unseen inside threats than to confront enemies on a battlefield. Through the past decade of the “global war on terror,” the United States has faced a new version of this old challenge of protecting itself without destroying or perverting its essential nature.

That challenge is already taking on another and even more complicated form. The biggest change in human interactions in the past generation is the rising importance of “the cloud”—the electronic networks that let us witness disaster or upheaval wherever it happens, connect with friends wherever they are, get a map or see a satellite photo of virtually any point on Earth, and coordinate business, financial, scientific, and educational efforts across the globe all at once. Of course, the indispensability of these systems creates their danger. If the factories, the banks, the hospitals, and the electric and water systems must all be online to function, they are all, in principle, vulnerable to electronic attack.

With last summer’s discovery of the insidious Stuxnet virus, we know—or “know,” since neither the Israeli nor the U.S. government, nor any other, will come out and say that it developed malicious software to disable Iran’s nuclear-weapons program—that this threat is more than hypothetical. We also know that it can be posed by states, as the latest form of war, and not just by bands of scammers trying to steal your credit-card numbers or make you wire money to Nigeria. It is a potential external menace as hard to detect as an internal one, and very hard to control without limiting the fast, open connectivity that gives networks their value.

Grand-scale geostrategy has always involved locating the opponent’s choke points and vulnerabilities, where concentrated damage can produce widespread harm. That once meant harbors, railroads, ball-bearing works, airports. Now, it’s what comes through the USB connector and the Ethernet port.

8. Grandma’s in the Basement (and Junior’s in the Attic)

Hanna Rosin, Senior editor, The Atlantic

American families are supposed to disperse. We raise our children, they mature into young adults, and, if all goes correctly, they strike out on their own. That last stage is critical. Unlike the many cultures that rank filial duty above other virtues, Americans value independence. The self-supporting man (or woman) cannot be asking his mommy to do the laundry for him and going after the same Pop-Tart stash he raided at age 10. But lately it seems we might have to adjust that list of priorities. Recent census data show that the number of Americans ages 25 to 34 living with their parents has jumped to about 5.5 million—a figure that accounts for roughly 13 percent of that age range. Compounding this full-house phenomenon, the grandparent generation is “doubling up” too, as the sociological literature says. A recent Pew Center report, “The Return of the Multi-Generational Family Household,” chronicles the trend: during the first year of the Great Recession, 2.6 million more Americans found themselves living with relatives; all told, 16 percent of the population was living in multi-generational households—the largest share since the 1950s.

The spike is just the latest result of a long string of personal disasters brought on by the recession: lose your job, lose your home, find yourself bunking with Mom again and experiencing “alternating surges of shame and gratitude,” as one Slate writer recently put it. For the young people expecting to be independent, the small humiliations are endless: How do you date, invite friends over, feel like a grown-up going to a job interview, when your mom is polishing your shoes? But family members also find, thankfully, moments of small, unexpected connection—while, say, laughing over old movies they used to watch together but haven’t seen in years.

And more broadly, the situation brings one major plus: the American family may finally get a long-overdue redefinition. With all the changes—more than 40 percent of children are born to unmarried mothers, many families include gay parents or adopted children or children conceived via a variety of fertility technologies, couples are choosing to marry but not have children—it seems exclusionary and even cruel to keep defining the American family as a mom and a dad and two biological children. That’s not what our households look like anymore, so we might as well recognize that Grandpa, and some kids too old for ducky barrettes, belong in the holiday photos too.

7. Public Employee, Public Enemy

Jonathan Chait, Senior editor, The New Republic

The collapse of the financial sector led to a series of secondary collapses, including the collapse of the long-term financing of states and towns across the country. And thus public-employee unions emerged from a sleepy little corner in the demonology of American conservative thought to briefly occupy the role of villainus maximus in an ideology-laden fight over the soul of the American workforce.

At the vanguard of this redefining stood Wisconsin Governor Scott Walker. The public unions, argued Walker and a vast array of conservatives rallying to his side, constituted a fundamental menace to public finance, a menace that could be addressed only through virtual eradication. The argument ran like this: Perhaps unions have some role in the private realm, giving workers more leverage against employers seeking relentlessly to maximize their share of the firm’s proceeds, but the logic does not apply to government. Indeed, since the opposite side of the public bargaining table is occupied by disinterested public servants rather than capitalists, and since public unions can influence the outcome of the elections, the public unions are bargaining with ... themselves.

Naturally, Walker’s plan to destroy the public unions rousted liberals, largely asleep since November 2008, into a righteous indignation. They, too, had given little thought to the right of public workers to form a union. But confronted with Walker’s plan, they recoiled. Surely the state could remedy its fiscal problems without dismantling the unions, couldn’t it? After all, the unions had already agreed to fork over concessions.

The conservative damning of the public unions was not entirely wrong, but it was crucially incomplete. A powerful force is, in fact, arrayed against the demands of public unions: the desire of voters to pay low taxes. The trouble is that this desire takes the short view, demanding instant gratification. If an elected official pays his workforce more money, he has to jack up taxes. But if he can arrange to have his workforce paid more money years down the line, when he’s not the one coming up with the cash, he can enjoy the best-of-both-worlds outcome of happy employees and happy voters.

So that is what elected officials have done across the country. They’ve given their workforce reasonably modest wages, but plied them with vast pension benefits. By the time the bill comes due, the politicians who agreed to it will be retired themselves, collecting nice pensions, and perhaps being quoted in the local media opining that the new breed of elected officials doesn’t run a tight fiscal ship, the way they did back in the good old days.

6. Wall Street: Same as It Ever Was

Felix Salmon, Finance blogger, Reuters

The warning signs were there. In the decades before the financial world fell apart in 2008, what had been a great many small and diverse intermediaries merged and grew into a few global powerhouses. The new behemoths of finance were generally far too big to manage: with their trillion-dollar balance sheets and cellars full of assets that no one understood, they were a disaster waiting to happen.

These institutions were, literally, too big to fail. Lehman Brothers was one of the smallest, and its bankruptcy forced governments around the world to carry out formerly unthinkable emergency actions just to keep the global economy from completely collapsing. The cost of the bailout ran into the trillions, and unemployment rose as high as 10.1 percent; we can probably never recover fully from the crisis. The ingredients that spelled disaster were simple: bigness, interconnectedness, and profitability.

Big banks, by their nature, are much more systemically dangerous than smaller ones—just imagine the cost to the federal government if it had to cover all the deposits at, say, Bank of America. Lehman is a prime example of the dangers of interconnectedness: because every major bank did a lot of business with the firm every day, the chaos when it suddenly collapsed was impossible to contain, and rapidly spread globally in devastating and unpredictable fashion.

And great profitability, of course, is as good a proxy for risk as any. If someone tells you that he can make huge profits, year in and year out, without taking on big risks, then he’s probably Bernie Madoff.

As of now, not only have we failed to fix these three problems, but we’ve made them all worse. The big banks are bigger than ever, after having swallowed up their failed competitors. (Merrill Lynch, for example, is now a subsidiary of Bank of America; don’t believe for a minute that BofA’s senior management or board of directors has a remotely adequate understanding of the risks that Merrill is taking.)

Interconnectedness, too, has increased. With the bailout came a deluge of liquidity, courtesy of Ben Bernanke: the Fed bailout was tantamount to dropping billions of $100 bills from helicopters over Lower Manhattan. That money got spent on financial assets—that was the whole point—and as a result, financial assets started moving in conjunction with one another. If my shares are rising, your shares are almost certainly rising too. And your commodities, and your municipal bonds, and your Old Master paintings. Because of this increase in financial correlation, if and when another crisis hits, it will be uncontrollable: it’s certain to strike absolutely everything, all at once. And though some people think Congress can simply regulate the problems away, there’s no way to legislate solutions to problems that are endemic to our financial system.

Meanwhile, Wall Street pay is back at record highs—that didn’t take long—and the financial industry once again accounts for more than 30 percent of U.S. corporate profits. This doesn’t look like low-margin utility banking, where you take a small fee for matching buyers and sellers, borrowers and lenders. Beware. This is big-money gambling, back with a vengeance, and riskier than ever.

The miscalculation was costly. The uprisings that rocked a region started when a young Tunisian street vendor opted not to pay off yet another official—and instead set himself on fire at the governor’s office in Sidi Bouzid. Mohamed Bouazizi’s death redefined Mideast martyrdom, as civil disobedience instead of suicide bombs. It triggered protests that toppled dictators. And it has led the United States to abandon some long-standing allies. Many Arab regimes have since lost billions (in uncollected revenues, and in costly security deployments and other expenditures to preempt dissent), and may now rue official greed.

But freedom also comes with a price tag of great expectations. And the uprisings have done little as yet—beyond providing the right to gripe in public—to improve daily life.

Sidi Bouzid’s plaza has been renamed Martyr Mohamed Bouazizi Square for the lanky 26-year-old vendor. Tunisians gained more than 50 new political parties to choose from, but few jobs. Tourism—worth 400,000 jobs—tumbled by 40 percent. Tunisia’s credit rating dropped to near-junk status. Investments took a nosedive. In Sidi Bouzid, scores of young men lined up daily at the governor’s office—where Bouazizi had doused himself with paint thinner—to apply for nonexistent work. The new government put up a fence in fear of further unrest. In just two months, more than 10,000 Tunisians fled to the tiny Italian island of Lampedusa in search of a future. Lampedusa was so overwhelmed that islanders launched their own protests against the Tunisians.

In Egypt, street vendors at Liberation Square started offering T-shirts, trinkets, and face paints in the colors of the Egyptian flag to commemorate President Hosni Mubarak’s ouster. But they had few takers. Tourism reportedly dropped by 75 percent. When Mubarak resigned, more than 20 percent of Egyptians were living below the poverty line. They expected a measure of prosperity after he left, but instead their plight worsened. The new culture of protests sparked demonstrations for better pay and more jobs among pharmacists, railway workers, pensioners, lawyers, doctors, journalists, students, and, in an incongruous twist, among the police once tasked with putting down protests. Uncertainty created a cycle hard to break.

No country now in transition will be able to accommodate demands for either economic security or social justice anytime soon. Demographics don’t help. One hundred million people—one-third of the Arab world—are in the job-hungry age range of 15 to 29. So the early euphoria and momentum will be hard to sustain as the post-rebellion letdown engenders further public discontent. Many countries may face a second crisis, maybe even a series of crises. For all the promise of democratic demonstrations, unresolved rebellions also pose dangers.

4. Elections Work

Gwen Ifill, Moderator and managing editor, Washington Week

As the junior member of The Washington Post’s political team in 1988, I was naturally assigned to cover the candidates least likely to win. That task took me to campaign rallies headlined by two ordained ministers—the Reverends Jesse Jackson and Pat Robertson. These two had little in common. One occupied the left fringe of his party; the other the right of his. But when I arrived at their campaign events, I discovered something I did not expect: aside from skin color, their supporters were shockingly alike. The conservative evangelicals backing Robertson wanted jobs, economic reassurance, and a guarantee that their children would fare better than they had. The liberal Democrats backing Jackson mostly wanted the same things. The solutions they proposed were different, but the problems they identified were similar. In essence, each group was seeking someone who would listen to them, and speak for them—and both groups were frustrated that as yet no one seemed to be doing so.

That same frustration and desire to be given a voice accounts for the Tea Party wave that swept over last year’s midterm elections. Anyone who thinks that that wave has crashed is not paying attention. Fifty-nine percent of the Republicans responding to a CBS News/New York Times poll this past spring said they had favorable views of the Tea Party. Some people on the left, alarmed at the Tea Party’s rise, vowed to flee to Canada.

None of this troubles me. I like it when we’re reminded that our actions at the polls have meaning, and that we have to pay close attention before we cast our votes—or fail to cast our votes. (That means you, Wisconsin union members.) Neither Jesse Jackson nor Pat Robertson came close to claiming his party’s nomination in 1988, but their presence in the political conversation ensured that their disaffected supporters got heard. And, as the Congress members displaced from office by Tea Party candidates learned last fall, those who discount the power of voters to talk back do so at their own peril. Elections matter.

3. The Rich Are Different From You and Me

Chrystia Freeland, Editor, Thomson Reuters Digital

The rich are always with us, as we learned from the Bette Davis film of that name, released in the teeth of the Great Depression. The most memorable part of that movie was its title—but that terrific phrase turns out not to be entirely true. In every society, some people are richer than others, but across time and geography, the gap between the rich and the rest has varied widely.

The reality today is that the rich—especially the very, very rich—are vaulting ahead of everyone else. Between 2002 and 2007, 65 percent of all income growth in the U.S. went to the richest 1 percent of the population. That lopsided distribution means that today, half of the national income goes to the richest 10 percent. In 2007, the top 1 percent controlled 34.6 percent of the wealth—significantly more than the bottom 90 percent, who controlled just 26.9 percent.

That is a huge shift from the post-war decades, whose golden glow may have arisen largely from the era’s relative income equality. During the Second World War, and in the four decades that followed, the top 10 percent took home just a third of the national income. The last time the gap between the people on top and everyone else was as large as it is today was during the Roaring ’20s.

The rise of today’s super-rich is a global phenomenon. It is particularly marked in the United States, but it is also happening in other developed economies like the United Kingdom and Canada. Income inequality is also increasing in most of the go-go emerging-market economies, and is now as high in Communist China as it is in the U.S.

These global super-rich work and play together. They jet between the Four Seasons in Shanghai and the Four Seasons in New York to do business; descend on Davos, Switzerland, to network; and travel to St. Bart’s to vacation. Many are global nomads with a fistful of passports and several far-flung homes. They have more in common with one another than with the folks in the hinterland back home, and increasingly, they are forming a nation unto themselves.

This international plutocracy is emerging at a moment when globalization and the technology revolution are hollowing out the middle class in most Western industrialized nations. Many of today’s super-rich started out in the middle and make most of their money through work, not inheritance. Ninety-five years ago, the richest 1 percent of Americans received only 20 percent of their income from paid work; by 2004, that proportion had tripled, to 60 percent.

These meritocrats are the winners in a winner-take-all world. Among the big political questions of our age are whether they will notice that everyone else is falling behind, and whether they will decide it is in their interests to do something about that.

The death of secrecy isn’t quite upon us, but we’ve seen ample evidence this past year to suggest that it’s probably fast approaching. As they have for the past few years, journalists unearthed an array of classified government subplots that had been designed to remain hidden from public view (topics covered included Afghan financial scandals, the CIA’s drone war in Pakistan, a national-security buildup in the U.S., etc.).

Of course, then along came WikiLeaks and its torrent of revelations. From the Web site of the shadowy Julian Assange sprang everything from Iraq War logs, to profiles of Guantánamo Bay prisoners, to the infamous cables sent from the American Embassy in Tunisia confirming widespread government corruption—once-secret missives credited with helping to spark revolution, which then spread from Tunis across the Middle East. Washington, for its part, condemned, then investigated, and now may try to haul Assange and his cohorts to prison—a response that proves how little our government understands the technological and social revolution happening all around it.

That’s not to say Washington isn’t itself ambling toward transparency. In the days after the raid on Osama bin Laden’s hideout, the Obama administration began handing out dozens of details about the daring mission. Notably, these included the name of the original source of the crucial intel, the disputed methods used in getting him to talk, and the nickname of the courier who guided the CIA to bin Laden. Just about everything that was used to take bin Laden down—telephone intercepts, then Black Hawk helicopters, then a pair of bullets to the head and chest—was laid bare.

The truth is, sources and methods like these are often the only true secrets in the vast and growing sea of classified non-secrets. The White House’s motives, of course, were easy to understand: President Obama wanted to show that his risk-taking had paid off—and who can blame him? All the same, Washington did want to keep some things under wraps. Pakistani intelligence officials, displeased by the covert American raid, outed the CIA station chief in Islamabad. This incident followed tensions earlier this year, when the Pakistani government called for a complete list of CIA employees and contractors in the country, and demanded to know even more. “We need to know who is in Pakistan doing what, and that the CIA won’t go behind our back,” one official, speaking anonymously, insisted to The Washington Post. Don’t be surprised if WikiLeaks or journalists manage to provide those answers soon. Forcing the U.S. government to give up its addiction to secrecy in foreign affairs might be a good thing in the long term, although painful in the short term. After all, international relations based on secret-keeping—like relations between people who have something to hide—are inherently fragile.

1. The Rise of the Middle Class—Just Not Ours

Gillian Tett, U.S. managing editor and assistant editor, Financial Times

The past year has seen plenty of hand-wringing about the “squeezed middle.” Little wonder. Although the U.S. economy might now be rebounding, incomes for most Americans—if they are lucky enough to have a job at all—are not rising. On the contrary, since 2002, median household income has declined in real terms, as many middle-class jobs have been either destroyed by technological innovation or lost to competition from overseas. For many of the jobs remaining, employers can pay lower wages.

The middle class in America (and Europe) is suffering, but that’s only half the tale. In the past decade, income per capita in the so-called “BRICs” (Brazil, Russia, India, and China) has surged, as the middle classes in those countries have expanded at a striking clip. That is partly because jobs are shifting from the West to the emerging world (just think, for example, of all those Chinese factories and Indian call centers that have sprung up). However, education is also improving in most of these countries, along with infrastructure, as incomes rise and lifestyles improve.

To many Western workers—and politicians—this sounds scary. After all, the addition of millions of well-educated workers in places such as India, China, and Brazil means a lot more competition for Americans and Europeans. However, this cloud has a bright silver lining. Until now, politicians and economists have generally focused on the emerging markets in terms of a “supply shock,” in the sense that these countries can supply cheaper and better goods than can be produced in the West. Production, after all, is what has enabled those emerging-market economies to boom; again, think of those Chinese factories.

But now the world is on the verge of a crucial shift: precisely because the middle classes in the emerging markets are gaining clout, they are also becoming a truly formidable consumption force. The emerging markets thus no longer represent just a “supply shock”; they are creating a “demand shock” too. And that raises big questions: Who or what will meet that demand? Will those new middle-class families who are working at, say, Indian call centers or Chinese factories just buy local products? Or could American companies have an opportunity to serve them? And if so, could that opportunity eventually lead to new American jobs, as those consumers start to travel, read, download apps—and plug in to a globalized lifestyle? The full tale of the “squeezed middle” has yet to be told.
