What ever happened to the misery index?

Once upon a time, the misery index was as ubiquitous a concept in economics as the unemployment rate.

The measure, which was developed in the 1970s by Brookings economist Arthur Okun, combined the unemployment rate and inflation rate in an attempt to encompass in a single figure the economic suffering of the American people.
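The arithmetic could not be simpler: add the two rates together. A minimal sketch (the rates below are made up for illustration, not historical readings):

```python
def misery_index(unemployment_rate, inflation_rate):
    """Okun's misery index: the unemployment rate plus the inflation rate, in percent."""
    return unemployment_rate + inflation_rate

# A stagflation-style economy vs. a low-inflation one (illustrative numbers)
print(misery_index(7.5, 11.0))  # 18.5 -- high misery
print(misery_index(5.0, 1.0))   # 6.0 -- low misery
```

The appeal, and the limitation, of the index is exactly this simplicity: it weights a point of unemployment and a point of inflation equally, which is a judgment call, not an economic law.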

High unemployment and inflation were such problems in the late 1970s and 1980s that the misery index was invoked not just by economists but also by political candidates in both the 1976 and 1980 presidential elections to illustrate the incumbent’s failure to solve the pressing problems facing the economy.

Fast-forward to today, and the problems of the economy are almost entirely reversed. Earlier this month, the misery index hit a 56-year low. Yet you’ll hear no politician, let alone presidents or presidential hopefuls, reeling off this statistic. That’s because yesterday’s misery was inflation; today, it’s low and slow-growing wages.

Though the economic foe has changed, the argument over how to defeat it is remarkably similar to the arguments over how to drive the misery index lower in the early 1980s. Just as today, political parties at the time didn’t always adjust their policies to fit the economic problem at hand. Instead, they adjusted their rhetoric to argue why the policies they had already been pushing were just the thing needed to solve the economic problems of the time.

When President Ronald Reagan came into office in 1981, he disregarded warnings from Fed Chair Paul Volcker and pushed for a deficit-expanding tax cut package that would, according to standard economic theory, exacerbate inflation. This resembles the current Republican push for lower marginal tax rates, or the Democratic push for a higher minimum wage. Neither policy is likely to do much to raise the median American worker’s pay, but you wouldn’t know it from reading the political press.

In the early 1980s, all eyes were on the Federal Reserve to solve the all-consuming problem of inflation. At one point, Volcker was forced to raise the federal funds rate to 20%, and interest rates for bank and car loans were even higher. Editorials in publications like American Banker beseeched Congress to do something to help Volcker slay the inflation dragon. “The Fed cannot stop inflation on its own. The central bank will keep slugging and there will be more of the whirling 1980 gyrations so long as monetary policy is the only anti-inflation weapon in town,” one such editorial argued in January 1981.

Today, we have economists on the left, like Jared Bernstein, the former chief economist to Vice President Biden, cheerleading the Fed’s unconventional policies aimed at propping up employment and wages, but warning, “the Fed can help, and Ms. Yellen has been a smart, forceful advocate of that position, but it can’t do so alone.”

Both today and in the early 1980s, some in Congress were convinced that Federal Reserve policy was just about the most dangerous threat to the country. Today, Republicans like Rand Paul believe the Fed is stoking inflation and have submitted legislation that would subject the Fed to further oversight of its interest rate decisions.

Thirty-five years ago, it was the Democratic side of the aisle that was agitating against the Fed, when then-North Dakota Representative Byron Dorgan submitted legislation that would have enabled Congress to oust a Fed Chair before his term expired. Dorgan called the bill the “Paul Volcker Retirement Act,” saying, “It is time for us to stop entrusting a full half of the nation’s economic policy to an insulated clique of big bankers and money brokers called the Federal Reserve System,” according to a report in The American Banker.

Eventually, Volcker won his battle with inflation, despite the fact that Reagan’s policies helped more than double the deficit during his first term to 6% of GDP. The conventional wisdom today is that it was the steely resolve of Volcker, who kept rates high and the country in a recession in the face of political opposition, that killed off inflation. But that narrative has been challenged in recent years by those who argue inflation would have declined anyway due to the declining power of unions, which had thrown their weight around to fight for constant wage increases, and the end of oil supply shocks.

Whether or not you believe Volcker deserves the reputation he has today, it’s clear that his position 35 years ago was very similar to the one Yellen is in today. The Fed was dealing with a problem it had never had to solve before, using untested methods, in the face of forceful political resistance. And policymakers in Washington, as they are today, were trying to mold the situation to their own advantage, presenting their long-beloved policy prescriptions as the silver bullet for the challenges of the time.

Meanwhile, it’s entirely possible that larger forces in the economy, like the supply and demand for oil and the trajectory of the labor movement, were actually exerting a much greater influence on the economy than squabbles over interest rates and tax cuts ever could.

How much would you pay for a Nobel Prize in economics? Now’s your chance.

Most of the time, to get your hands on a Nobel Prize in economics, you need some serious brain power, decades of hard work on an esoteric subject, and the universal respect of the economics community. Now, to join the ranks of Krugman, Hayek, Merton, and Scholes, all you need is a small fortune and the desire to spend it.

That’s right, this week, the Los Angeles-based auction house Nate D. Sanders will be accepting bids online for the Nobel Prize awarded to Belarusian-American economist Simon Kuznets, the third ever recipient of the Nobel Prize in economics. (If you are thinking about buying this piece, you should probably be aware that the economics prize is kind of the red-headed stepchild of the Nobel, as it was never endowed by Alfred Nobel himself, but by the Swedish central bank, decades after Nobel’s death.)

That said, if you’re gonna shell out $150,000 on a Nobel Prize in economics—that’s where bidding starts—allow me to humbly suggest that this be the one. While every Nobel Prize-winning economist has made a great contribution to the science of economics, you’d be hard pressed to find someone who has had a bigger impact on the world beyond economics than Simon Kuznets.

Kuznets, who died in 1985, is the grandfather of the concept of gross domestic product, a statistic that gets bandied about every day in the press, by politicians, and by activists. It wouldn’t be a stretch to say that, for some, GDP is the single best measure of human progress. The ubiquity of the statistic might lead some to think of it as timeless, but, in fact, it is quite young. The U.S. government only grew interested in calculating estimates of national income—of which GDP is one measure—during the depths of the Great Depression.

The idea of computing national income goes back to at least the 17th century. Private groups, like the National Bureau of Economic Research and the National Industrial Conference Board in the U.S., had been working to compute national income statistics in the United States in the years leading up to the Depression. But there was little agreement over what should be included in these measures. One bone of contention, for instance, was whether it should include household work, like child care or the preparation of meals.

Kuznets delivered his estimates to the Senate in 1934, putting numbers to the great suffering during those dark days. He showed that between 1929 and 1932 national income had fallen by more than 50%, and the income of wage earners had fallen 60%. These statistics helped guide the U.S. government’s efforts to stimulate the economy out of the Great Depression and the “recession within the depression” that struck the country in 1937.

Over the years, Kuznets and his colleagues tinkered with their methods, developing gross national product (GNP) in 1942, which measures the value of goods and services produced by American-owned labor and capital, wherever in the world that production takes place. Unlike national income, GNP ignores the depreciation of the capital stock. This was essential to understanding America’s war potential because it measured the total production capability the nation could command. The economics world gradually came to favor gross domestic product in the 1990s, partly as a reflection of the globalization of the U.S. economy. GDP, unlike GNP, measures output produced within U.S. borders, regardless of whether the assets that produce it are owned by Americans or by foreigners.
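The two measures are linked by a simple accounting identity: GNP equals GDP plus income earned abroad by American-owned factors, minus income earned at home by foreign-owned ones. A minimal sketch with hypothetical round numbers (in billions of dollars, chosen only for illustration):

```python
# GNP = GDP + factor income from abroad - factor income paid to foreigners
# All figures below are hypothetical, for illustration only.
gdp = 17_000                 # production within U.S. borders
income_from_abroad = 800     # earnings of U.S.-owned labor and capital overseas
income_to_foreigners = 600   # earnings of foreign-owned labor and capital in the U.S.

gnp = gdp + income_from_abroad - income_to_foreigners
print(gnp)  # 17200
```

As foreign-owned production inside the U.S. and American production abroad both grew with globalization, the wedge between the two measures became harder to ignore, which helps explain the shift in emphasis toward GDP.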

Even today, there is a fierce debate over the worth of statistics like GDP. Some economists on the right side of the political spectrum question the wisdom of including government spending in our measures of national income because there is no market test for whether these expenditures do anything to improve welfare or make us richer. Those on the left criticize GDP for its failure to account for environmental degradation or the depletion of resources. If an oil company extracts $1 billion of oil from the ground and burns it, did we really become $1 billion richer? Or do we simply have less oil in the ground and more pollution?

The criticisms of GDP are numerous. In 2008, then French President Nicolas Sarkozy commissioned a group of economists, including Nobel laureate Joseph Stiglitz, to devise a new measure of national wealth that would include measures of quality of life and the sustainability of wealth, rather than just raw production figures.

Despite the criticisms, it’s unlikely governments will revise how they measure economic progress in the near future because the proposed alternatives involve a most difficult question: what constitutes well-being and wealth? Considering the U.S. government can’t even tackle an overhaul of its tax code, asking it to address these loftier questions is probably a bit much.

But the debate over GDP is a testament to just how important it is. We need these numbers to help us understand economic progress. Policymakers in the early 20th century likely failed to react as effectively to the Great Depression as they could have because they lacked such tools at their disposal. In an age when we’re drowning in big data and statistics, such a scenario is hard to imagine. So, if you’re a history buff or an econ nerd—and the kind of person who will spend six figures on a piece of memorabilia—take a look at Kuznets’s prize.

Maybe the Obama Admin is right about free trade after all

The chances of the Obama Administration scoring any kind of major policy victory this year took a big hit when it became clear that there would be a bipartisan movement to torpedo the President’s Trans-Pacific Partnership trade agreement if it didn’t include protections against currency manipulation.

At first, it looked like the opponents of TPP had the evidence on their side. After all, Americans, and their elected officials, have been focused on wage stagnation, and the case for free trade has not been helped by the fact that the proliferation of free trade agreements has coincided with a decades-long stagnation in wage growth.

Opponents of the Trans-Pacific Partnership are now zeroing in on currency manipulation, arguing for stronger protections to prevent trading partners from artificially driving down the dollar value of their currencies to give their exporters an advantage. In a recent study, economist Robert E. Scott estimated that the U.S. has lost hundreds of thousands of jobs to places like Mexico and South Korea due to a lack of protection against currency manipulation.

Those who argue against free trade agreements often point to the fact that the nation’s flagship deal, the North American Free Trade Agreement, coincided with a huge increase in America’s trade deficit with Mexico in particular. Before NAFTA was signed, the U.S. had a slight trade surplus with Mexico and a $30 billion trade deficit with Canada. Today, the trade deficit with those two nations totals more than $180 billion.

But in a new report from the centrist think tank Third Way, authors Jim Kessler and Gabe Horwitz argue that while NAFTA may not have “lived up to its promise” of boosting the U.S. economy, trade negotiators have become more adept at including higher labor and environmental standards in the many trade deals that followed NAFTA. The result, according to Kessler and Horwitz, is that the vast majority of trade deals signed into law by Congress in the 21st century have shrunk, rather than increased, the trade deficit and therefore have helped create jobs. They write:

While some 20th Century trade deals didn’t always live up to their promise, deals in the 21st Century have generally been negotiated with higher standards making them a better proxy for the likely impact of new deals like the Trans-Pacific Partnership. In this paper, we analyzed all of the U.S. trade agreements that went into effect since the turn of the century—with 17 countries in all since 2000. Because the U.S. has such a consistent and overwhelming trade surplus in services, in this paper we looked only at whether these deals improved the U.S. balance of trade in goods. Our analysis found the following: nearly all recent trade deals have improved our balance of trade in goods, and in the aggregate the gains have been substantial.

They find that trade deals with 13 of 17 countries since the year 2000 have led to a decrease in the trade deficit, and that in the aggregate, “the balance of trade for goods improved after implementation by an average of $30.2 billion per year in 2014 dollars.”

Such evidence could help convince wavering Congressmen to support TPP, a trade agreement that would cover 40% of the global economy and is a pillar in the President’s plan to bolster America’s influence in Asia. But disagreement over the effects of trade agreements, with supporters pointing to post-2000 deals and detractors pointing to deals with places like Mexico and Korea—which seem to have been bad deals for U.S. workers—shows the limitations of using economics as a policy guide.

The global economy is a very complex engine, and it’s difficult to isolate the effects of any policy, even one as big as a free trade agreement. It’s quite possible that U.S. companies would be investing more abroad, and wages would be stagnant here at home, even if the U.S. hadn’t signed such trade treaties with other countries. Economics is a science that must rely on natural experiments, with no control groups. Hard evidence is tough to come by.

The cost of winter storms: Should we believe economists?

We hear it every winter: pronouncements that a blizzard cost the U.S. economy hundreds of millions of dollars.

But in an era when so much business can be done over the phone and online, these estimates leave some scratching their heads. Could a simple snowstorm really cost the U.S. upwards of $1 billion, as some economists predicted of January’s blizzard that shut down New York City?

The short answer: yes, it can.

But it’s not that simple. First of all, the media knows that by pointing out that something “costs” $1 billion, it will get people’s attention. After all, if a reader could somehow get ahold of that money, he and his family could avoid working for generations to come. It’s a lot of money.

But for the $17 trillion U.S. economy, it’s chump change: about 0.006% of the country’s annual output, or $3.17 per capita. On a conceptual level, it would probably be more honest to write that the next storm could cost you $3.17. (In fact, for the median earner it would be much less, since income is unevenly distributed.) That at least communicates the magnitude of the effect on the economy in a way that the average person can understand. Of course, if news reporters really did put these figures into such context, the public would realize that a story about the economic impact of an average blizzard wouldn’t be worth reading at all.
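The back-of-the-envelope math is easy to check (the GDP and population figures are rounded, not official statistics):

```python
# Putting a "$1 billion storm" in context of the whole economy
storm_cost = 1e9        # estimated economic cost of the blizzard, in dollars
us_gdp = 17e12          # annual U.S. output, in dollars
population = 315.5e6    # approximate U.S. population

share_of_gdp = storm_cost / us_gdp * 100  # percent of annual output
per_capita = storm_cost / population      # dollars per person

print(f"{share_of_gdp:.4f}% of GDP")      # roughly 0.006%
print(f"${per_capita:.2f} per person")    # roughly $3.17
```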

That said, where do these hundreds of millions of lost dollars actually come from? According to Doug Handler, an economist at IHS Global Economics, one of the biggest hits the economy takes is in the form of lost wages for hourly employees. While salaried office workers get paid even if they can’t make it to work and can likely even work from home, hourly employees aren’t so fortunate. And it’s not as if all these workers can simply make up the money on another day.

The economy also takes a hit from lost sales at places like restaurants or retailers. Some of that consumer spending gets made up at a later date. If you are, for instance, planning to buy a car, a snowstorm will merely delay your purchase. But if you had planned to eat out the night of the storm, it’s not certain that you will still buy that meal another time. Another big contributor to lost economic activity is cancelled plane flights. It’s costly to cancel and reschedule flights, and in many cases, some routes will not be rescheduled at all.

There is a certain point at which a snowstorm can go from being a small economic obstacle to something a bit more pernicious. That’s what happened in the winter of 2013-2014, when a series of severe snowstorms across America helped the economy shrink by 2.1% in the first quarter of 2014. In that instance, the bad weather was persistent enough to mess with supply chains. Trucks and air freight couldn’t get to their destinations, mucking up inventory management. That led not only to fewer consumer purchases, but also to companies buying less than they otherwise would.

But even in the case of sustained periods of very bad weather like what we saw in the Northeast and Midwest last winter, estimating the economic effects involves a lot of guesswork. Economists know that output fell in the first quarter of last year in a way that isn’t consistent with broader economic conditions, and they also know that snowfall levels were much higher and temperatures were much lower than usual in some of the most productive areas of the country.

Economists also make use of qualitative and quantitative survey data from the Federal Reserve and private groups to connect the dots. But in the end, it’s impossible to know for sure how big a role weather plays in the economy. As Handler says, “it’s more an art than a science.”

Gen Y is apparently suspicious of tradition: according to a new report, millennials are over conventional advertising, homeownership, and the promise of Social Security.

Millennials value authenticity more highly than the content itself when consuming news, according to a survey of 1,300 Elite Daily readers, nearly all between the ages of 18 and 35, released by Elite Daily and Millennial Branding. As a result, they are relying on new ways to consume media, with 33 percent selecting blogs as their top media source. Fewer than 3 percent ranked television news, magazines, and books as influencing their decisions, and only 1 percent said a compelling advertisement would make them trust a brand more.

“They’re used to not trusting CEOs and politicians and just corporations in general,” says Dan Schawbel, founder of the Gen Y research and management consulting firm Millennial Branding. “That’s why they like blogs so much. Blogs, they feel, are written by an individual, there’s typically not an agenda, and it’s a personal account of their thoughts and how they’re feeling, and so they can better align with that, especially if the content is written by someone who understands them or someone who is a millennial themselves.”

The rejection of traditional media as inauthentic comes part and parcel with the trend of millennials discarding other norms accepted by previous generations. For example, the majority of millennials (59 percent) would rather rent a house than purchase one. As a result, homeownership, once the mark of adulthood, is increasingly delayed.

Why are millennials trashing tradition? Don’t just blame millennials’ delayed marriages and love for newfangled technology. Most of the individuals surveyed in the study have come of age in a rough economic climate, something that has had a huge effect on their spending habits.

Around three-quarters of millennials surveyed believe that the economy has negatively impacted their ability to save and spend money. In fact, homeownership isn’t down simply because millennials are turned off by tradition: 61 percent say that they just can’t afford to buy a home right now. Millennials aren’t expecting things to get better either, with 62 percent saying that they don’t believe they will receive Social Security at age 66.

“They’re definitely less trusting. They don’t want to have the same careers as their parents. They are brand loyal, but to get their attention is much harder,” says Schawbel. “There’s a reason why brands spend so much money trying to reach them… They’re definitely harder to get, [but] there are 80 million in America, so you can’t really avoid them.”

Will more women economists lead to a stronger economy?

Here’s a math problem: Most trained economists missed the warning signs leading to the 2008 financial crisis. If most of those economist decision-makers boasted PhDs heavy on the quantitative and theoretical, what’s missing from this equation?

You got it: Women.

It’s become cliché to joke that if Lehman Brothers had been Lehman Sisters, the U.S. economy would not have collapsed under the weight of risky credit. The assumption, of course, is that women are more cautious, less prone to high-testosterone financial plays. And there’s certainly some truth to that.

But what if we looked at the impact of the shortage of women in economics differently? What if a key blinder to warning signs pre-2008 could be found in an over-reliance on narrow-thinking PhDs deploying statistical models and assuming that the free market is “rational”?

Women tend to avoid economics in part because of an off-putting heavily quantitative PhD route. What if more women economists were at the decision-making table bringing a more human, intuitive, and policy-oriented perspective (even the ones with PhDs)?

It’s a question worth asking as women start to puncture the glass ceiling of decision-makers atop international finance: The IMF’s Christine Lagarde has been joined by Janet Yellen, who is about to celebrate her first year atop the Fed, and the recent arrivals of Nemat Shafik (deputy governor), Kristin Forbes (monetary policy committee member) and Charlotte Hogg (chief operating officer) at the Bank of England.

Already, we’ve seen women contribute to outside-the-box thinking leading to important economic reforms. For years, Goldman Sachs’ Kathy Matsui produced work demonstrating that Japan’s stagnant economy was suffering from low female participation rates in the workforce. (Young Japanese women excel in college but typically leave their jobs and stay at home when they marry.) Then, Japanese Prime Minister Shinzo Abe came back into office looking for reform ideas and borrowed those numbers to make “womenomics” a cornerstone of Abenomics.

On U.S. campuses, men still outnumber women in economics majors by nearly 3 to 1, according to research by Harvard’s Claudia Goldin. A big reason women are turned off is the emphasis on the quantitative and theoretical. That’s not because they are bad at math (Goldin’s research shows comparable math scores among economics majors). Rather, they find it intimidating because they’re insecure about their quantitative skills.

There is also what I call “the good girl syndrome.” Goldin writes that if a woman gets less than an A in an introductory class she gives up, while the guy with a C just moves on.

Just listen to two of my own Harvard students, part of my 2012 Kennedy School Institute of Politics seminar. Both are now seniors majoring in economics. “I do think that economics departments that are more quantitative heavy could be intimidating for women and play into certain social perceptions of what women are good at,” says Sharon Stovezky. “The Harvard introduction course focuses more on intuition and less on the math. I think that’s part of the reason Harvard has a high percentage of women economics concentrators.”

Megan Prasad studies economics with an emphasis on developing nations, but this Rhodes scholar finalist first had to overcome her insecurities. “I almost didn’t write a senior thesis,” she says, “but I met an adviser who encouraged me to pursue my ideas and told me I was actually good at econometric analysis. It seems silly, but I needed to hear her say that before I thought of myself as capable.” (Likewise, Stovezky had to force herself to overcome ingrained fears of computer science. “I had bought into the idea that women are not good at CS. I was actually pretty mad at myself, that I fell into this trap,” she says. “It happens to even the most smart and ambitious women sometimes.”)

Prasad was drawn to economics to better study gender disparities in South Asia. Which brings us to another observation of Harvard economist Goldin: Women get turned off because they see only a limited career path, mostly into Wall Street. “Many young women don’t seem to understand that economics is also for those …with research and policy interests in health, education, poverty, crime, obesity, the environment, terrorism or infectious disease.”

Some prominent women economists are not even, well… economists. Lagarde is a lawyer who confesses to hating math as a teenager. And Elinor Ostrom, the only woman ever to receive a Nobel in economics (2009), was a political scientist.

“The good news is that you are getting more women that are having a voice in important economic decisions that don’t come from classic PhD paths,” says Heidi Crebo-Rediker, who was appointed by Hillary Clinton as the State Department’s first chief economist and is now CEO of the DC-based International Capital Strategies.

Movie theaters should jack up prices on Thanksgiving weekend, but they won’t

Retailers aren’t the only ones preparing for a rush of business this holiday season. The Thanksgiving and Christmas holidays are a busy time for the movie business too, with average weekly attendance over Thanksgiving weekend spiking, followed by an even more pronounced surge in attendance during Christmas.

But unlike, say, the airline industry, movie theaters don’t take advantage of this surge in demand by charging their customers more. A 2007 paper published by Barak Orbach and Liran Einav argues that “anecdotal evidence indeed indicates that variable pricing could increase revenues” for theaters. They point to studies conducted in the 1970s showing that lowering prices during weekdays increased box office revenues and concession sales. Theater owners overseas have also had success by discounting tickets during slow periods or charging premiums for big blockbuster movies like Jurassic Park.

But despite this evidence, and the sound economic theory that supports changing prices to reflect waxing and waning demand, theaters in America more or less charge the same price for a ticket regardless of how popular the movie is or when it is playing.

Orbach and Einav explain that this wasn’t always the case. There was widespread use of variable pricing in the industry from the 1910s until the 1950s, and the practice continued in some theaters up until the early 1970s. According to Orbach and Einav, film distributors, which often owned major stakes in the most popular movie theaters, devised a system in which everyone benefitted from variable pricing:

Theaters were classified according to their affiliation, luxuriousness, age, and location. Based on this classification, a “run-clearance-zone” system was established. In any defined geographic location, a given movie played at one theater, and another theater within the same zone could show the same movie only after a defined period lapsed.

Distributors also graded movies as A, B, or C depending on their budgets and the popularity of their leading actors. Admission prices for A movies were higher than for B movies, which in turn cost more than the C variety. These practices created a system where ticket prices varied depending on when and where movies played and how much it cost to make them.

This system began to unravel in the late 1940s, when a recession and the advent of television dealt a huge blow to the movie production industry. “Adjusted to 2002 dollars, box office receipts, which hit a record of $15.6 billion in 1946, bottomed out at just $5.5 billion in 1964, and then gradually climbed to $9.5 billion in 2002,” write Orbach and Einav. The business reacted to this sea change by cutting back on the supply of movies—mostly by eliminating lower-priced B and C movies. In addition, the famous Paramount anti-trust case of 1948 broke up the system that gave movie distributors an interest in the variable pricing scheme.

That case was supposed to dismantle the cozy relationship between movie theaters and distributors, but even today “most theaters are informally affiliated with specific major distributors,” write Orbach and Einav. They find that the Paramount case and state laws aimed at “creating a competitive bidding market in which theaters have access to all distributors” have failed. And the benefits of the system that eventually emerged accrued to movie studios and distributors, rather than the theaters themselves. Theaters make most of their profits from concessions and would benefit from a pricing scheme that kept their houses full no matter the time of day or year.

The reliance on uniform pricing has had some unintended ethical consequences, too. For instance, this weekend, The New York Times’ Ethicist column dealt with a reader who liked purchasing (at a movie theater with reserved seating) an extra seat next to him so that he could enjoy the additional space. The Ethicist frowned on this practice, writing:

If you buy an extra ticket for a movie playing in a theater that’s only half full, you’re actually being more morally conscientious than necessary. Most of the time, finding an empty third seat is not that difficult — but by paying for a third ticket, you are admitting your intentions up front and ensuring that you won’t extract more from this experience than you invested. That’s good citizenship. If the theater is full, however, you should never do this…. You are placing your own gratuitous comfort … above a stranger’s ability to merely experience the same film. Under those conditions, the benefit you are taking for yourself is smaller and less meaningful than what you’re taking away from another person. That’s bad citizenship.

An economist would look at this question differently. He would argue that it’s impossible for an outsider to say one moviegoer’s comfort is less valuable than a stranger’s ability to merely experience the film. What if, for instance, a moviegoer is so anthropophobic that he can’t enjoy a movie without a buffer seat next to him?

Furthermore, the ethicist argues that it’s actually an example of good citizenship to buy an extra seat, but only when the movie isn’t sold out. But in most situations, a movie goer won’t likely know whether a movie will sell out when he is buying his tickets, especially if he buys his tickets well ahead of the showing. By raising prices during times of high demand and lowering them during times of slack demand, theaters would make it less likely for a patron to buy extra comfort in the form of extra seats.

Prices can communicate all sorts of information, but only when they are allowed to change in the face of shifting supply and demand.

These changes in monetary policy sent the Euro below $1.24 in early November, its weakest in over two years, making European goods and tourism less expensive abroad. This did not lead to rejoicing on the streets of European cities, however.

“They ought to encourage consumption to reactivate the economy,” said Luis Carrasco, 42, as he stood in front of a Barcelona restaurant where he is a chef.

Many economists would argue that people like Carrasco do not understand that the ECB is doing just that, albeit indirectly; lowering rates and buying assets should get credit flowing, which in turn should increase house prices and household spending.

But another school is arguing that Carrasco is exactly right, and that what European households need is not contorted monetary policy changes but, rather, an influx of cold, hard cash. The ECB should send each European adult a check, they say, and let him spend it how he likes.

John Muellbauer, a professor of economics at the University of Oxford who recently published a piece titled “Quantitative Easing for the People,” says that sending €500 checks to every EU citizen would cost a sixth of the €1 trillion Draghi talks about spending on asset purchases, and would increase EU consumer spending by €34 billion, or 1.4% of GDP.
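The arithmetic behind these figures can be checked with a quick back-of-the-envelope sketch. The EU adult count and the near-term spending rate below are assumptions chosen here to reconcile the quoted numbers; they are not taken from Muellbauer's paper:

```python
# Back-of-the-envelope check of the handout figures quoted above.
# EU_ADULTS and SPEND_RATE are assumptions for illustration only.

CHECK_PER_ADULT = 500       # euros per person
EU_ADULTS = 334e6           # assumed number of EU adults
ASSET_PURCHASE_PLAN = 1e12  # Draghi's €1 trillion figure

cost = CHECK_PER_ADULT * EU_ADULTS   # total outlay of the handout
ratio = cost / ASSET_PURCHASE_PLAN   # share of the €1 trillion plan

SPEND_RATE = 0.20                    # assumed near-term propensity to spend
boost = SPEND_RATE * cost            # implied rise in consumer spending

print(f"cost: about €{cost / 1e9:.0f} billion ({ratio:.2f} of the asset plan)")
print(f"implied spending boost: about €{boost / 1e9:.0f} billion")
```

Under these assumptions the handout costs roughly a sixth of the asset-purchase plan, and a 20% near-term spending rate, consistent with the rebate studies discussed later, yields a consumption boost in the neighborhood of the €34 billion Muellbauer cites.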

London-based fund manager Eric Lonergan, along with Brown University professor Mark Blyth, wrote the recent Foreign Affairs article “Print Less but Transfer More,” which argues that transfers of between €500 and €1,000 per capita would have a far greater effect on consumption than the ECB’s plans.

“It’s much more efficient and sensible to directly target household spending,” Lonergan says.

While stimulating the economy via society-wide handouts may seem silly or simplistic, it’s an idea with a long history. In the 1936 economics classic The General Theory of Employment, Interest and Money, John Maynard Keynes suggested that the treasury could juice the economy by burying bottles of cash in coal mines and having people dig them up.

And on the other side of the political spectrum, Milton Friedman noted in his essay “The Optimum Quantity of Money” that dropping cash to people from a helicopter would have a similar effect.

“It’s funny that the left and the right are unified on this policy,” jokes Muellbauer.

But while Keynes and Friedman were speaking with tongue in cheek, governments have performed a kind of cash handout in the recent past. During slow economic times in 2001 and 2008, the U.S. gave out tax rebates, giving economists a chance to study the effect handouts would have.

While much economic theory holds that people would not spend the money, the opposite happened. According to a study from the National Bureau of Economic Research (NBER), households spent between 20% and 40% of their rebates on non-durable goods, like food, during the three-month period in which they received their checks, and they spent another third of the rebate during the following quarter. The poor spent more than the rich.

And in 2008, another NBER report found that households spent between 50% and 90% of their rebates during the quarter in which they received them. The spending was on a mix of non-durable and durable goods (particularly cars).
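Taken together, the 2001 figures imply that over half of a typical rebate was spent within six months. A quick calculation using the ranges quoted above:

```python
# Cumulative share of the 2001 rebate spent within six months, using the
# NBER ranges quoted above: 20-40% in the quarter the check arrived,
# plus roughly another third in the following quarter.

first_quarter_low, first_quarter_high = 0.20, 0.40
second_quarter = 1 / 3

low = first_quarter_low + second_quarter
high = first_quarter_high + second_quarter
print(f"roughly {low:.0%} to {high:.0%} of the rebate spent within six months")
```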

Two of the beauties of the giveaway approach, says Lonergan, are that it would be applied equally to all EU citizens, no matter the country, and it would let people do what they want with their money instead of pushing them to buy assets, as cheap credit does.

“If I’m a Spanish or Italian household and I’m in a bad situation, I’ll use it to improve my situation,” he says. “If I have debt, I will use it to repay it. If I have more money, I may take a holiday or put it in a pension fund. Sometimes it’s good to let people do what they want. They know what’s best for them.”

The idea that people would spend the money (instead of putting it under the mattress)—and in a variety of ways—was backed up on the streets of Barcelona.

“In Spain, the majority of us have debts, so we’d use it to pay debt,” said Juan Galera, 53, a self-employed mason working in the Tres Torres neighborhood. “I’d use it to pay one mortgage payment.”

Several blocks away, Pepi, who was cleaning the sidewalk outside of a dental clinic, said if she received a check, she would buy food and use it to pay her rent and electricity bills.

Of course, after years of pushing austerity policies, no one expects the EU’s greatest economic power, Germany, to embrace a handout scheme any time soon. And there is another problem with the plan: while the idea was well received, a €500 handout wasn’t exactly embraced as a panacea.

“Would it solve the crisis? That’s another question. €500 doesn’t go very far,” said Galera.

The great China growth debate: Ripe for a slowdown or full speed ahead?

With Europe mired in what can now be fairly described as a depression, and the United States barely growing above its expected, long-term trend, economists are desperate to find some other source of global economic growth that can be counted on to push the world forward.

That’s one reason why economists have been paying such close attention to China lately. The Chinese economy is a fantastic success story, growing by roughly 10% a year from 1980 until 2010. Growth has slowed somewhat in the past few years, but it’s still coming in at somewhere between 7% and 8% annually. And that’s for an economy that already contributes more than $9 trillion a year to global output, according to World Bank estimates, the second-largest contribution after the United States.

China is integral to global growth, and that’s one reason why people took notice when Larry Summers published a working paper in October arguing that projections for the Chinese economy were way off the mark. The noted economist and former Treasury Secretary wrote that China’s long run of outstanding growth will likely come to an end soon. Summers points to two trends to support his argument. The first is that after 30 years of truly outstanding growth, it would be anomalous for the Chinese economy to continue to grow well above the world average as mainstream organizations like the World Bank predict it will. In the paper, which Summers co-wrote with economist Lant Pritchett, the authors write:

Many of the great economic forecasting errors of the past half century came from excessive extrapolation of performance in the recent past and treating a country’s growth rate as a permanent characteristic rather than a transient condition. Paul Samuelson’s textbook predicted in 1961 that there was a substantial chance that the USSR would overtake the United States economically by the 1980s. There was a widespread view right up until the end of the 1980s that Japan would continue to grow and outcompete the world. Or in the opposite direction, consider the pervasive pessimism of even a decade ago regarding Africa. Since then, African countries emerged as a majority of the world’s most rapidly growing nations.

With this in mind, Summers and Pritchett analyze the growth patterns of a broad collection of developing economies, finding that it is much more common for a country to grow slowly after a long period of rapid growth than for that country to continue to perform as it had been. Furthermore, they find that it is much more likely for wealthy countries to be democratic than authoritarian. The only country in recent history to achieve the status of a developed nation while remaining relatively authoritarian is Singapore, a nation of just 5.4 million people. Summers argues that it is doubtful that a country of 1.3 billion people like China could emulate a tiny island state and, in any case, even Singapore remains far more democratic than China by most measures.

In other words, for China to reach the 6.6% per capita growth the World Bank expects it to achieve between 2011 and 2033, it would have to do something no economy has ever done before. It’s possible that this will happen, but how likely is it really?
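To see how demanding that projection is, simple compounding shows what 6.6% per capita growth sustained over the 2011–2033 window would imply:

```python
# What sustained 6.6% annual per capita growth implies over 2011-2033.
# Simple compounding; the rate and time span come from the World Bank
# projection cited above.

rate = 0.066
years = 2033 - 2011  # 22 years

factor = (1 + rate) ** years
print(f"per capita income multiplies by about {factor:.1f}x over {years} years")
```

Under the projection, Chinese per capita income would roughly quadruple in a little over two decades, on top of three decades of roughly 10% annual growth already achieved.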

There are many economists, however, who are dubious of these doubters. They argue that China is a special case. Australian economist Stephen Grenville of the Lowy Institute argues that it’s not wise to predict the future growth of China based on experiences of economies so vastly different from it. “There is a range of experiences—often widely different—hidden within the average. True, Brazil had more than two decades with no growth at all in per capita income. On the other hand Taiwan, Singapore, South Korea and Hong Kong (and earlier, Japan) had quite long periods of fairly sustained economic growth which have taken them to high levels of per capita income. What relevance does Brazil’s failed growth experience have for China?”

Carl Weinberg, chief economist at High Frequency Economics, sees no reason for Chinese growth to slow down markedly in the near future. He argues that China perfectly fits the model of a “modernizing” economy, one that is defined by a growing population; sustained productivity growth; a rapid growth in urban relative to rural population; social changes like secularization; and increased trade. China, says Weinberg, is a special case because of how closely it resembles other modernizing economies, like the United States, during their rise. “China is a modernizing economy,” he wrote in a recent note to clients. “Most of the countries in Larry [Summers’] sample are not. China’s experience is different.”

Weinberg argues that, while China might not be a model of democracy, its government has made progress in allowing more freedom, especially when it comes to private enterprise. As the country continues to urbanize and per capita income continues to rise, there’s reason to believe this process of liberalization will continue.

In previous years, this might have been an academic discussion, pertinent to economists and a few investors but not to the wider population. But as China’s economy continues to grow, what happens in China will increasingly affect the entire world. It’s expected that sometime in the next few years, China will become the largest economy in the world, and the difference between 6% and 2% yearly growth there could spell the difference between a strong and a stagnant economy in the U.S.

Wealth inequality in America: It’s worse than you think

For the true believers in laissez-faire economic policy, the recent and ongoing national discussion over income and wealth inequality probably seems like a cynical ploy by those on the left to gain a political advantage. After all, if rising inequality is a problem, you would be hard pressed to find any solutions offered by the right wing.

It would be laughable to argue that left-leaning politicians aren’t using the issue for political advantage. But focusing on that alone misses one of the main reasons we have begun to pay more attention to inequality: we now have better tools for measuring and understanding it than ever before. This is thanks to the work of economists like Emmanuel Saez and Gabriel Zucman, who have dedicated their careers to compiling and analyzing wealth and income data. Without these numbers, advocates for a concerted effort to combat inequality would have no foundation for their argument.

Saez and Zucman released another working paper this week, which uses capitalized income data to trace how wealth inequality, rather than income inequality, has evolved in America since 1913. (Income inequality describes the gap in how much individuals earn from the work they do and the investments they make. Wealth inequality measures the difference in how much money and other assets individuals have accumulated altogether.) In a blog post at the London School of Economics explaining the paper, Saez and Zucman write:

There is no dispute that income inequality has been on the rise in the United States for the past four decades. The share of total income earned by the top 1 percent of families was less than 10 percent in the late 1970s but now exceeds 20 percent as of the end of 2012. A large portion of this increase is due to an upsurge in the labor incomes earned by senior company executives and successful entrepreneurs. But is the rise in U.S. economic inequality purely a matter of rising labor compensation at the top, or did wealth inequality rise as well?

The advent of the income tax has made measuring income much easier for economists, but measuring wealth is not as easy. To solve the problem of not having detailed government records of wealth, Saez and Zucman developed a method of capitalizing income records to estimate wealth distribution. They write:

Wealth inequality, it turns out, has followed a spectacular U-shape evolution over the past 100 years. From the Great Depression in the 1930s through the late 1970s there was a substantial democratization of wealth. The trend then inverted, with the share of total household wealth owned by the top 0.1 percent increasing to 22 percent in 2012 from 7 percent in the late 1970s. The top 0.1 percent includes 160,000 families with total net assets of more than $20 million in 2012.
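The logic of the capitalization method can be illustrated with a toy sketch: take the capital income flows reported on a tax return and divide each by an assumed aggregate rate of return for its asset class. The asset classes, yields, and income figures below are entirely hypothetical and are not Saez and Zucman's actual capitalization factors:

```python
# Toy illustration of the capitalization method: infer wealth from
# reported capital income by dividing each income flow by an assumed
# aggregate rate of return for its asset class. All figures here are
# hypothetical, for illustration only.

# Assumed aggregate rates of return by asset class (hypothetical)
RETURNS = {"equities": 0.025, "bonds": 0.04, "real_estate": 0.05}

def capitalize(capital_income: dict) -> float:
    """Estimate total wealth from {asset_class: annual capital income}."""
    return sum(income / RETURNS[asset] for asset, income in capital_income.items())

# A hypothetical tax return: $25,000 in dividends, $8,000 in interest,
# $15,000 in net rents.
household = {"equities": 25_000, "bonds": 8_000, "real_estate": 15_000}
wealth = capitalize(household)
print(f"estimated wealth: ${wealth:,.0f}")
```

In this toy case the method attributes $1 million of equities, $200,000 of bonds, and $300,000 of real estate to the household; applied across all tax returns, the same scaling yields an estimated distribution of wealth.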

Saez and Zucman show that, in America, the wealthiest 160,000 families own as much wealth as the poorest 145 million families, and that wealth is about 10 times as unequal as income. They argue that the drastic rise in wealth inequality has occurred for the same reasons as income inequality; namely, the trend of making taxes less progressive since the 1970s, and a changing job market that has forced many blue collar workers to compete with cheaper labor abroad. But wealth inequality specifically is affected by a lack of saving by the middle class. Stagnant wage growth makes it difficult for middle and lower class workers to set aside money, but Saez and Zucman argue that the trend could also be a product of the ease with which people are able to take on debt, writing:

Financial deregulation may have expanded borrowing opportunities (through consumer credit, home equity loans, subprime mortgages) and in some cases might have left consumers insufficiently protected against some forms of predatory lending. In that case, greater consumer protection and financial regulation could help increasing middle-class saving. Tuition increases may have increased student loans, in which case limits to university tuition fees may have a role to play.

So, why should we care that wealth inequality is so much greater than even the historic levels of income inequality? While inequality is a natural result of competitive, capitalist economies, there’s plenty of evidence that extreme levels of inequality are bad for business. For instance, retailers are once again bracing for a miserable holiday shopping season, due mostly to the fact that most Americans simply aren’t seeing their incomes rise and have learned their lesson about the consequences of augmenting their income with debt. Unless your business caters to the richest of the rich, opportunities for real growth are scarce.

Furthermore, there’s reason to believe that such levels of inequality can have even worse consequences. The late historian Tony Judt addressed these effects in Ill Fares the Land, a book on the consequences of the financial crisis, writing:

There has been a collapse in intergenerational mobility: in contrast to their parents and grandparents, children today in the UK as in the US have very little expectation of improving upon the condition into which they were born. The poor stay poor. Economic disadvantage for the overwhelming majority translates into ill health, missed educational opportunity, and—increasingly—the familiar symptoms of depression: alcoholism, obesity, gambling, and minor criminality.

In other words, there’s evidence that rising inequality and many other intractable social problems are related. Not only is rising inequality bad for business, it’s bad for society, too.