Friday, March 31, 2017

Robuts takin' jerbs

One advantage of writing down models in math, even if you can't test them, is that you make the ideas concrete. For example, take a recent exchange between Ryan Avent and Paul Krugman. Avent is trying to explain how robots could be taking jobs even while productivity is slowing down. Lots of people have made some variant of the argument: "If robots are taking our jobs, how come productivity growth is so slow?" Here's Avent's theory:

The digital revolution has created an enormous rise in the amount of effective labour available to firms. It has created an abundance of labour...

How does automation contribute to this abundance of labour?...[T]here’s a...straightforward and important way in which automation adds to abundance now. When a machine displaces a person, the person doesn’t immediately cease to be in the labour force...

In some cases workers can transition easily from the job from which they’ve been displaced into another. But often that isn’t possible...Such workers find themselves competing for work with many other people with modest skill levels, and with technology: adding to the abundance of labour.

Second, the abundance of labour, and downward pressure on wages, reduces the incentive to invest in new labour-saving technologies...

Third, the abundance of labour destroys worker bargaining power...

Fourth, low wages and a falling labour share lead to a misfiring macroeconomy...

So there you are: continued high levels of employment with weak growth in wages and productivity is not evidence of disappointing technological progress; it is what you’d expect to see if technological progress were occurring rapidly in a world where thin safety nets mean that dropping out of the labour force leads to a life of poverty.

Some pieces of this I get, others I don't. Reduced bargaining power explains the wage stagnation piece, but doesn't explain how wage stagnation goes along with slow productivity growth.

The macroeconomic piece makes sense if you believe in some form of Verdoorn's Law, but it seems like it would be self-limiting - if faster technological progress led to downward pressure on wages which led to deeper recessions which led to slower productivity growth, then recessions would slow down robotization and allow wages to rise again (maybe that's what's happening now!).

The same is true of the long-term innovation argument. If fast technological progress in robots displaces workers, lowers wages, and reduces the incentive for more robot innovation, productivity will then slow and workers will catch a break.

The first part of the argument is the most interesting, but also the hardest to understand when written in words. Is this just Baumol Cost Disease? If so, why is it happening now when it didn't happen before?

Fortunately, Paul Krugman came in and formalized this part of Avent's argument. I'm not sure Krugman exactly captured what Avent was trying to say, but Krugman's model is simple and interesting in its own right. It's a powerful argument for the "models as simple formalized thought devices" approach to econ that I sometimes pooh-pooh.

Krugman's model is of an economy with 1 good and 2 production processes - one capital-intensive, one labor-intensive:

You can make stuff with more machines with technique A, or with more people with technique B.

Now suppose there's gradual progress in technique A, but not B - it gets cheaper to make things with a lot of robots, but not cheaper to make them with a lot of humans. As Krugman shows, this will lead to falling wages.

It will also lead to more workers going into the labor-intensive B sector, which is the kind of shift Avent seems to be talking about.
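Krugman's mechanism can be checked with a minimal numerical sketch. In his setup each technique is a linear activity, so when both techniques are in use and break even, the wage and the rental rate of capital are pinned down by two zero-profit conditions. The input coefficients below are my own illustrative numbers, not Krugman's:

```python
import numpy as np

def factor_prices(lA, kA, lB, kB, price=1.0):
    """Solve the zero-profit conditions w*l + r*k = price for both techniques.

    When techniques A and B are both in use, the wage w and rental rate r
    must make each technique exactly break even."""
    coeffs = np.array([[lA, kA],
                       [lB, kB]])
    w, r = np.linalg.solve(coeffs, np.array([price, price]))
    return w, r

# Before progress in A: symmetric (made-up) techniques.
# A is capital-intensive, B is labor-intensive.
w0, r0 = factor_prices(lA=1.0, kA=4.0, lB=4.0, kB=1.0)

# After progress in A: A needs 20% less of both inputs; B is unchanged.
w1, r1 = factor_prices(lA=0.8, kA=3.2, lB=4.0, kB=1.0)

print(f"wage: {w0:.3f} -> {w1:.3f}")   # the wage falls
print(f"rent: {r0:.3f} -> {r1:.3f}")   # the return to capital rises
```

With these numbers the wage drops from 0.200 to about 0.183 while the rental rate rises, which is exactly the falling-wages result described above.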

OK, so this explains how robots replacing humans could be consistent with both slow productivity growth and slow or negative wage growth. But what it doesn't (yet) explain is the change in the two trends - how faster robotization could have led to both slowing productivity growth and slowing wage growth at the same time. Note that in Krugman's model, if technological progress in technique A speeds up, then wage growth slows down (actually, gets more negative), but overall productivity growth in the economy speeds up.

In other words, a productivity speedup in the "robot" sector can't cause overall economy-wide productivity growth to go down in this model. But what can? Answer: A slowdown in productivity growth in technique B.

Suppose productivity in A and B are growing at the same rate initially, so that the isoquant is just sliding in toward the origin. Wages and productivity are both going up. Then B suddenly stops getting better, but A keeps on getting better. Now economy-wide productivity slows down, and wage growth slows down or goes negative.

So in this model it wouldn't be better robots that made economy-wide productivity and wages go down. It would be a slowdown in the technologies that allowed humans to compete with the robots.
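The two scenarios can be contrasted with a small linear program: given fixed endowments of labor and capital, total output is the most the economy can produce mixing techniques A and B. The endowments and coefficients are again my own illustrative numbers:

```python
from scipy.optimize import linprog

L_BAR, K_BAR = 100.0, 100.0  # fixed (made-up) endowments of labor and capital

def gdp(lA, kA, lB, kB):
    """Max output from mixing techniques A and B, given the endowments.

    Choose quantities x_A, x_B >= 0 to maximize x_A + x_B subject to the
    labor and capital constraints (a tiny linear program)."""
    res = linprog(c=[-1.0, -1.0],                # minimize -(x_A + x_B)
                  A_ub=[[lA, lB],                # labor used <= L_BAR
                        [kA, kB]],               # capital used <= K_BAR
                  b_ub=[L_BAR, K_BAR])
    return -res.fun

base   = gdp(1.0, 4.0, 4.0, 1.0)   # initial techniques
both   = gdp(0.8, 3.2, 3.2, 0.8)   # A and B both improve 20%
only_a = gdp(0.8, 3.2, 4.0, 1.0)   # progress in A only; B stalls

print(base, both, only_a)
```

With these numbers, output rises from 40 to 50 when both techniques improve, but only to 45 when B stalls - and, per the zero-profit conditions, the B-stalls case is also the one where the wage falls. Slower progress in B means slower economy-wide productivity growth plus wage stagnation, together.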

This would be hard to verify empirically, since identifying human-complementing productivity growth and human-substituting productivity growth would require some major theoretical assumptions. But just casually glancing at the data, we can see that there are other productivity divergences in the economy that might be roughly analogous. For example, durables TFP has grown faster than nondurables TFP since the early 70s.

So if we think durables production is a lot more capital-intensive than nondurables production (which includes a lot of services), this could be a sign that a slowdown in labor-intensive production processes is generally underway (maybe as a result of expensive energy? top-out of education? some government policy?).

Anyway, so I think in order to make Avent's thesis work in Krugman's model - in order to make robotization be the cause of both slowing productivity growth and of wage stagnation - there has to be a slowdown in non-robot technology going on.

Interestingly, while writing this, I thought of a way that the Avent thesis could be combined with the "industrial concentration" hypothesis of Autor et al. Autor et al. cast doubt on the "robotization" hypothesis by observing that labor's share of income isn't increasing within firms:

Since changes in relative factor prices tend to be similar across firms, lower relative equipment prices should lead to greater capital adoption and falling labor shares in all firms. In Autor et al. (2017) we find the opposite: the unweighted mean labor share across firms has not increased much since 1982. Thus, the average firm shows little decline in its labor share. To explain the decline in the aggregate labor share, one must study the reallocation of activity among heterogeneous firms toward firms with low and declining labor shares.

Autor et al. suggest that a few "superstar" firms are increasingly dominating their industries, causing profits to rise even as increased monopoly power shrinks markets and reduces measured productivity.

So how could this be reconciled with the Avent "robotization" theory? Well, take a look at Krugman's model. And now imagine that the "superstar" firms are the firms that use technique A, while the laggard firms use B. Krugman's theory doesn't include monopolistic competition, but it's easy to imagine that A, the capital-intensive technique, might have economies of scale that make A-using firms bigger and fewer than B-using firms. So a slowdown in progress in B would lead to a shift of resources toward the A firms, causing increased industrial concentration and a lower overall labor share without affecting the labor share within each firm - exactly what Autor finds. And it would do this while also causing economy-wide productivity to slow down and wages to stagnate.
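The reallocation mechanism is easy to see in a toy decomposition (hypothetical numbers, mine rather than Autor et al.'s): hold each firm's labor share fixed and shift value-added weight toward the low-share "superstar" firm, and the aggregate labor share falls even though no firm's share moves.

```python
# Within-firm labor shares, held constant (hypothetical values):
labor_share = {"A_firm": 0.3,   # capital-intensive "superstar"
               "B_firm": 0.7}   # labor-intensive laggard

def aggregate_share(weights):
    """Value-added-weighted average of the (fixed) firm-level labor shares."""
    return sum(weights[f] * labor_share[f] for f in labor_share)

before = aggregate_share({"A_firm": 0.5, "B_firm": 0.5})   # approx. 0.50
after  = aggregate_share({"A_firm": 0.8, "B_firm": 0.2})   # approx. 0.38

print(before, after)
```

The unweighted mean across firms stays at 0.5 throughout, which is the Autor et al. finding: the aggregate decline comes entirely from the shift in weights, not from within-firm changes.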

That's neat. But it still means that, fundamentally, a technology slowdown rather than a speedup would be the root cause of the economy's problems - not a rise of the robots, but a world where robots are the only thing still rising.

18 comments:

Wages go down which would affect aggregate demand except that State intervenes to prop up demand through low interest rates that inflate asset prices creating 'wealth effect' to mop up excess supply. Rich get richer; poor get poorer and economists draw irrelevant diagrams to try to figure out what's happening.

No, I don't think so. Because curves are clearly labeled, it's very clear what they're *not* including (or abstracting away from). If you want to include things like non-rational choice, institutions, and power, you can draw curves for those, if you like.

Imagine extremely low productivity sector C which is just not done if wages are above a threshold. (Avent's 5th landscaper for a tech magnate.)

And even lower productivity sector D which hadn't even been considered until wages were at an even lower threshold, and would never be automated because it's of no value other than signalling, and a tech solution wouldn't signal. (Lewis Black's ball washer: https://www.youtube.com/watch?v=5T8Gxk7vbec)

I haven't studied much econ since the mid-2000s. I'm confused by what you mean by slowing productivity. Do you mean output per worker? Because wouldn't the type A firms be highly productive on a per worker basis? Or is it partially a measurement problem, like google searches aren't counted in GDP?

I am hopeful that you'll respond to one simple question about the "productivity paradox". It's very unrelated to your hypothesis, which I may or may not grok - it's an alternative explanation for the combination of:

To me, this is very simply explained by the effects of automation on demand, as described in "The Lights in the Tunnel" by Martin Ford. Quite simply:

1. Firms lower the cost of their products by a fraction by eliminating workers and replacing them with robots/poorer workers in China. The difference is passed to investors.

2. Former workers receive less income, buy fewer things. Furthermore, the increased income for the investor circulates less than the lost income did for the worker. Result: decreased net consumption, at least in first-world countries.

3. Reduced consumption leads to... reduced production! We had a giant recession that, sure, was nominally caused by the financial crisis, but that could also be seen as a reversion to a mean of lower growth, i.e. lower production. The point is that lower consumption almost has to create lower production.

4. If net demand and consumption fall as fast as you automate, then you never see any gain in output per worker. You're constantly decreasing the number of workers you need to produce a constantly decreasing supply of goods. Productivity is flat.

This seems particularly obvious to me in a world where consumption is highly digital and depends almost *entirely* on demand, i.e. disposable income.

In short, high productivity growth isn't even possible in practical terms without rising demand, which isn't possible in neutral credit terms without rising wages or liquid compensation. Roughly.

Ryan Avent briefly mentions this part in a paragraph here: https://medium.com/@ryanavent_93844/the-productivity-paradox-aaf05e5e4aad

But why doesn't this completely address the "puzzle"? Why isn't this the whole story? Why do we need other explanations at all?

This is the kind of thing that economists typically use "general equilibrium" to think about.

K, so suppose automation & globalization allow us to produce more with less. Demand falls, so we produce less. BUT, since we can now produce more with less, the INPUTS we use to produce stuff must fall even more than the OUTPUT falls. So measured productivity should still go up, even if GDP slows down.
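In toy numbers (hypothetical magnitudes, not from the thread), the claim is just a ratio: even if output falls, inputs fall by more, so measured productivity rises.

```python
# Hypothetical magnitudes: automation lets us produce more with less,
# demand falls, so output falls 10% - but inputs fall 25%.
output_before, inputs_before = 100.0, 100.0
output_after,  inputs_after  = 90.0,  75.0

prod_before = output_before / inputs_before   # 1.0
prod_after  = output_after / inputs_after     # 1.2

print(prod_before, prod_after)   # measured productivity rises despite lower output
```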

Okay, I follow this argument. But I'm not sure that the hypothetical is as true in practice as it is easy to say? For example, we know that wages are 'sticky', and employment - and choices about level of employment in a firm - may also be sticky. I.e., there may be a finite amount of tolerance for the second stage of input reduction.

Specifically, here: the INPUTS we use to produce stuff must fall even more than the OUTPUT falls.

This feels more like a model assumption to me than something empirical - an assumption that all firms choose to operate at the maximum productivity frontier at all times.

After all, shadow of the future exists. It seems equally plausible that this alternative happens:

1) Firm automates, reduces headcount by X1 over a five-year period, expecting static output.
2) Many other firms also do X over the same period, so that net consumption falls.
3) Production falls in correspondence with consumption, wiping out productivity gains.
4) Employers, already highly profitable even at the reduced production frontier, choose not to further reduce headcount for many reasons. They're sick of layoffs; they want to wait for a recovery; they don't know how to reduce headcount any further without suffering quality risks that undermine market share, etc.

The whole concept seems to fundamentally misunderstand the relationship between labor and output. Any given automation allows you to reduce headcount by a fixed amount, to some extent irrespective of the quantity of your output. (Not completely, but you could say it permanently lowers the ratio of labor to output.) But while reducing your first 10% in labor may be easy thanks to automation, the second 20% may be completely impossible at any level of demand.

Basically, I have no idea why the inputs to production fall a second time in response to the output falling. It seems to assume a completely linear feasibility of input reduction along the entire frontier of firm labor. I'm pretty sure there's a lot of rigidities in there.

I'm very strongly convinced that rising productivity, as measured, is strongly empirically correlated with overall consumption growth. It would be something to look into, wouldn't it?

Taking this argument even further -- if the output of technique A improves at a much faster rate than that of technique B, wouldn't that ultimately reduce the amount of effort going into improving output for technique B?

Take journalism for example. We could work on an AI that can generate articles on its own. Or we could have writers work harder and crank out articles non-stop, 24 hours a day, 365 days a year. Labor would lose out unless there's some way to augment its production (for example, a technology that can automatically generate articles based on neural activities).

But if an AI writer is a realistic possibility, why bother investing in technology to boost labor's output? At that point, won't the role of labor be reduced to a stopgap until the robots get perfected? Especially when robots don't require wages like workers do.

So is it possible to reach a point where capital intensive production technology will always beat out labor intensive production technology? If so, how close are we to that point?

Sometime back in the 60's a friend of mine was on a mission studying the Brazilian use of concrete. He found that the cost of a poured yard of concrete was the same as in the U.S. The cost of concrete workers in Brazil was somewhere around 25 cents an hour. I don't remember U.S. rates at that time but it must have been near or greater than $2.50 per hour. Brazil had and still has lots of undifferentiated labor.

Another story. A commenter at CR's blog was involved in dealing with a Chinese supplier of a cast part. It required precision machining to be finished. He recommended using a precision casting that would not require additional machining. The cost of the precision die was around $35k. The supplier laughed as his cost for the extra machining was 5 cents.

I think in one sense a company can be operating on both of Krugman's curves. As a supply chain executive, half of my change initiatives were capital investment / process improvement activities that increased productivity: things like automated picking, more efficient consolidation algorithms, quality training.

The remaining initiatives were labor arbitrage: outsourcing delivery, bringing in temps for spikes instead of carrying more employees, shifting manufacturing to lower cost labor regions. Even though I knew in all these cases we were losing productivity per hour, the lower labor rates more than made up for it.

In my distribution center, robots took jobs as we invested, automated and improved processes. In my delivery network and manufacturing, lower wage workers took jobs from higher wage workers, and the productivity loss was more than covered by the wage difference.
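The arbitrage logic above is a unit-cost comparison, which toy numbers (mine, purely illustrative) make concrete: unit labor cost is the wage divided by output per hour, so a big enough wage gap beats a productivity loss.

```python
def unit_labor_cost(wage_per_hour, units_per_hour):
    """Cost of labor per unit produced."""
    return wage_per_hour / units_per_hour

# Hypothetical figures: the low-wage option is less productive per hour,
# but the wage difference more than makes up for it.
high_wage_cost = unit_labor_cost(wage_per_hour=25.0, units_per_hour=10.0)  # $2.50/unit
low_wage_cost  = unit_labor_cost(wage_per_hour=5.0,  units_per_hour=4.0)   # $1.25/unit

print(high_wage_cost, low_wage_cost)
```

Measured productivity falls from 10 to 4 units per hour, yet the firm's cost per unit is cut in half - so the profit-maximizing choice drags measured productivity down.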

Why did productivity decline from the 80s on? Innovation (and regulatory and societal acceptance) of labor arbitrage was a big part of the answer where I worked. It will be interesting to see if the $15 minimum wage experiment really does spur fast food to invest in capital improvements to increase labor productivity. Owners have been threatening that is the only way they can pay those wages and still make an acceptable return. Sounds less like a threat and more like a virtuous cycle to me.

A) Investment fell to all-time lows during the recession: https://fred.stlouisfed.org/series/A006RE1Q156NBEA

and

B) Educational attainment has stalled.

Seems to me that is a better theory.

And why would companies not be investing as much, especially in an era when capital is plentiful and cheap? One theory: market concentration / oligopolies. A really interesting study would be looking at productivity growth of concentrated vs. non-concentrated industries.

By the way, the model you just made provides a great alternative to the false theory of comparative advantage... you see all the capital flowing to the leader "A" firms and away from the "B" firms... in every industry.