Above: A screenshot of the first page of google image results for “productivity”, illustrating that no one has any idea how to illustrate the idea of “productivity.”
As I mentioned on the About page, one of my interests in writing this blog is not only tracking the development of these new technologies themselves but also examining their impact on the economy as a whole, because for my money that's where things really start to get interesting. So I wanted to use this note to talk about something that I'm sure will come up again and again: the productivity paradox.
It may be obvious to us laymen that any machine smart enough to beat Ken Jennings at Jeopardy could probably come for a few of our gigs as well. But the possibility that all this shiny new tech that we’ve seen coming at us in the past few years — smartphones, voice recognition, deep machine learning, bots — could actually be taking over human labor is one that a lot of mainstream economists have greeted with skepticism. (See, for example, this recent post by Kevin Cashman at the Center for Economic and Policy Research.)
The cause of their skepticism comes down to one word: productivity, that is, how much economically valuable stuff a worker/firm/economy is able to produce over a given amount of time. If a widget factory installs a machine which enables the workers on its assembly line to produce 10% more widgets a day, those workers have become more productive. (Say they go from 10,000 to 11,000 widgets a day while employing 100 workers: from 100 to 110 widgets per worker.)
And if a widget factory installs a machine which allows it to produce the same number of widgets per day using 10% fewer workers, well, tough luck for the guys who got laid off, but the remaining ones are also more productive. (10,000 widgets produced by 90 workers instead of 100, or about 111 widgets per worker.)
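The widget arithmetic above can be sketched in a few lines. (A toy illustration using the numbers from the text; the function name is mine, not anything from an economics library.)

```python
def productivity(widgets_per_day: float, workers: int) -> float:
    """Output per worker per day: the simple productivity measure used above."""
    return widgets_per_day / workers

# Baseline factory: 10,000 widgets a day, 100 workers.
baseline = productivity(10_000, 100)        # 100 widgets per worker

# Case 1: a new machine boosts output 10% with the same workforce.
more_output = productivity(11_000, 100)     # 110 widgets per worker

# Case 2: same output, but 10% fewer workers after layoffs.
fewer_workers = productivity(10_000, 90)    # ~111 widgets per worker

print(baseline, more_output, round(fewer_workers, 1))
```

Either way the measured productivity number rises, which is why economists expect labor-replacing automation to show up in the productivity statistics.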
In the economy as a whole, then, if smart machines were taking over for labor, we might see greater unemployment. But we should also see increasing productivity, as fewer and fewer workers could produce the same amount of economically valuable stuff.
But we don’t. At least, not over the past few years. And that’s weird. Take, for example, this chart about increases in U.S. manufacturing productivity:
Even though bots these days are wandering pretty far from the factory floor, manufacturing is still the sector of the economy with the most experience in integrating robots and automation, and if we were going to see huge productivity increases from the adoption of these technologies in the past half-decade or so, you’d think we’d see it there first. But no soap. Compared to the 1990s, back when the internet was getting invented and Excel was taking over, the gains in productivity in recent years have been much lower.
Even worse, while historically gains in productivity have moved in step with gains for workers, over the past couple of decades the two have begun to split apart, with productivity still rising (albeit more slowly than in the past) while wages and employment have stayed stuck.
There’s a bunch of competing explanations out there for this phenomenon, some of which kind of fit together, some of which don’t: Robert Gordon has suggested that as cool as iPhones and Facebook are and whatnot, they’re not actually as useful to the economy as, say, cars, or jet planes, or hell, indoor plumbing and electricity, and so maybe the economy is just doomed to grow a lot slower for the foreseeable future. Larry Summers of Harvard has advanced the idea of “secular stagnation” — which suggests* that maybe companies just aren’t seeing many new widget machines out there to buy that can give them a 10% improvement to their assembly lines, and so they’re hoarding their money (or worse, investing it in shaky schemes which help create bubbles), and the government can’t do much to make them spend it.
Others take a more optimistic view, suggesting that perhaps the effects of these new technologies simply take time to ripple outward into the economy in ways that boost productivity. In their book The Second Machine Age, Erik Brynjolfsson and Andrew McAfee relate an anecdote about the introduction of more efficient electric engines, replacing steam, in factories in the early 20th century. The engines themselves were more efficient, which helped boost productivity some. But the real gains came when people realized that electric engines allowed them to rearrange the layout of the factories themselves — with steam engines, the most power-hungry machines had to be closest to the engine, since the amount of power a steam engine could deliver fell rapidly with distance. But electric engines could deliver just as much power across the factory floor as they could nearby, allowing assembly lines to be redesigned into more efficient layouts and larger factories to be constructed, innovations which improved productivity much more than the new engines alone. Indeed, a new article out today from the Centre for Economic Performance suggests that this may be the case (full article here; recap for non-economists at the Harvard Business Review’s blog).
One of the other explanations for the persistently sluggish economy of the past several years and stuff like the productivity paradox and the decoupling of productivity and employment is that, uh, maybe economists just don’t understand the macroeconomy as well as they thought they did. And so they’re going to have a hell of a time figuring out what the impact of truly revolutionary technologies will be.
* In truth, it’s a lot more complicated than all this. For a really full discussion — book length, in fact — of secular stagnation and what might be causing it, see here.