America's economy is doing much better than you think
What if we're measuring GDP all wrong?
You could be forgiven for thinking that America's disappointing economic recovery is even worse than we thought.
After all, GDP, adjusted for inflation, grew at an annual average of just 2 percent over the past two years, according to revised government figures released Thursday. That's down from the prior 2.3 percent estimate, which itself was pretty lousy. And so far this year, growth has been even weaker: just a 1.5 percent pace.
By comparison, average GDP growth since World War II has been just over 3 percent, and 4 percent for the average post-1960 recovery. No wonder some economists think the Great Recession has given way to the Great Stagnation.
Unless it hasn't. What if things are actually a lot better than we think — or at least better than GDP figures suggest?
Think about it: Month after month, the economy is generating about a quarter million net new jobs. The unemployment rate is close to 5 percent. Corporate profit margins are at record highs, with stock values not far behind. And Silicon Valley is on fire. A new TechCrunch analysis finds that the number of unicorns — technology startups valued at over $1 billion — has more than doubled since 2013. Europe would love to have a "stagnant" economy like America's.
So why then do the all-important GDP numbers — the broadest measures of economic activity — show a perpetual funk? As The Week's Ryan Cooper explained earlier this week, measured U.S. productivity growth has been terrible during the recovery. And if output per worker isn't rising much, if at all, GDP growth is bound to be weak, too. Even worse, productivity growth has been subpar since 2004, giving little sign that the supposed IT revolution — apps, big data, digital content, social networks — is having much economic impact.
It's a puzzle for which Goldman Sachs has a simple answer: We are measuring productivity wrong, and therefore we are measuring GDP wrong. A metric devised for America's 1930s "steel-and-wheat" economy, in the words of economic historian Joel Mokyr, doesn't work so well for a rapidly growing digital economy. In a recent report, Goldman economists Jan Hatzius and Kris Dawsey note that prices of tech hardware, adjusted for quality improvements, have fallen a lot faster than those for software. Taken at face value, that gap suggests software isn't improving much.
But Goldman thinks this gap is a "statistical mirage" reflecting the "amorphous" nature of software improvements. Hatzius and Dawsey ask: "How much better are the inventory management systems that retail companies contract out or develop for their own account compared with those of 20 years ago? How much better is Grand Theft Auto V than Grand Theft Auto IV? And how much more value do we now derive from our internet connection compared with a decade ago?"
By the Goldman economists' reckoning, then, U.S. inflation is lower than we think thanks to sharply falling, quality-adjusted IT hardware and software prices, and thus real economic growth and productivity are higher. GDP growth might actually be close to 3 percent right now, which would be more in sync with what's happening in labor markets and the tech sector. Oh, and it also means real incomes are growing faster than we think, which is why the economists are "skeptical of confident pronouncements" that American living standards aren't improving as fast as they used to. By the way, new analysis by the Peterson Institute suggests worker incomes have pretty much been keeping up with productivity gains. So perhaps more good news for the 99 percent.
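The mechanism here is simple deflator arithmetic: real growth is nominal growth divided out by measured inflation, so if official price indexes overstate inflation by about a percentage point, measured real growth understates true growth by about the same amount. A minimal sketch of that identity, using hypothetical numbers (these are illustrative, not Goldman's actual figures):

```python
# Illustrative arithmetic only: the growth rates and the assumed
# one-point overstatement of inflation are hypothetical examples,
# not figures from the Goldman report.

def real_growth(nominal_growth: float, inflation: float) -> float:
    """Real growth implied by nominal growth and a price deflator:
    (1 + nominal) / (1 + inflation) - 1."""
    return (1 + nominal_growth) / (1 + inflation) - 1

nominal = 0.035             # assumed nominal GDP growth: 3.5%
measured_inflation = 0.015  # assumed official deflator: 1.5%
true_inflation = 0.005      # assumed deflator if quality-adjusted
                            # IT prices fell faster: 0.5%

official_real = real_growth(nominal, measured_inflation)
adjusted_real = real_growth(nominal, true_inflation)

print(f"Real growth, official deflator: {official_real:.1%}")  # ~2.0%
print(f"Real growth, adjusted deflator: {adjusted_real:.1%}")  # ~3.0%
```

Under these assumed numbers, shaving one point off measured inflation is enough to move real growth from the sluggish 2 percent in the revised data to the roughly 3 percent figure Goldman suggests.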
Now, even if Goldman's analysis is dead on, it doesn't mean policymakers should sit on their hands. For the U.S. economy to grow as fast in the future as it did in the past, productivity will have to grow even faster to offset slowing population growth. That means deep reform of our tax, regulatory, and education systems. It means more public investment in infrastructure and science. And a lot more of those unicorns.