Why are analysts crowing about our crummy job market?
Memo to the 1 percent: The economy still sucks
Friday's jobs report landed with a splat. The economy created 138,000 jobs in May, which is pretty meager. And if you compare average job creation over the past year to the past six months, and then the past three, you see a clear deceleration.
But a bunch of mainstream economic analysts think this is actually good news: The unemployment rate fell to 4.3 percent, which suggests the economy is simply running out of people to employ. In that case, rates of job creation should naturally fall too. It's a sign we've accomplished what we set out to do — recover from the Great Recession and put America back to work.
"With the unemployment rate dropping to 4.3 percent, it really can't get much better," Brian Kropp, the HR practice leader at the consulting firm CEB, told The New York Times. Federal Reserve officials seem to be thinking the same thing and look poised to hike interest rates again in June.
So the "good news" interpretation of Friday's report is pretty widespread among people who matter.
It's also obviously wrong.
Let's begin with whether 4.3 percent unemployment is actually as good as it can get.
What the Fed is trying to guard against is rising inflation. In the early stages of a recovery, employers can simply draw in workers who dropped out of the labor force. But as the economy runs out of available workers, employers are forced to compete with one another for labor. That should spur an arms race of rising wages, which, depending on how fast productivity grows, could spiral into rising prices. So the Fed worries that if unemployment falls below a certain threshold, inflation will rise faster than it can contain.
But as recently as December the Fed projected the unemployment floor to be 4.8 percent. We're now at 4.3 percent and the inflation rate still remains well below the Fed's 2 percent target. In fact, during the late 1990s boom, the unemployment rate briefly touched 4 percent, and inflation reached 2 percent but didn't rise above it. (Keep in mind, after all the damage done by the Great Recession, the Fed should tolerate inflation a percentage point or two above target for a few years.)
Now, unemployment has fallen all the way to 2 percent before, during the peak of World War II mobilization, while inflation rose to 10 percent. So say you split the difference and decided 3 percent unemployment was as low as we could go. How many jobs could we add before we hit full employment? Going from 4.3 percent to 3 percent is a gap of 1.3 percentage points, and in a labor force of roughly 160 million people, that works out to another 2.1 million jobs.
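The arithmetic behind that 2.1 million figure is simple enough to sketch out. The labor-force size used here, roughly 160 million as of 2017, is my own rough assumption, not a number from the jobs report:

```python
# Back-of-envelope sketch of the full-employment jobs gap.
# Assumption: a U.S. labor force of roughly 160 million (circa 2017).
labor_force = 160_000_000

current_unemployment = 0.043  # May's unemployment rate
target_unemployment = 0.030   # the split-the-difference floor

# Jobs needed to pull unemployment from 4.3 percent down to 3 percent
jobs_gap = labor_force * (current_unemployment - target_unemployment)
print(f"{jobs_gap / 1_000_000:.1f} million jobs")  # about 2.1 million
```

The point of the sketch is just that the gap scales with the labor force: a bigger or smaller labor-force estimate moves the job total proportionally.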
Other measures, however, indicate there's even further to go.
As I've explained before, the unemployment rate only counts people who've looked for work in the last four weeks. Add them to the people who have a job and you get the labor force; the share of the adult population in that pool is the labor force participation rate. But people can be able and eager to work, yet so discouraged by the job market that they give up and stop looking, or look less than once a month. In that case, the unemployment rate understates how much job creation we have left to do.
To say the job market has been discouraging would be a grotesque understatement. After 2008, the number of discouraged workers jumped from five out of every 100 unemployed people to eight out of every 100, and it basically hasn't fallen since. And while the labor force's potential size is naturally shrinking as retirees grow as a share of the U.S. population, labor force participation is clearly lower than it should be, by anywhere from 1.3 to 2.7 percentage points.
Closing that gap would mean adding between 3.3 million to 7.1 million new jobs.
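The same kind of arithmetic produces those totals, applied to the adult population rather than the labor force. The population base below, roughly 255 million civilian adults as of 2017, is my own assumption, and the exact job totals shift with whichever base you use, which is why this sketch lands near but not exactly on the cited range:

```python
# Back-of-envelope sketch of the participation-gap job totals.
# Assumption: a civilian adult population of roughly 255 million (circa 2017).
adult_population = 255_000_000

# The participation shortfall cited above: 1.3 to 2.7 percentage points.
low_gap, high_gap = 0.013, 0.027

low_jobs = adult_population * low_gap
high_jobs = adult_population * high_gap
print(f"{low_jobs / 1e6:.1f} to {high_jobs / 1e6:.1f} million jobs")
```

With these assumptions the sketch gives roughly 3.3 to 6.9 million jobs, in the same ballpark as the 3.3 million to 7.1 million figure above.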
There are other indicators we can look to as well.
The prime age employment ratio, for instance, measures the percentage of Americans aged 25 to 54 who have a job. It's a useful metric because it can't be thrown off by demographic shifts, like how many people are retired or in school. It reached 81.9 percent in 2000, and there's absolutely no reason it shouldn't be able to return to that level. Yet it's only at 78.4 percent — lower than the worst part of the 2001 recession.
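The article doesn't convert the prime-age gap into a job count, but the same sketch shows roughly what closing it would mean. The prime-age population figure here, around 125 million Americans aged 25 to 54 as of 2017, is my own assumption:

```python
# Back-of-envelope sketch: restoring the 2000 prime-age employment ratio.
# Assumption: roughly 125 million Americans aged 25 to 54 (circa 2017).
prime_age_population = 125_000_000

peak_ratio = 0.819     # the 2000 peak cited above
current_ratio = 0.784  # the current reading

implied_jobs = prime_age_population * (peak_ratio - current_ratio)
print(f"roughly {implied_jobs / 1e6:.1f} million prime-age jobs")
```

Under these assumptions the gap comes to a bit over 4 million jobs, comfortably inside the range implied by the participation shortfall.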
Then there's the rate of wage growth. Given the Fed's underlying logic, in which full employment drives wage hikes that spiral into inflation, this is really the key indicator of full employment. Wages grew at 4 percent a year at the peak of the '90s boom. Right now, they're growing at 2.5 percent.
Fed officials sound genuinely flummoxed that inflation is staying low even as the unemployment rate keeps falling. But there's no mystery here. The unemployment rate is not always the most reliable measure of the job market's health. And the above data points reinforce that the unemployment rate paints an overly rosy picture.
Failing to recognize this is dangerous. If the Fed hikes interest rates too quickly this year, it will squash the recovery. It might seem crazy to characterize three hikes of 0.25 percentage points each as "too quick," but it's all relative to the underlying durability of the recovery.
The economy also does not treat everyone equally. Less-privileged Americans, including people who aren't white, people with disabilities, and people with criminal records, face barriers like discrimination that mean they're the last to feel the benefits of an economic recovery. The unemployment rate for African-Americans, for instance, is typically twice the national rate, though the gap shrinks as unemployment falls. So a 3 percent unemployment rate for the nation would mean 5 percent to 6 percent unemployment for black Americans.
That makes 3 percent nationally look less like an upper bound, and more like the bare minimum of what we should expect of ourselves.
This is not a recovery that's strong and enduring. This is a recovery that's hanging on by its fingernails. The refusal of many smart and powerful people to acknowledge this is bizarre and indefensible.