Normally you’d be better off looking at how someone like James Hamilton evaluates economic data, but I’ll take a stab at this one. In early 2008, when we were in danger of slipping into recession, I got interested in the unemployment rate as a leading indicator (it is usually viewed as a lagging indicator). I asked myself how much of an increase in unemployment during an economic expansion would be needed to signal that a new recession was underway, rather than just a temporary blip.
I noticed that during expansions the unemployment rate never seemed to rise more than 0.6% before falling again. That is, unless we were actually going into a recession. In that case unemployment rose much more, indeed at least 1.9% (in the mild 1980 recession). In between was a sort of donut hole, with almost no increases in unemployment of more than 0.6% that did not mean a recession was underway. (Thus the US doesn’t have mini-recessions, even though all our economic theories predict we should see them more often than regular recessions.) I say almost, because there was one exception: unemployment rose 0.8% during the 1959 nationwide steel strike, without any recession. But the US economy is now far more diversified, and such an event is now very unlikely; the share of employment in unionized industries like autos and steel has fallen sharply. Here’s what I infer:
- Increases in unemployment of more than 0.6% are almost always indicators of recession.
- Increases in unemployment of more than 0.6% are ALWAYS economically significant.
The second statement is slightly more definitive, because I consider the 1959 steel strike to be an economically meaningful event.
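The threshold rule in those two bullet points can be sketched as a simple classifier. This is just an illustration of the logic, not anything from an official source; the function name is mine, and the 0.6% and 1.9% thresholds come from the postwar observations described above:

```python
def classify_unemployment_rise(rise: float) -> str:
    """Classify a rise (in percentage points) in the unemployment rate
    from its expansion low, using the postwar US pattern described above.

    - Rises of 0.6 points or less have occurred repeatedly without recession.
    - Every postwar recession eventually raised unemployment by at least
      1.9 points (the mild 1980 recession).
    - The in-between "donut hole" is nearly empty: the lone exception is
      the 1959 steel strike (+0.8 points, no recession).
    """
    if rise <= 0.6:
        return "blip"               # within the normal expansion noise band
    return "recession signal"       # almost always means a recession is underway

# By March 2008 unemployment was up 0.7 points from its low:
print(classify_unemployment_rise(0.7))  # -> recession signal
print(classify_unemployment_rise(0.5))  # -> blip
```

The point of the donut hole is that the rule has essentially one false positive in 63 years of monthly data, which is what makes a 0.6%+ move informative.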
If I’m right, then any change in the unemployment rate of more than 0.6% over a relatively short period of time probably indicates that the actual unemployment rate changed—that it wasn’t all statistical noise.
Between November 2010 and January 2011 the unemployment rate fell from 9.8% to 9.0%. Let me emphasize that I strongly believe some of this was noise in the data, perhaps as much as 0.6% of the fall. Thus the actual rate may have fallen from, say, 9.5% to 9.3%. But I don’t think it was all noise; otherwise we would have seen previous episodes where the unemployment rate ticked up by more than 0.6% without triggering a recession. And over the past 63 years (more than 750 months) we just don’t see meaningless blips that large.
Even an actual drop of 0.1% per month would be significant; that is faster than we had in 2010, and even faster than the pace of recovery that most forecasters expect over the next 5 years. And remember that my argument suggests that 0.2% is the minimum plausible estimate for how much unemployment has actually improved; it may be a bit more.
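The arithmetic behind that minimum estimate is simple enough to spell out (the 0.6-point noise bound is the historical blip ceiling from above):

```python
# Back out the minimum "real" improvement implied by the noise bound.
observed_fall = 9.8 - 9.0   # Nov 2010 -> Jan 2011, in percentage points
max_noise = 0.6             # largest blip ever seen absent a recession (or strike)

min_real_fall = observed_fall - max_noise
print(round(min_real_fall, 1))      # -> 0.2 points over two months
print(round(min_real_fall / 2, 2))  # -> 0.1 points per month
```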
I think this is why bond prices fell and yields rose. The bond market knows that the payroll numbers are usually more reliable than the household survey figures from which the unemployment rate is derived. But they also know that both numbers are subject to error, especially with snowstorms disrupting data processing at some firms. I actually know someone who expects to soon get a job, but the actual hiring has been held up by the recent storms in Boston, which have created a work backlog. I think the bond market understands that while the payroll number is usually better, a change in unemployment that large is almost always economically significant.
BTW, my recession indicator worked pretty well in the 2008 recession. By December 2007 unemployment was up 0.6% from the low, and in March 2008 it was up 0.7%. Yes, the recession had already begun by March, but as late as mid-2008 some prominent forecasters still didn’t expect an outright recession.