In an op-ed in the Wall Street Journal, two consecutive commissioners of the Bureau of Labor Statistics make a plea for increased funding for the agency so that it can calculate the rate of unemployment accurately, or at least reasonably so:
We were appointed as commissioner of the Bureau of Labor Statistics by Presidents Trump and Obama, respectively. We ran the BLS for a combined eight years. Today, we’re raising an alarm: Perhaps the most vital indicator the agency uses to understand our nation’s economy—the U.S. monthly unemployment rate—is in imminent danger.
Once a month (usually on the first Friday), policymakers and financial traders react to the so-called Jobs Day report from the BLS, which estimates the monthly unemployment rate. Unfortunately, this nonpartisan fact-finding agency has been underfunded for more than a decade.
The BLS needs adequate funding to help run the Current Population Survey, which feeds into Jobs Day reports. At current funding levels, cuts to the survey’s sample size in 2025 will be unavoidable. That would endanger the quality and variety of estimates the BLS can produce. As technology evolves, people are less likely to answer calls from an unfamiliar number or open the door for a stranger. When the pandemic hit, without safe ways to conduct in-person surveys, response rates dropped to all-time lows. They’re still trending down.
I found this startling:
Because BLS began publishing monthly unemployment rates for Native Americans only two years ago, policymakers didn’t know until 2022 that the unemployment rate spiked to nearly 29% for Native Americans in 2020—about double the national rate. Even now, BLS can’t break down unemployment rates for specific racial and ethnic minorities in many states due to funding shortages. Veterans, teens, seniors, people with disabilities, women of color and other population subsets won’t be perceived or appropriately supported if they disappear from future Jobs Day reports.
I used to follow the BLS’s monthly Employment Situation report faithfully. I gave up when I recognized how phony it is. The commissioners cite some of the reasons it’s phony but not the most important reason. The statistics reported include a “fudge factor” based on something called the “birth-death ratio” (the rate at which new businesses are formed and old ones close), and that fudge factor is not empirically based. When the fudge factor exceeds the actual numbers being reported, the statistic becomes purely political.
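To make that concrete, here’s a toy arithmetic sketch. The figures are invented for illustration only, not actual BLS numbers:

```python
# Invented numbers for illustration only -- not actual BLS figures.
headline_change = 150_000          # published month-over-month payroll change
birth_death_adjustment = 200_000   # model-based add-on for firms not yet reporting

# What the sampled payrolls showed before the adjustment was applied:
surveyed_change = headline_change - birth_death_adjustment
print(surveyed_change)  # -50000: the sign of the headline number comes from the model
```

When the adjustment is larger than the reported change, the sign of the headline number is coming from the model rather than from the sample.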
Furthermore, there are two important sources of information about employment: the employer report and the household report. The employer report is derived from payrolls reported by employers, while the household report is based on a survey of households. There is a notable gap between the two reports (Mish Shedlock reports on this occasionally). The greater the gap, the less reliable the BLS’s reporting about unemployment becomes.
I also wonder how one calculates the rate of unemployment in a service economy. Is a sole proprietor or sole practitioner without clients or customers employed or unemployed? They are not deemed unemployed by the BLS. The more “gig workers” (however you characterize them) the less relevant the unemployment rate.
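For reference, the headline rate itself is a simple ratio over the household survey’s labor-force classifications. Roughly, and this is a simplification of the CPS definitions with invented numbers, not the BLS’s actual estimation code:

```python
# Rough CPS-style classification: anyone who did any work for pay or profit in
# the reference week counts as employed (including the self-employed, even with
# no customers that week); "unemployed" means not working but actively looking
# and available; everyone else is out of the labor force.
def unemployment_rate(employed: int, unemployed: int) -> float:
    """Unemployment rate (percent) = unemployed / (employed + unemployed)."""
    labor_force = employed + unemployed
    return 100.0 * unemployed / labor_force

# Invented round numbers, not actual BLS figures:
print(round(unemployment_rate(employed=160_000_000, unemployed=6_000_000), 1))  # 3.6
```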
In a country of over 300 million people you are never going to get exact numbers on much of anything. So I think if you carry your argument to its conclusion, it means you shouldn’t believe national statistics about anything. Just make up what you want to believe. I am sure Drew will join you.
However, exact numbers are seldom that important. The trend is more important. You need to use household surveys since employer reports miss large segments of people. You can make the survey larger, hoping to make it more accurate, but that costs more. The birth-death adjustment isn’t just made up; it’s based on past numbers. It’s probably less accurate during turns in the business cycle, but that’s why you use both sets of numbers.
Interesting that you claim it’s political. Any evidence of that? Other than whining?
Steve
If it is not empirical, it is based on value judgments, and that is inherently political. I’m using the classical definition of politics. I don’t mean partisan politics, although the extent to which partisan politics is based on relative preferences is frequently ignored.
Dave Schuler: The commissioners cite some of the reasons it’s phony but not the most important reason. The statistics reported include a “fudge factor” based on something called the “birth-death ratio” (the rate at which new businesses are formed and old ones close), and that fudge factor is not empirically based.
The birth-death ratio is empirically based. New businesses don’t immediately report statistics, so to adjust for the lag, they use the last five years as a statistical basis utilizing an autoregressive integrated moving average (ARIMA). That the estimate is subject to error—as all estimates are—doesn’t make it “phony”.
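As a rough sketch of the general idea (this is not the BLS’s actual model; the series, the ARIMA order, and the numbers are all invented for illustration), the adjustment amounts to fitting a time-series model to past net business births and projecting it forward for the months where new-firm reporting hasn’t caught up yet:

```python
# Minimal sketch, assuming pandas and statsmodels are available; all numbers invented.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(42)
# Hypothetical monthly employment from net business births (thousands), ~5 years
history = pd.Series(
    100 + rng.normal(0, 10, size=60),
    index=pd.date_range("2019-01-01", periods=60, freq="MS"),
)

model = ARIMA(history, order=(1, 1, 1))   # order chosen purely for illustration
fit = model.fit()
print(fit.forecast(steps=1))              # projected adjustment for the next month
```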
Dave Schuler: The more “gig workers” (however you characterize them) the less relevant the unemployment rate.
BLS surveys have had difficulty measuring the gig economy, but the BLS has implemented changes to measure that aspect of the economy more accurately.
It was once, but it isn’t any longer. The rate of new company formation has been fluctuating wildly over the last 5-10 years and that is not what the BDR assumes.
Dave Schuler: The rate of new company formation has been fluctuating wildly over the last 5-10 years and that is not what the BDR assumes.
You are conflating error margins with “not empirically based” and with “phony”. More specifically:
1. Rate of new company formation is only part of the equation.
2. Lag in reporting of new company formation is only part of determining the rate of new company formation.
3. Using the last five years of data (via ARIMA) to adjust the rate of new company formation isn’t perfect by any means, but it provides more information than dismissing the figure as “phony” or picking a number at random: the adjusted value is likely closer to the actual value than the unadjusted number, and the unadjusted number is likely closer to the actual value than a random guess.
And as steve pointed out, “The trend is {usually} more important.”
Maybe a bigger metaquestion.
What if survey-based data collection implicitly assumes a high-trust society in order to be accurate?
And what if the US is evolving away from being a high-trust society? What then? How many of the different pieces of data that we and the government use to inform decisions would be affected?