The stability of the federal government’s system for producing statistics, which the United States relies on to understand its people and economy, is at risk because of budget concerns, officials and data users warn.
Much of that concern is focused on the new Trump administration and promises by Republican lawmakers to cut public spending, which could further affect data production.
In recent months, budget shortfalls and short-term funding constraints have led the Bureau of Economic Analysis, known for its tracking of gross domestic product, to discontinue some data sets, and have prompted the Bureau of Labor Statistics to propose reducing the number of participants surveyed to produce the monthly jobs report. A “multi-year lack of funding” has also hurt efforts to modernize software and other technology the BLS needs to properly publish its data, concluded a report by a panel of experts tasked with reviewing several botched data releases last year.
Long-term funding issues are also straining the Census Bureau, which conducts many of the federal government’s surveys and is preparing for the 2030 head count that is expected to be used to redistribute political representation and trillions of dollars in public funding across the country. Some census watchers are concerned that budgetary pressures could force the bureau to cancel some of its field tests for the upcoming count, as it did with 2020 census tests intended to improve counts in Spanish-speaking communities, rural areas and on tribal reservations.
Although the statistical agencies have not been named specifically, some advocates worry that calls by President Trump and the new Republican-controlled Congress to shrink the federal workforce could put the integrity of the nation’s data further at risk.
“We’re getting to the bone now,” says Erica Groshen, a former BLS commissioner appointed by former President Barack Obama. “So even if [the funding situation is] exactly the same, the impact is going to be worse” due to continued challenges with producing reliable data.
Why today’s government data is like “crumbling infrastructure”
Like roads and bridges, the federal statistical system is indispensable but generally neglected, according to its supporters. Groshen likens its current state to “crumbling infrastructure” that still does its job but with “visible cracks.”
“You keep filling the potholes on top, but you’re not repointing the mortar,” says Groshen, now a senior economic adviser at the Cornell University School of Industrial and Labor Relations. “You’re not redoing the understructure of the bridge. You’re not developing the new bridge that will replace the old bridge once you’ve figured out that its useful life is coming to an end.”
Although advocates and officials say government data remains reliable for now, they indicate troubling conditions ahead.
“The economy isn’t getting simpler to measure, right? Things are getting more complex. You know, there are a lot of new things that we have to learn to measure,” said Vipin Arora, the director of BEA, during a meeting last month of the Council of Professional Associations on Federal Statistics.
Statistical agencies also face a crisis confronting the broader survey and polling industry: a shrinking share of people willing to answer questions.
To counter falling survey response rates, statistical agencies have experimented with relying more on existing government data sets and other administrative records to help take stock of the country’s population and economy. But it takes time and money for agencies to vet those records and ensure the quality of government statistics isn’t compromised, says Nancy Potok, a former chief statistician at the White House’s Office of Management and Budget who previously served as a deputy director at the Census Bureau.
“Without the money, they are sort of stuck in the old model, which is becoming more and more expensive and less viable. And this will ultimately affect the quality of the statistics,” Potok warns.
Potok says she is currently working on an update to an American Statistical Association report released last year that sounded the alarm about the risks facing the country’s data. That report concluded that the main threats to statistical agencies include declining public participation in surveys, too few legal protections to shield data from political interference, and neglect from appropriators in Congress.
“What we found was very concerning because agencies overall had lost about 14% of their purchasing power over the last 15 years. And the rest of what’s called non-defense discretionary spending increased by 16% at the same time,” says Potok. “And yet, the mandates and workload and challenges of federal statistical agencies have increased significantly over the same period.”
Why advocates see a “wise investment” in funding government data
With the upcoming government shutdown deadline in March, Potok says she sees an opportunity to make an argument for more support for the statistical system.
“If you’re really looking to cut the federal budget, you don’t want to cut the things that work. You want to cut the things that don’t work,” Potok says. “And it’s not a huge investment relative to the size of the federal government to put money into these agencies to be able to provide this information. It’s actually a wise investment.”
William Beach, a former labor statistics commissioner who was appointed during the first Trump administration, agrees.
“The statistical system does need more money. It needs to modernize; the surveys, in particular, are falling apart. And if we did that, over the years we would probably spend less money on the statistical system and get a better product,” says Beach, who is now a senior economics fellow at the Economic Policy Innovation Center, a conservative think tank.
How the 2030 Census and Monthly Jobs Report Could Be Affected
For now, many statistical agency heads still face difficult choices.
Some census observers are wary of how the temporary hiring freeze that Trump ordered could affect the next phase of work for a major 2030 census test coming next year, which will involve thousands of temporary workers.
Terri Ann Lowenthal, a census consultant who served as staff director of a former House oversight subcommittee on the census, said the hiring freeze “could significantly disrupt” preparations for the test, which is designed in part to help the bureau improve its counts of people of color, young children, renters and other historically undercounted populations.
“A census test, like the census itself, must be conducted on a strict schedule,” Lowenthal said in a statement. “Failure to test new methods and operations that could contain costs and improve accuracy could put a successful census — one that counts all communities equally — at risk.”
The Census Bureau’s public information office declined NPR’s interview request and did not respond to a written question about the impact of the hiring freeze.
Users of economic data like Algernon Austin, director of race and economic justice at the Center for Economic and Policy Research, a left-leaning think tank, worry about possible reductions in the sample size of the Current Population Survey, which produces the monthly employment data.
“If we really want to address racial equity issues, we really need larger samples, not smaller samples,” Austin says, noting that having fewer people participate in the survey makes it difficult, if not impossible, to publish detailed statistics broken down by race and geography.
If the government were to scale back the already limited demographic breakdowns in the employment figures, Austin says researchers like him would be left scrambling.
“We may, with considerable effort, be able to do just a small piece of the work that needs to be done, but we may just have to throw up our hands and say, ‘We don’t know what’s going on in this state or this metropolitan area because we don’t have reliable data,’” Austin adds.
Edited by Benjamin Swasey