Statisticians fear Trump White House will manipulate figures to fit narrative | US news | The Guardian
Experts, including former chief statistician of the US and outgoing head of Bureau of Labor Statistics, see threats to system of public, accurate data
US statisticians are concerned that Donald Trump’s administration might suppress or manipulate public statistics that don’t fit his narrative of the truth, the Guardian has learned. In a series of interviews, individuals who have recently left high-level positions at federal statistical agencies expressed worry that the administration may stop collecting and publishing data on subjects such as abortion, racial inequality and poverty.
“We should all be starting from the same numbers. I think that’s a fear that many of us have at this point – it’s that picking and choosing your numbers to suit your politics is not the way that we ought to be doing it,” said Katherine Wallman, chief statistician of the United States from 1992 to 3 January this year.
Wallman, like other statisticians the Guardian spoke to, believes that the system of accurate, publicly available government data is currently under threat.
In a press conference on Monday, the White House spokesman, Sean Spicer, told journalists: “The president, he’s not focused on statistics as much as he is on whether or not the American people are doing better as a whole.” But without statistics, measuring how “the American people are doing” is simply a matter of opinion.
It’s not the first time the president’s team has questioned public data. In August, the then presidential candidate described the Bureau of Labor Statistics (BLS) unemployment numbers as “phoney”, claiming: “The 5% figure is one of the biggest hoaxes in American modern politics.” In the same speech, Trump suggested alternative data, adding: “The number’s probably 28, 29, as high as 35. In fact, I even heard recently 42%.”
Speaking on the phone in the first days of her retirement, Wallman explained that many new administrations “don’t necessarily understand the autonomy of the statistical agencies”. Other statisticians have described a range of ways in which public data might be influenced by the Trump administration.
The most frequently cited is simply the defunding of specific statistical programs. This is already under way. Two Republican-sponsored bills are attempting to nullify a 2015 housing regulation aimed at addressing racial segregation in cities such as Baltimore and Chicago, simply by halting data collection. The new bills, which together are titled the Local Zoning Decisions Protection Act of 2017, state that “no Federal funds may be used to design, build, maintain, utilize, or provide access to a Federal database of geospatial information on community racial disparities”.
Defunding public data is also an effective tool in silencing activists. As a former Census Bureau employee explained: “You can’t talk about discrimination if there’s no data there to support it.” The employee, like most of the statisticians the Guardian spoke with, asked not to be named for fear that they would not be able to work for federal agencies in the future.
But many statisticians believe that economic indicators are likely to be protected from possible manipulation by the administration because the business community relies so heavily on accurate numbers. Indeed, Erica Groshen, the outgoing head of the BLS, expressed the most optimism of any of the statisticians interviewed.
Groshen had already handed in her government ID and cleared out her desk when she spoke to the Guardian on Friday. As the only political appointee in the BLS, it was Groshen’s last day in her four-year job. When asked about some of the risks cited by other statisticians – the increasing role of the private sector in public data collection, changes in methodologies which could make numbers appear more favorable – Groshen was confident in repeatedly asserting “the BLS is independent of the administration”.
When asked if she had spoken to Trump’s new administration, Groshen said: “We have had a number of conversations with him and them in different venues and they have asked questions.” She continued: “One I remembered was whether we were part of the normal budgetary process. Then there were other questions about what the release schedule looked like.”
Groshen said her biggest worry is about cuts to the department: “We had a major cut in 2013; we hadn’t been doing very well [financially] before that.” This year, the department expects to be underfunded by about $30m on a budget of $609m.
Current coping mechanisms for dealing with budget shortfalls are dangerous, according to Groshen. “You can keep positions unfilled for a while but when you do that, you risk two things: one is obviously irrelevance because you’re not keeping up with the times and the other is you’re raising the risk of some kind of operational failure.” Isaac Shapiro, a fellow at the nonpartisan Center on Budget and Policy Priorities, said data such as the supplemental poverty measure, used to assess safety net programs, is among the statistics most at risk of inaccuracy if budget cuts cause staff to take cheap shortcuts in data collection.
Funding isn’t the only concern, though. One economist who worked at a federal statistical agency from 2009 until 2016, and who also asked not to be named, explained: “The administration can and probably will start adding onerous requirements for vetting before information is released to the public.”
Wallman has thoughts about what that vetting could look like. In her 25 years as chief statistician, Wallman said she was not aware of efforts by politicians to change the data, “but then there’s the interpretation and presentation of the data and I am aware of occasions where policy folks have thought it was appropriate to change the things that were featured in a press release, or take out specific bullets that they thought were unattractive, or change the timing of the release because it might be inconvenient in terms of a policy that a cabinet official wished to announce.” Wallman added: “I am concerned about that over the next week.”
There are also concerns about the increasing involvement of the private sector in public data collection. Kenneth Prewitt, former director of the US Census Bureau and currently professor of public affairs at Columbia University, posited a scenario in which a private company was responsible for population counts and its methodology was not publicly available. Then, Prewitt explains, it’s possible that the company would use less rigorous methods that could “count some people twice and others not at all. And if you can leave people out of the census selectively, you can actually affect, eventually, the drawing of legislative boundaries at the local level, the state level and of course at the federal level.”
According to Prewitt, there are simple adjustments to the way data is calculated that affect the final numbers.
“What people do not understand,” he explained, is that “if you control the denominator, you control everything.”
Denominators are used in virtually every single public statistic. For example, the Census Bureau publishes data which shows that one in four US Hispanics lives in poverty (specifically, 11.2 million out of 48.2 million Hispanics in the US). To change that statistic, you can either change people’s lives or, more simply, you can change the way that you count who is and who is not Hispanic – then, the statistic can become one in three, one in 10, one in whatever.
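The arithmetic behind Prewitt’s point can be sketched in a few lines. This is a minimal illustration using the Census figures quoted above; the inflated “broader count” is an invented number, included only to show how redefining the denominator moves the headline rate while nobody’s circumstances change.

```python
# Census figures cited in the article
in_poverty = 11.2e6        # Hispanics living in poverty
official_count = 48.2e6    # official count of Hispanics (the denominator)

rate = in_poverty / official_count
print(f"Official rate: {rate:.0%}")  # ~23%, i.e. roughly one in four

# Same numerator, but a hypothetical broader definition of who counts
# as Hispanic inflates the denominator by half:
broader_count = official_count * 1.5
new_rate = in_poverty / broader_count
print(f"With inflated denominator: {new_rate:.0%}")  # ~15%, closer to one in six
```

Nothing about poverty itself changed between the two lines; only the definition of who is counted did, which is exactly why control of the denominator matters.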
As a former Census Bureau statistician explained, once you change the statistics, “you can write your own narrative. You can tell people how sick they are or how safe they are.”