Disease modelers use math to try to provide a more precise picture of a certain situation or to predict how the situation will change, and have become critical in the world of infectious diseases. But the accuracy — or inaccuracy — of such models is increasingly a talking point.
ATLANTA — Last fall, when Martin Meltzer calculated that 1.4 million people might contract Ebola in West Africa, the world paid attention.
This was, he said, a worst-case scenario. But Meltzer is the most famous disease modeler for the nation’s pre-eminent public-health agency, the Centers for Disease Control and Prevention (CDC). His estimate was promoted at high-level international meetings. It rallied nations to step up their efforts to fight the disease.
But the estimate proved to be off.
Way, way off.
Like, 65 times worse than what ended up happening.
Some were not surprised. Meltzer has a lot of critics who say he and his CDC colleagues have a habit of willfully ignoring the complexities of disease outbreaks, resulting in estimates that overdramatize how bad an outbreak could get — estimates that may be skewed by politics. They say Meltzer and company also overestimate how much vaccine is needed and how beneficial it has been.
Overblown estimates can result in unnecessary government spending, they say, and may further erode trust in an agency that recently has seen its sterling reputation decline.
“Once we cry wolf, and our dire predictions turn out not to be the case, people lose confidence in public health,” said Aaron King, a University of Michigan researcher who in a recent journal article took Meltzer and others to task for making what he called avoidable mistakes.
Meltzer, 56, is unbowed. “I am not sorry,” he said.
He dismisses his peers’ more complicated calculations as out of touch with political necessities, telling a story about President Johnson in the 1960s. Johnson was listening to an economist talk about the uncertainty in his forecast and the reason a range of estimates made more sense than one specific figure. Johnson was unconvinced.
“Ranges are for cattle,” Johnson said, according to legend. “Give me a number.”
“All models are wrong”
What Meltzer does is not glamorous. He and others use mathematical calculations to try to provide a more precise picture of a certain situation, or to predict how the situation will change. They write equations on chalkboards, have small meetings to debate which data to use and sit at computers. Meltzer spends a lot of time with Excel spreadsheets.
But modelers have become critical in the world of infectious diseases.
Top CDC officials came to Meltzer last summer, when the epidemic was spiraling out of control and international health officials were trying to build a response. Meltzer was asked to project how bad things could get if nothing was done and to estimate how stepped-up aid could bend the curve of the epidemic.
Meltzer and his colleagues created a spreadsheet tool that projected uninterrupted exponential growth in two countries, Liberia and Sierra Leone.
His prediction — published last September — warned that West Africa could be on track to see 500,000 to 1.4 million Ebola cases within a few months if the world sat on its hands and let the epidemic blaze.
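The mechanics behind such a projection are simple: take the current case count, assume it keeps doubling at a fixed rate, and scale upward for underreporting. Below is a minimal sketch of that kind of calculation; the doubling time, starting count, and correction factor are illustrative assumptions, not the CDC's actual inputs.

```python
from math import ceil

def project_cases(current_cases: float, doubling_time_days: float,
                  horizon_days: int, correction_factor: float = 1.0) -> int:
    """Project cumulative cases under uninterrupted exponential growth.

    correction_factor scales reported cases upward to account for
    underreporting -- a common modeling assumption; the factor the
    CDC actually used is not given here.
    """
    doublings = horizon_days / doubling_time_days
    return ceil(current_cases * correction_factor * 2 ** doublings)

# Illustrative numbers only: 8,000 cases doubling every 20 days,
# projected four months out with no correction applied.
print(project_cases(8000, 20, 120))  # -> 512000
```

With numbers in that neighborhood, a few months of unchecked doubling turns tens of thousands of cases into hundreds of thousands, which is why worst-case exponential projections produce such arresting figures.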
About 21,000 cases materialized by mid-January — a terrible toll, but just a tiny fraction of the caseload Meltzer and his CDC colleagues warned about.
No modeler claims to be 100 percent correct. Indeed, modelers have a saying: “All models are wrong, but some are useful.”
Did Meltzer blow it? Many say no. He and his colleagues clearly stated they were providing a worst-case scenario of how bad things could get. They also predicted a far lower number of cases if more help was sent, which was happening when the model estimates were released.
But the worst-case figures got the most attention. The media focused on them in headlines. Health officials highlighted them in their push to get more money and staffing devoted to the epidemic. And interestingly, those are the numbers health officials describe as the most successful part of Meltzer’s prediction paper.
“I think it galvanized countries — and people — to put in more effort” into fighting the epidemic, said Dr. Keiji Fukuda, formerly a colleague of Meltzer’s at CDC who is now assistant director-general of the World Health Organization (WHO).
Columbia University’s Jeffrey Shaman, a modeling leader, invoked the perception that existed when Meltzer was given his assignment. As far as Ebola epidemics go: “We’d never seen anything like this before. This thing looked like AIDS on steroids,” he said.
Finding the right approach
The CDC is supposed to prepare the United States for the worst, so it makes sense for CDC modelers to explore extreme scenarios. If Meltzer’s estimates push policymakers to bolster public-health defenses, it’s all to the greater good, some say.
Others think the result corrupts both science and politics.
“Public-health officials are well aware that their statistics get used — and misused — to justify an increase in their funding” or to bolster vaccination campaigns and other efforts, said Peter Doshi, assistant professor at the University of Maryland School of Pharmacy.
Modeling — so poorly understood by the public, the media and many people in public health — provides an opportunity to bend numbers to support goals, he said.
Said David Ozonoff, a Boston University environmental health professor: “The way risk assessment is done in this country is the policymakers shoot the arrow and the risk assessors paint a target around it. There’s a flavor of this with modeling, too. If you say the purpose (of a modeling estimate) is motivational, that’s another way of saying it’s not scientific.”
Some say more of a separation between CDC administrators and the modelers might engender more trust in the numbers the agency uses. Perhaps an outside agency — a National Institutes of Health (NIH) division on public health, if one were created — could do the modeling and report their findings to CDC, said Lone Simonsen, a research professor at George Washington University who formerly worked at the CDC and NIH.
More immediately, the CDC could increase its collaboration with top academic modelers, she added.
Meltzer is not interested. He is wary of proposals for greater collaboration or reliance on nonagency modelers, especially during emergency situations. And more sophisticated models do not interest him.
“Accuracy for the sake of accuracy is merely interesting,” he said. “And interesting is not good enough.”