PERSONAL ESSAY

I have developed disease models. Here's why COVID-19 projections can seem inconsistent

A complex new problem requires a more complex model than our worn intuitions about the flu

Published May 25, 2020 8:00AM (EDT)

"This is your brain on Coronavirus" (Illustration by Salon/Getty Images/University of Washington)
"This is your brain on Coronavirus" (Illustration by Salon/Getty Images/University of Washington)

My house in the foothills of the Berkshires in rural Massachusetts is set back from a busy state highway. Every day, as I sit in my office mining large databases for clues about diseases, their treatments and outcomes, I hear the rumble of engines, the sirens of emergency vehicles, an occasional beeping of a horn. Some nights the silence is slashed by the whine of an eighteen-wheeler's brakes as the driver tests them before starting his descent into the valley.

On my way back from one of my walks, I sometimes find myself on the opposite side of this road. I understand the stark trade-off of getting myself to the other side: Do it carefully and get home in one piece, or become roadkill. Because I've been crossing this highway for nearly two decades, this risk calculation is so well honed that I can take into account the distance and velocity of a car careening toward me to weigh the likelihood of a successful crossing. This is a model: my past experience informing a prediction about my future.

We all use many such models on any given day. They are intuitive and simple and we hardly think about them. But introduce some modicum of complexity, as in the case of COVID-19, and the populace takes up arms. So let me demystify them, because once you get a look under the hood you may realize that the machinery is more like a John Deere Waterloo Boy than the space shuttle.

Despite their ubiquity, we have a conflicted relationship with models. It seems that when there is no politically motivated war on them, we don't object to their complexity. Most of us are avid consumers of weather forecasts, those monumentally complicated and largely obscure processes whose shockingly accurate results help us decide when to wear snow boots or sandals, which in New England may happen on the same day. In contrast, climate science — data-heavy and ever-evolving — engenders skepticism, as does COVID-19 modeling. I suspect it's because those with political and financial agendas fear that the authority of their dogma doesn't stand a chance when pitted against the transparency of science, so the only viable solution is to discredit the science and its practitioners. But you cannot discredit something so willing to observe and adapt. And you cannot discredit truth, or its closest approximation, in the eyes of people armed with understanding.

But first, a little more about our everyday tasks, since they are the seeds from which our predictive thinking sprouts.

Have you ever found yourself at the post office but couldn't quite figure out how you got there? We hit the snooze button, brush our teeth, get dressed and make coffee with minimal deliberation. That's because these are all familiar chores that can be executed almost automatically. And even though we perform them by rote, there are many decisions implicit in each of these actions — decisions we do not agonize over, decisions we do not need to articulate to ourselves with clarity and transparency, but decisions nevertheless, all relying on past experiences to predict their consequences.

For example, how many more times can I hit snooze? Should I use a brush or a comb to deal with my unruly hair? Should I wear a sweater or is a T-shirt enough? There are layers of decision nodes in each of these choices, laden with both risks and benefits. We have done all this before, and virtually no uncertainty remains about the outcomes of choosing one or the other. Snooze one too many times, and you may not get that promotion. Choose to skip brushing your teeth, and the boss may not even let you into her office. It's as much a done deal as anything that hasn't happened yet can be.

We teach our kids that actions have consequences. Learning the exact consequence of squishing a hornet with their hands creates a model in their minds to tuck away for a reflexive future decision.   

Daniel Kahneman — one of the fathers of the field of behavioral economics and a Nobel laureate — and his colleagues called these "heuristics." I used to find the word "heuristic" scary and impenetrable, until I began to understand it. In his popular book "Thinking, Fast and Slow," Kahneman translates this idea into what amounts to shortcuts we all use to predict the repercussions of our common decisions and actions. Those shortcuts are based on models our brains create, almost without our notice.

My evaluation of crossing the road is one such heuristic, something my so-called reptilian brain does automatically. We take it for granted that the sun will come up in the morning and set again at night, because this is how it's been since before our ancestors jumped out of trees and into caves. Or that the whine from the road in the night is almost certainly from the brakes of a tractor trailer and not a T. rex vocalizing on its way over to chomp me and my family. Because we have so much experience with mundane tasks and events, we have evolved not to waste valuable mental space overthinking them, or even being explicit about how we know what we know.

This approach is all well and good for the quotidian — the teeth will most likely be brushed, the coffee made — but it has its pitfalls when we rely on it in situations of greater uncertainty.

Cue up something new and complex — say, a pandemic. There is much we know about how microorganisms behave, and specifically how coronaviruses behave. Few people dispute that infectious agents are, well, infectious, and can be transmitted from person to person. We are all familiar with the common cold and the more serious annual flu, and know roughly what to expect from each. But when a new virus makes an appearance, our mental models of what to expect fall apart. And yet we find it hard to kick them to the curb. After all, we are addicted to this fast thinking. But in a novel viral pandemic, especially a deadly one, we need more specific information than the general understanding that "people will get sick; some will die" in order to build an effective plan to minimize the pain of unnecessary infection and death, and the impact on the healthcare system, the economy, and our very way of life.

This is where a complex new problem requires a more complex model than our worn intuitions about the flu. Such modeling relies on all the information we can harvest from various sources about the virus itself, about its interactions with our human bodies, and about interactions of those bodies with one another. The magical part that processes and shapes all of this information into usable projections — from expected volume of infections to hospitalizations to deaths to potential economic impacts — is no magic at all, but math. In fact, much of it is nothing more than simple arithmetic any middle schooler could grasp.
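To make that concrete, here is a toy sketch of the kind of arithmetic I mean. Every number in it is invented for illustration; no real model is this crude, but the compounding multiplication at its core is the same:

    # A toy illustration of the arithmetic behind epidemic projections.
    # Every number here is hypothetical, chosen only to show the mechanics.
    infected_now = 100          # people currently infectious (assumed)
    reproduction_number = 2.5   # new infections caused by each case (assumed)
    generations = 4             # rounds of transmission to project

    for g in range(1, generations + 1):
        infected_now = infected_now * reproduction_number
        print(f"after generation {g}: about {int(infected_now)} infections")

Real models layer on recoveries, hospital capacity, and much else, but they are assembled from pieces no more exotic than this.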

One handy feature of a good model is the ability to feed different hypothetical scenarios into it — the "what ifs" that can help us identify viable mitigation strategies such as social distancing, temporary lockdowns or quarantines, and how they may affect viral spread and its consequences. Ideally, all of these inputs should be gleaned from rigorous scientific studies. But we cannot wait for perfect information in the midst of a pandemic, so modelers make the best guesses they can, based on the experience of virologists, doctors, epidemiologists, healthcare policy experts, economists and other relevant experts. Together these inputs, or assumptions, are the fuel for the engine of the model, and are critical to getting useful and accurate estimates on the back end of the mathematical machinery.
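In its most stripped-down form, a "what if" is just that same arithmetic run twice, once under each assumption. The reproduction numbers below are made up to show the mechanics, not estimates for the real virus:

    # Toy "what if" comparison: the same compounding arithmetic run under two
    # hypothetical assumptions about how much distancing slows transmission.
    def project(initial_cases, reproduction_number, generations):
        cases = initial_cases
        for _ in range(generations):
            cases *= reproduction_number
        return cases

    no_mitigation = project(100, 2.5, 6)     # assumed spread without mitigation
    with_distancing = project(100, 1.1, 6)   # assumed spread with distancing

    print(f"no mitigation:   about {int(no_mitigation):,} cases")
    print(f"with distancing: about {int(with_distancing):,} cases")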

But even small errors in assumptions can get magnified in the resulting estimates. To bracket such errors, model builders factor in uncertainty in their inputs, which is expressed as those clouds of confidence ranges around the pinpoints and lines you see on graphs. While the press favors the categorical singularity of a single number in its reporting, the clouds serve as a humble acknowledgement that human beings cannot foretell the future exactly, no matter what soothsayers and clairvoyants would have you believe.
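Continuing the toy example, here is how uncertainty in a single input turns one number into a range. The interval I feed it is invented; a real model would draw its ranges from data and expert judgment:

    # Toy illustration of an uncertainty band: run the projection many times,
    # each time drawing the reproduction number from a hypothetical range.
    import random

    def project(initial_cases, reproduction_number, generations):
        cases = initial_cases
        for _ in range(generations):
            cases *= reproduction_number
        return cases

    random.seed(0)
    runs = sorted(project(100, random.uniform(1.8, 3.0), 6) for _ in range(1000))

    low, high = runs[25], runs[974]   # roughly the middle 95% of simulated outcomes
    print(f"projected cases after 6 generations: {int(low):,} to {int(high):,}")

That spread is what the shaded cloud on a graph is trying to convey.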

As we cobble together an admittedly imperfect starting point predicated on the available information and many educated guesses, we also clarify what solid data we need to improve our predictions. In the case of a pandemic, we are interested in refining its probable arc in order to help develop effective interventions to curtail it. Such data include who contracts the virus, how and where they contract it, what fraction of those who contract it ends up sick, and what fraction dies. For these inputs we need testing, contact tracing and accurate reporting from hospitals, nursing homes and funeral homes. We need this information to flow from towns to counties to states to the federal agencies, which can then act as central repositories of the most current data.

Without this we have mayhem; it's every man for himself. And in that case, don't blame the model, since it's only as good as the information it's fed. Not having all of the data is like crossing my busy state road with my eyes closed and earplugs in. You do the math.

Don't let the constant stream of new or differing projections turn you off: different models a) may be based on different assumptions, and b) are updated to take into account the present and the near future as they slide into our past. The best, most robust and stable models are collaborative and transparent and based on as much information as exists. It's those messy collaborations, modifications, and debates that finally yield a product that comes closest to reflecting reality.

And even this product remains a work in progress, subject to change as we learn more. That is the process of science, a constant spiraling around the truth, getting closer and closer, course-correcting, and continuing to spiral along.

Let's return to heuristics. The way I have translated the word to myself is as a quick impression, a habitual way of evaluating something, like the danger presented by an errant wasp who took a wrong turn into my living room and is now whirring frantically in its search for an exit. This simple framing has made the word less scary, as knowledge is apt to do. I know better than to expect consistency from my fellow Homo sapiens. But perhaps just knowing that what's under the hood of scientific predictions is not some dark magic, but rather common tools we use in our everyday life, will make people less apprehensive, especially if they have the option of observing how these engines create actionable information.

It's mid-May, and we had a frost last night here in the hills. According to the weather forecast, in ten days the temperature will soar into the 80s. Such swings are wholly consistent with my understanding of climate change models, no matter how much I want to wish it away. At the same time, the leaves are unfurling, the grass is tall and vibrant and dotted with dandelions, and soon it will be time to plant the garden. I know this because the light signals to some deep ancestral parts of my brain that spring is here, despite the crazy temperatures. It's a model I have built over the years of living in New England, a model that is likely to outlive me, even if some of the inputs are sometimes hard to believe.


By Marya Zilberberg

Marya Zilberberg practiced as an ICU doctor, and now does health services research focused on infectious diseases in hospitalized patients. She lives in the foothills of the Berkshire Mountains in Western Massachusetts. Her work has appeared or is forthcoming in Tablet Magazine, Longreads, and Hippocampus, among others. She tweets as @murzee.
