Risk aversion is ruining science

Universities and grant institutions need to do better at rewarding failure to encourage scientific discovery

By Paul M. Sutter

Published May 3, 2022 8:45AM (EDT)

Broken beaker glass in science laboratory (Getty Images)

This article originally appeared on Undark.

Science is built on the boldly curious exploration of the natural world. Astounding leaps of imagination and insight — coupled with a laser-like focus on empiricism and experimentation — have brought forth countless insights into the workings of the universe we find ourselves in. But the culture that celebrates, supports, and rewards the audacious mental daring that is the hallmark of science is at risk of collapsing under a mountain of cautious, risk-averse, incurious advancement that seeks merely to win grants and peer approval.

I've encountered this problem myself. Years ago, after I completed my Ph.D. in physics, I began doing postdoctoral research studying cosmic voids, the vast regions of almost nothing that dominate the volume of the universe. A few collaborators and I were using voids to understand the evolution of the cosmos, and we were also fascinated by voids as objects in themselves. However, as I was applying for jobs beyond that postdoc, I was told multiple times by senior (and well-meaning) scientists that I should focus on something else. Something more mainstream. Something safer. (Today, thanks to the dogged determination of my collaborators, cosmic void analysis is a part of most major upcoming galaxy surveys.)

My experience was not singular. I've met many junior scientists who were given similar advice, and senior scientists — now that I number among their ranks — confide that their top priority is achieving deltas: physics jargon they use here to refer to tiny, incremental advances on their current research. They quietly concede that the tenure system, designed to give academics the freedom to safely explore new directions, rarely serves that purpose.

To be sure, some risk-aversion is to be expected in science. As research fields mature and scientists pick off more of the low-hanging fruit, the problems become harder, requiring more people and more resources to solve. It's also easy to fall into a trap of conformity. Graduate students work on the problems that their advisers find interesting, almost always probing a specific sub-problem of a much larger domain; junior scientists, under pressure to please the senior scientists who make grant and tenure decisions, opt for small, incremental advancements of existing knowledge over risky, high-payoff research lines; even senior researchers tend to choose research directions that their peers will approve of.

The realities of the current grant funding climate play a role, too. It's becoming increasingly difficult to get a federal grant. According to the National Science Foundation's annual merit review report, during most of the past decade the agency has funded around 20 percent of the research proposals it receives, down from roughly 30 percent in the 1990s. Two-thirds of all grant awards go to researchers who are more than 10 years beyond their Ph.D., and proposals from these senior researchers are accepted at a higher rate than those of junior scientists, by about 5 percentage points.

The fierce competition has set the stage for a cultural shift from "What problems am I interested in?" to "What problems are likely to be funded?" Without funding, a scientist's ability to do science is severely limited.

This culture of risk-aversion is putting science itself at risk. An incremental approach to scientific study — where large collaborations spend enormous amounts of money to refine existing knowledge to greater degrees of precision — may win grant funding in the short term because it's a sure bet, but it cannot be sustained in the long run. Eventually, policymakers and the public will lose interest in science and disconnect from what makes science so illuminating and engaging: namely, discovery.

To prevent science from becoming yet another bureaucracy that exists only to perpetuate itself, scientists have to begin by making changes at a cultural level.

The first step is to reward risk. We need to allow scientists to make mistakes — to explore interesting research directions and find nothing of value. This goes double for junior scientists. They need the freedom to use their fresh perspectives to find new research directions that senior scientists may have overlooked. We can reward risky bets by celebrating null results and non-results as much as we do great discoveries. After all, both paths lead to new knowledge, a fact that is not recognized nearly enough.

We can incentivize risk by hiring and promoting junior scientists who do something new, even if they weren't lucky enough to have it pay off. As long as an investigator shows sound intent, effort, and clarity of insight — the hallmarks of a great scientist — they should be considered for awards, positions, and prestige.

We can also reward risk in the grant proposal process. According to the NSF merit review report, the program officers who ultimately make the recommendation to award or deny a grant proposal are expected to consider, among other factors, "support for high-risk proposals with potential for transformative advances" when weighing their decisions. (That itself implies that riskiness is something to be managed, not celebrated.) Indeed, the agency even supports some funding mechanisms that are specially designed to encourage research that is high-risk and high-reward, the most prominent being the Early-concept Grants for Exploratory Research program, or EAGER. In the past decade, however, EAGER grants have amounted to only 1 to 2 percent of NSF's allocated research grant money. Why can't it be 5 percent? Or 50 percent? What's stopping us? A fear of failure?

But perhaps just as important, scientists need to manage the expectations of the public and policymakers. For several decades now, scientists and their advocates have played a dangerous public relations game, essentially billing science as an institution that can deliver society a guaranteed rate of return on a given investment. In reality, science is messy and full of mistakes and blind alleys. Discoverers of all kinds don't know what they're going to get until they go out and look for it themselves. We need to level with the taxpayer: scientists shouldn't be expected to always deliver promising new results. We need to celebrate — both internally in the scientific community and with the public at large — the so-called failures that also represent the growth of knowledge.

To give scientists the confidence and support they need to make truly great new discoveries, we need to get the public to see science as the daring act of inquisitiveness that it is. And that is a very, very risky thing, indeed.

