The West is burning. A spate of wildfires in California killed 10 people on Monday, adding to a season in which 8.5 million acres of the western United States had already burned by late September. It’s the third-worst season on record.
Thousands of people have lost their homes to the flames, and cities across Montana and Idaho were so blanketed in smoke at times that residents were warned the air was unbreathable. In all, more than 30,000 firefighters have been dispatched to the front lines to stem the blazes.
The price tag for all this? More than $2 billion. And that’s just to contain the fires. Accounting for infrastructure damage and insurance payouts, this fire season will leave taxpayers with a bill likely to top $30 billion.
Meanwhile, wildfires are only getting worse across the West. It’s a foreboding trend: 2015 and 2012 were the only seasons with more acreage burned. Since the 1970s, the number of large fires greater than 10,000 acres has increased seven-fold, and the official fire season has lengthened from four to seven months. In the Pacific Northwest alone, the area burned by uncontrolled wildfires each summer has grown by 5,000 percent over the last 40 years.
The situation becomes even more untenable when the looming threat of climate change is taken into account. As the West has warmed, the transition from spring snow to summer melt has come on earlier each year, summers are hotter than they once were, and droughts are more frequent. These climatological shifts combined to turn forests into tinderboxes this summer, as a series of heatwaves and droughts in early summer dried out trees just months after one of the wettest winters on record. In fact, one research group estimates that half of the forest area burned in the past 30 years can be attributed to climate change.
But a spark by itself is not enough to burn millions of acres: a wildfire needs fuel. And that is precisely what the US’s ineffective approach to forest management has provided.
Preparing to fail
The US Forest Service approaches fire prevention with a tactic known as “fuels reduction” – essentially, thinning out forests so that any wildfires that do start cannot spread or intensify beyond firefighters’ ability to control them. This involves clearing dry brush that allows sparks to jump, cutting small trees that burn readily, and pruning low-hanging branches that enable fire to climb into forest canopies.
But while fuels reduction works in theory, its effectiveness is negated in practice by the sheer size of the West. The US Forest Service is responsible for more than 190 million acres of land, and fuels reduction efforts are targeted, tree-specific, and almost entirely manual. They are performed on an acre-by-acre basis, and they must be repeated every 10 years to deal with new forest growth. The result is that only a tiny fraction of forests categorized as “high-risk” – with little documentation of the logic behind that designation – sees fuels reduction. Additionally, it is impossible to predict whether these patches will coincide with the location of next summer’s heat waves, the primary driver of annual wildfire geography.
It turns out that only 1 percent of wildfires each year actually burn forest lands directly adjacent to areas where fuels reduction was carried out. That means that the more than $350 million spent annually on fuels reduction results in virtually no difference in the destructive capacity of wildfires.
At the same time, the outlook for the efficacy of fuels reduction is worsening thanks to the paradoxical way that the US government funds fire prevention. In contrast to other federal agencies like FEMA, which has money set aside for disasters like hurricanes, the Forest Service does not have a dedicated wildfire response budget. As wildfires have grown and become more expensive to fight over the last several decades, the agency has resorted to a practice known as “fire borrowing”: spending money originally earmarked for fire prevention to corral large fires and protect communities. The result is a vicious cycle wherein a particularly bad fire season in one region impacts the ability of the Forest Service to practice fuels reduction in another region – leaving that area more likely to burn out of control the following season. And as climate change increases the likelihood of large fires each summer, the cycle of bankrupting fire prevention is only going to get worse.
Fighting fire with fire
What if, instead of throwing enormous resources at preventing and – when that inevitably fails – fighting fires, we simply embraced wildfires? Wildfires are a natural part of the lifecycle of arid forests: while today’s fires may be larger and more intense than in the past, the total area burning each year is still less than that burned naturally and by Native Americans prior to European settlement across North America.
Given the expansion of communities, recreation, and forestry in wildfire-prone regions, returning to historical levels of burning is not a feasible solution. But adding more fire may be exactly what we need to create less destructive burns.
Prescribed burning, in which wildfires are deliberately set to targeted areas, has proven an effective and safe technique for reducing the intensity and spread of natural wildfires since the Forest Service began the practice in 1995. Yet it makes up less than 10 percent of the acreage that sees fuels reduction efforts annually.
These intentional fires are set during the wettest part of the year, when only the wood that is most liable to burn – the same wood that would enable a natural wildfire to intensify out of control during the summer – will ignite. In this way, prescribed burns achieve the same end goal as fuels reduction, but without the manual effort of combing through the entire West. And in contrast to natural summer wildfires, prescribed burns are highly controlled with firefighters on standby to ensure that the flames stay within a predesignated perimeter.
Part of the reason this hasn’t caught on is the combination of the human tendency toward short-term planning and the entrenched attitude within the US that all wildfires are bad. The risk of a prescribed burn, a certain fire, is perceived to be much higher than that of an uncertain natural wildfire – even though 99 percent of all prescribed burns remain within their predefined bounds while natural wildfires frequently consume entire communities. The immediate negative effects of smoke-filled air, charred forest, and short-term ecosystem damage are also prioritized over the potentially far worse effects of inaction. In addition, the Forest Service has faced an uphill battle communicating the benefits of prescribed burns because Americans have been taught from an early age that wildfires are bad for communities and the environment – a phenomenon known as the “Smokey Bear effect.”
The same short-term focus, encoded in federal law, has also made conducting prescribed burns in certain high-risk regions legally challenging. For example, the Clean Air Act can in some cases prohibit prescribed burns due to the smoke they release, despite the likelihood that an uncontrolled natural wildfire will release far more smoke later in the summer. The Endangered Species Act can similarly impede burning: immediate, relatively mild damage by prescribed burns to forest ecosystems occupied by endangered species is barred, even if an intense natural wildfire may cause irreversible damage to the same ecosystem and its endangered species in the near future.
With climate change certain to make the annual wildfire season even more dire, the current situation is untenable. Where manual fuels reduction has failed, further crippled by fire borrowing with each new wave of devastating wildfires, prescribed burning offers a solution to the ever-growing stock of ready fuel distributed throughout our forests. Limiting the scale, rather than the number, of wildfires must be the new benchmark, and public opinion and environmental law must adapt to prioritize long-term ecosystem health over short-term discomfort. After all, with the West growing drier every year, the time has come to embrace wildfire – or else the fires may embrace us.
Kevin Pels: I was struck by the prevention inaccuracy you mentioned – only 1 percent of all that acreage burning being near areas that were cleared is astoundingly low. If we embrace prescribed burns, where should we embrace them to reduce the impact during wildfire season? Are there any tools or data that could be used to forecast ‘hotspots’ (sorry) for the next fire season? Because whether it’s clearing brush, low branches, debris and other burnable materials, or just straight-up burning them out of season when the risk of spread is lower, knowing where to burn seems to be a major part of the puzzle.
Benjamin Bell: I have a strange parallel to draw, and I’m curious if you agree: nuclear power. We know that nuclear power is a cleaner, more effective, and generally safer source of energy than coal, for example, but it still faces massive backlash in the public and on the Hill. This is likely because although the vast majority of reactors are safe and hum away without any attention, those that do have problems have big problems. Ninety-nine percent of prescribed burns never leave their controlled area, but what about the 1 percent that do? An ounce of prevention is worth a pound of cure, but how much risk are we willing to tolerate prophylactically? It seems possible that support for prescribed burns (and nuclear power) could be increased by reminding people in a relatable way that the catastrophes these solutions are trying to prevent aren’t some abstract possibility, but are in fact an approaching reality.