Wildfires in the United States are getting more common and more intense. But is devastation like we saw in Los Angeles the inevitable result of a warming planet?
A new paper suggests it is not.
By analyzing fire scars on tree rings in sites across the United States, the researchers determined that the average burn rate in recent decades (1984-2022) was less than a quarter of the average rate between 1600 and 1880.
Below, you can see that the entire burn rate distribution has moved to the left. In 2020, the year with the highest burn rate in the contemporary period, 6 percent of sites burned, one percentage point less than the average historical burn rate of 7 percent.
All that means the modern United States is experiencing a huge “fire deficit”—much less of the country is burning, year over year, than in the period between 1600 and 1880, thanks in large part to human intervention. Fewer wildfires are generally a good thing, but the deficit has a dark side: unburned landscapes accumulate flammable material over time, increasing fire risk and making fires more intense when they do occur.
On the bright side, this research also implies that the rate and intensity of future wildfires are well within our control. Climate researcher Roger Pielke Jr. explains:
The large human influence on forests does not mean that the climate has not changed in ways that also influence fires, but it does mean that the management (intentional or not) of ecosystems has played a much larger role in shaping fire behavior than has any plausible effect of greenhouse gas emissions on climatic conditions that favor fire.
Consider Figure 2a above which shows that 6% of sites burned in 2020. Let’s just assume that without changes in climate caused by greenhouse gas emissions this number would have been just 3%. So in this thought experiment climate change doubled the number of sites burned.
However, in the absence of contemporary changes in climate, 29% of sites burned in 1748. That means that human changes to forest ecosystems were about an order of magnitude more important than assumed human-caused changes in climate (i.e., 29%-to-3% versus 3%-to-6%).
Of course, this thought experiment dramatically overstates the expected role that changes in climate play in fire occurrence—meaning that in the real world forest management is certainly much more than an order-of-magnitude more important.
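The quoted arithmetic can be made explicit. A minimal sketch, using the thought experiment's assumed numbers (only the 6 percent and 29 percent figures come from the paper; the 3 percent counterfactual is Pielke's assumption):

```python
# Burn rates as the share of studied sites that burned in a year.
historical_1748 = 0.29    # observed burn rate in 1748, before fire suppression
assumed_no_warming = 0.03 # Pielke's assumed 2020 rate absent greenhouse-gas warming
observed_2020 = 0.06      # observed 2020 rate from the paper

# Effect sizes, in percentage points.
management_effect = historical_1748 - assumed_no_warming  # 29% -> 3%: 26 points
climate_effect = observed_2020 - assumed_no_warming       # 3% -> 6%: 3 points

# Ratio of the two effects: roughly 9x, i.e., about an order of magnitude.
print(round(management_effect / climate_effect, 1))
```

Even granting climate change a generous doubling of burned sites, the change attributed to forest management dwarfs it by nearly a factor of ten.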
With proper forest management techniques, such as controlled burns and mechanical thinning, we can prevent severe wildfires, even as climate change makes them relatively more likely.
Health & Demographics:
Progress Is Being Made Against Malaria in South-East Asia Region
First Person with Eye and Face Transplant Is Recovering Well
Meet the World’s First Recipient of an AI-Powered Bionic Arm
AI Is Fixing the Voices of People with Motor Neuron Diseases
Cheap Blood Test Detects Pancreatic Cancer Before It Spreads
Semaglutide Helped with Alcohol Use Disorder In Clinical Trial
Science is a poor tool for truth, but it's the best tool we have. Politicizing it to advance one's own interests renders it even weaker. In the long run, it also breeds skepticism and disdain for science among the public.
The subtitle seems inappropriate. History shows that the more we have tried to suppress and control wildfires, the worse the outcomes. You don't go on to show how we can control them. Obviously, we can do a much better job of reducing the danger of wildfires with sensible management that differs greatly from that of recent decades, but with current capabilities it is a big stretch to say, without qualification, that we can control them.