Let's talk a bit about forest management. There is growing acknowledgement among (some) policymakers that we need to tackle the combination of climate change, fuel buildup in our forests, and development in high-risk wildland urban interface areas.
First of all, we all acknowledge that climate change has played a major role in making wildfires worse. Human emissions of greenhouse gases have increased spring and summer temperatures by around 2C in the Western U.S. over the past century. 2/15
This has extended both the area and time periods in which forests burn; in parts of California, fire season is now 50 days longer. The recent NCA4 suggested that about half the increase in burned area in the Western U.S. since the 1980s can be attributed to a changing climate. 3/15
However, even if we were to magically slash our emissions to zero tomorrow, the climate would simply stop warming, not return to the conditions of the 1970s. The best we can hope for is to make our current climate the new normal and avoid making things potentially much worse. 4/15
To reduce the severity of wildfires in our current climate, we need to improve forest management. We need to deal with the legacy of a century of overzealous fire suppression efforts in ecosystems adapted to frequent low-level burns. 5/15
We need to start controlling fires instead of extinguishing them, thinning small trees in some regions and doing controlled burns to clear out accumulated fuels. Some estimates suggest that 20 million acres will need to be thinned and/or burned to minimize fire risk. 6/15
At the same time, we need to allow the best available science to guide us and avoid extreme logging of our public forests under the guise of fire mitigation. We need to work to return to a regime where we can both actively manage forests and control natural ignitions. 7/15
We also need to streamline regulations around prescribed burns and thinning, removing red tape that trades short-term improvements in air quality for orange-sky catastrophes down the road. 8/15
We need to work closely with communities to get buy-in for forest management solutions and tailor interventions to what works best for their surrounding ecosystem and their socioeconomic reality. What works for Malibu and Paradise may be quite different! 9/15
We need to work with and learn from native fire practitioners who understand the land and have generations of experience with effective management techniques. We also need to institute better liability protections for groups undertaking prescribed burns. 10/15
We need to work from communities out, intensively managing areas in the wildland urban interface, but also acknowledge the need to eventually do prescribed burns and other management in more remote wildland regions to avoid air quality disasters associated with megafires. 11/15
We need to provide significantly more resources to harden homes and communities, paying for ember-resistant vent screens, defensible space clearing, and other cost-effective risk-reduction measures. 12/15
But we also need to deal with the drivers behind much of the wildland-urban interface expansion in California: our limited housing stock and astronomical prices. More housing and more affordable housing in urban areas can go a long way to reducing assets at risk. 13/15
Overall, it's past time we gave forest management and wildfire risk reduction the resources they deserve. The 1989 Loma Prieta earthquake caused $10 billion in damages, but we spent $70 billion on earthquake retrofits after it occurred. 14/15
Yet despite hundreds of billions in losses from wildfires over the past five years, we spend only a small fraction on wildfire risk reduction today compared to what we spend on earthquake safety. While simply throwing money at the problem won't solve it, more resources are essential. 15/15
The arc of the scenario universe is long, but it bends inevitably toward more realistic emissions.
A new paper outlining the emissions scenarios we will be using in the upcoming IPCC AR7 report notes that "the CMIP6 high emission levels (quantified by SSP5-8.5) have become implausible".
It outlines a yet-to-be-released high emissions scenario notably lower than the one (SSP5-8.5) used in the prior IPCC 6th Assessment Report:
This is a change that a number of us in the community have long advocated, going back to Justin Ritchie's work in 2017: gmd.copernicus.org/articles/19/26…
And in 2020 Glen Peters and I published a piece in Nature arguing that high emissions scenarios were no longer "business as usual", and that more realistic emissions make for better climate policy: nature.com/articles/d4158…
El Niño is coming, and it is shaping up to be a big one.
Over at The Climate Brink I've put together a compilation of the latest forecasts by different modeling groups. They suggest that we might see an event comparable in strength to what we saw in 2016.
This is based on a collection of 11 different models (and 455 individual ensemble members) all updated since the start of March. I've put an interactive version of the data up on the Climate Dashboard here: dashboard.theclimatebrink.com/#enso
While there remains a big spread in models (and some models only run through August), more than half the runs show a strong (>1.5C Nino3.4) event developing by August and a very strong event (>2C) by the end of the year.
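To make the ensemble summary concrete, here is a minimal sketch of how one might compute the share of members exceeding the strong and very strong thresholds. The forecast values below are synthetic stand-ins generated for illustration, not the actual model output.

```python
import numpy as np

# Hypothetical stand-in for Nino3.4 anomaly forecasts (degrees C) for August
# across an ensemble of members. These are randomly generated values for
# illustration only, not real forecast data.
rng = np.random.default_rng(42)
august_forecasts = rng.normal(loc=1.6, scale=0.5, size=455)

# Share of ensemble members exceeding the strong (>1.5C) and
# very strong (>2C) event thresholds
frac_strong = np.mean(august_forecasts > 1.5)
frac_very_strong = np.mean(august_forecasts > 2.0)
print(f"strong (>1.5C): {frac_strong:.0%}, very strong (>2C): {frac_very_strong:.0%}")
```

The same pattern works on the real multi-model data: stack all ensemble members for a given month into one array and take the mean of the boolean exceedance mask.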
As a rare climate scientist working in Silicon Valley, I've been drinking from the AI firehose a lot more than my peers. I thought it would be helpful to lay out my experiences of both the promise and pitfalls of using AI to accelerate scientific research.
As a bit of background, I've been working with these tools since late 2022, and seen firsthand how they have dramatically improved over time. I’ve also worked with frontier AI labs to evaluate how well LLMs answer climate questions, and to help enable AI tools to support scientific collaboration.
So what do AI tools do well for scientific work? In short, coding.
Scientists are generally not software engineers. Much of their coding is self-taught, and many struggle with writing code quickly, producing well-documented reproducible code, and fixing errors.
My new State of the Climate report over at @CarbonBrief finds that 2025 had the:
⬆️ Warmest ocean heat content
⬆️ Tied as second warmest surface temps
⬆️ Second warmest troposphere
⬆️ Record high sea level and GHGs
⬇️ Record low winter Arctic ice
Read the article here:
Ocean heat content increased by 23 billion trillion joules, around 39 times global primary energy use this year. This is the largest rise in OHC since 2017; overall OHC has increased by over 500 zettajoules since the 1940s. carbonbrief.org/state-of-the-c…
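As a back-of-the-envelope check on that comparison: 23 billion trillion joules is 23 zettajoules (1 ZJ = 10^21 J). The figure of roughly 0.6 ZJ for annual global primary energy use below is my own round-number assumption for illustration, not from the article.

```python
# Back-of-the-envelope check on the ocean heat content comparison.
# 23 billion trillion joules = 23 zettajoules (1 ZJ = 1e21 J).
ohc_rise_zj = 23.0

# Assumed annual global primary energy use, ~0.6 ZJ (~165,000 TWh).
# This is a rough illustrative figure, not a sourced value.
primary_energy_zj = 0.6

ratio = ohc_rise_zj / primary_energy_zj
print(f"OHC rise is ~{ratio:.0f}x annual global primary energy use")
```

With these round numbers the ratio lands in the high 30s, consistent with the "around 39 times" figure.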
2025 tied with 2023 for the second warmest surface temperatures on record. It was nominally the second warmest in the NASA and DCENT datasets, and third warmest in NOAA, Hadley, Berkeley, Copernicus, JRA-3Q, and China-MST. In all cases uncertainties overlap with 2023.
After a modest decline over the first half of the year (and after record 2024 warmth), global temperatures are ticking back up. The past two days have been the warmest on record for this time of year in ERA5 and the highest temperature anomalies since January.
With 26 days of October now reporting in ERA5, October 2025 will be the third warmest on record after 2023 and 2024.
Weather models expect global temperatures to remain relatively flat over the coming week as extreme Northern Hemisphere warmth persists, and anomalies (departures from normal) will be at or above the highest levels we've seen earlier in the year.
Last week the German Meteorological Society warned that "the 3-degree limit could be exceeded as early as 2050".
While it's not possible to fully rule out, the assessed warming scenarios we published in the IPCC AR6 report find this to be extremely unlikely.
If we look at the full ensemble of CMIP6 models, we see a small number (3 of 37 models) reaching 3C by 2050. However, these three have both too much historical warming (~2.2C in 2024) and an unrealistically high climate sensitivity (>5C per doubling of CO2), as we noted here: nature.com/articles/d4158…
However, if we constrain CMIP6 to match recent observed global temperatures, we see no models reaching 3C until at least 2060: carbonbrief.org/analysis-what-…
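The constraint logic here is simple to sketch: keep only the models whose simulated present-day warming falls within an observationally plausible band, then look at when the survivors cross 3C. The trajectories and the observational band below are synthetic toy values, not real CMIP6 output or the actual constraint used in the analysis.

```python
import numpy as np

# Toy sketch of observationally constraining a model ensemble.
# Synthetic linear-warming trajectories stand in for CMIP6 output;
# all numbers here are illustrative only.
rng = np.random.default_rng(0)
years = np.arange(2000, 2101)
n_models = 37

# Each toy model warms linearly at its own rate from ~1.0C above
# preindustrial in 2000 (rates in degrees C per year, assumed range)
rates = rng.uniform(0.015, 0.045, size=n_models)
warming = 1.0 + rates[:, None] * (years - 2000)

# Constrain: keep models whose 2024 warming falls in an assumed
# observationally plausible band (1.3-1.6C, illustrative)
w2024 = warming[:, years == 2024].ravel()
keep = (w2024 > 1.3) & (w2024 < 1.6)
constrained = warming[keep]

def first_year_above(traj, threshold=3.0):
    """Return the first year a trajectory reaches the threshold, or None."""
    idx = np.argmax(traj >= threshold)
    return int(years[idx]) if traj[idx] >= threshold else None

crossings = [first_year_above(t) for t in constrained]
valid = [y for y in crossings if y is not None]
print("earliest 3C crossing among constrained models:",
      min(valid) if valid else "none before 2100")
```

The key effect to notice: the hot-running models that would cross 3C earliest are exactly the ones the present-day warming check removes, which pushes the earliest plausible crossing later.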