One under-appreciated finding in the IPCC AR6 is the much greater certainty around future warming.
Previously, the IPCC only gave "likely" warming ranges (i.e., a 2-in-3 chance of falling in the range). The new report gives "very likely" 9-in-10 ranges. Here is a rough like-for-like comparison:
The IPCC AR5 future warming projections were nominally based on the 90% (5th-95th percentile) range of CMIP5 models, but the assessed range of climate sensitivity was much wider than the range in the CMIP5 models, so these were treated as "likely" (66%) ranges.
The AR6, on the other hand, bases its warming projections on a combination of observationally-constrained CMIP6 models and a simple energy balance model using the new transient climate response (TCR) and equilibrium climate sensitivity (ECS) values in the report.
Per the AR6 WG1 Chapter 4: "Because different approaches... produce consistent results, there is high confidence in this assessment. These ranges... generally correspond to AR5 ranges... but likelihood is increased to very likely ranges, in contrast to the likely ranges in AR5."
Here is what the published AR5 RCP warming projections look like compared to the AR6 SSP ones; note that the AR5 ranges are "likely" 66% ranges and the AR6 ones are "very likely" 90% ranges.
There were no "very likely" ranges published in the AR5 to allow a direct comparison of scenarios. However, if we scale the AR5 projections by the difference between "likely" and "very likely" climate sensitivity (ECS) ranges we can get a rough estimate:
As an aside, TCR would probably be better than ECS to use for scaling these, but as far as I can tell the AR5 did not provide a "very likely" TCR range.
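As a rough illustration of that scaling, here's a minimal sketch in Python. The ECS bounds follow AR5's published statements (likely 1.5-4.5C; extremely unlikely below 1C, very unlikely above 6C), but treating 1-6C as a "very likely" range is my own rough assumption, and the simple midpoint-stretch is just one way to do the scaling:

```python
# Rough sketch: stretch an AR5 "likely" (66%) warming range into an
# approximate "very likely" (90%) range using the ratio of the widths
# of the AR5 ECS ranges. The 1-6C "very likely" ECS range is a rough
# assumption implied by AR5's statements, not a published range.
likely_ecs_width = 4.5 - 1.5        # AR5 likely ECS range width (C)
very_likely_ecs_width = 6.0 - 1.0   # rough implied very likely width (C)
stretch = very_likely_ecs_width / likely_ecs_width  # ~1.67

def to_very_likely(low, high):
    """Widen a 'likely' range about its midpoint by the ECS width ratio."""
    mid = (low + high) / 2
    half = (high - low) / 2 * stretch
    return (round(mid - half, 1), round(mid + half, 1))

# e.g. the AR5 RCP8.5 likely range for 2081-2100 of 2.6-4.8C:
print(to_very_likely(2.6, 4.8))  # -> roughly (1.9, 5.5)
```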
This increased confidence in future warming projections was in part a result of a narrowing of the range of climate sensitivity in the AR6 through a combination of multiple lines of evidence, following the Sherwood et al. (2020) review: agupubs.onlinelibrary.wiley.com/doi/full/10.10…
So why is this important? Narrowing the range of future warming represents both good and bad news: good news in that some of the very high-end warming outcomes now seem less likely, but bad news in that we are much less likely to get lucky and end up with less warming than we expected.
Whenever I post about climate, skeptical folks inevitably respond with this graph. So I decided to do something radical: actually read the underlying scientific paper and ask the authors.
As it turns out, it actually says the opposite of what skeptics claim.
Rather than arguing against human influence on the climate, the paper makes the stark claim that "CO2 is the dominant driver of Phanerozoic climate [the past 485 million years], emphasizing the importance of this greenhouse gas in shaping Earth history."
Changes in temperature, it turns out, have been strongly correlated with CO2, even more strongly than the authors expected when they set out to create a 485-million-year reconstruction. CO2 acted both as a forcing (e.g., from volcanism) and as a feedback (e.g., amplifying changes in solar forcing) at different points.
Every wildfire starts with an ignition – downed powerlines, lightning, arson – and we can do a lot to reduce these.
But in California the number of fires has dropped while the area burned has doubled. What has changed is conditions, not ignitions:
Why have conditions changed? A legacy of poor forest management has led to fuel loading (particularly in the Sierras), contributing to more destructive fires. But vegetation has also gotten much drier as fire-season temperatures have warmed (+3.6°F since the 1980s).
We've historically seen the most destructive fires in hot and dry years. Human emissions of CO2 and other greenhouse gases are the primary cause of increased temperatures in California.
I have a new paper in Dialogues on Climate Change exploring climate outcomes under current policies. I find that we are likely headed toward 2.7C by 2100 (with an uncertainty range of 1.9C to 3.7C), and that high-end emissions scenarios have become much less likely.
This reflects a bit of good news; 2.7C is a lot better than the 4C that many thought we were heading for a decade ago, and reflects real progress on moving away from a 21st century dominated by coal. At the same time, it's far from what is needed.
It does raise an interesting question: how much of the change in likely climate outcomes relative to a decade ago reflects actual progress on technology and policy, and how much reflects assumptions about the future (e.g., 5x more coal by 2100) that were always unrealistic?
I have a new analysis over at The Climate Brink exploring how rates of warming have changed over the past century.
Post-1970, GHGs (CO2, CH4, etc.) alone would have driven warming of just under 0.2C per decade, but falling aerosols (SO2) have increased that rate to around 0.25C per decade.
These falling aerosols have "unmasked" some of the warming that would have otherwise occurred due to past emissions of greenhouse gases. This unmasking has been driven by large declines in Chinese and shipping SO2 emissions over the past decade, among other contributors.
Now, a flat rate of warming from GHGs at just under 0.2C per decade might seem a bit unexpected. After all, CO2 emissions have continued to increase, and atmospheric CO2 concentrations have grown year over year.
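One way to square this, sketched below: the radiative forcing from CO2 grows with the logarithm of concentration (the standard simplified expression from Myhre et al. 1998), so concentrations rising by a roughly constant fraction each decade translate into a roughly constant forcing increment per decade. The 5%-per-decade growth rate and starting concentration here are illustrative assumptions:

```python
# Minimal sketch: why roughly steady warming per decade is consistent
# with ever-rising CO2. Uses the standard simplified forcing expression
# dF = 5.35 * ln(C/C0) W/m^2 (Myhre et al. 1998); the concentration
# values and 5%-per-decade growth rate are illustrative assumptions.
from math import log

c = 330.0      # ppm, illustrative starting concentration
growth = 1.05  # assume concentrations rise ~5% per decade

for decade in range(5):
    c_next = c * growth
    dF = 5.35 * log(c_next / c)  # forcing added over the decade
    print(f"decade {decade}: {c:.0f} -> {c_next:.0f} ppm, +{dF:.3f} W/m^2")
    c = c_next
# Each decade adds the same forcing increment (5.35 * ln(1.05) ~ 0.26
# W/m^2): exponential concentration growth yields linear forcing growth,
# and hence a roughly constant decadal warming rate.
```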
There's been a bit of confusion lately around how the climate system responds to carbon dioxide removal. While there are complexities, under realistic assumptions a ton of removal is still equal and opposite in its effects to a ton of emissions.
A thread: 1/x
When we emit a ton of CO2 into the atmosphere, a bit more than half is reabsorbed by the ocean and the biosphere today (though this may change as a warming world weakens carbon sinks). Put simply, 2 tons of CO2 emissions -> 1 ton of atmospheric accumulation.
Carbon removal (CDR) is subject to the same effects; if I remove two tons of CO2 from the atmosphere, the net removal is only one ton due to carbon cycle responses. Otherwise removal would be twice as effective as mitigation, which is not the case.
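Here's a minimal sketch of that symmetry, assuming a simple constant airborne fraction of ~50% (the real carbon cycle response is more complex and state-dependent):

```python
# Minimal sketch of the emissions/removals symmetry under a constant
# airborne-fraction assumption: ~50% of a CO2 pulse stays in the
# atmosphere, whether the pulse is positive (emission) or negative
# (removal). The real carbon cycle response is more complex.
AIRBORNE_FRACTION = 0.5

def atmospheric_change(tons_co2):
    """Net change in the atmospheric CO2 burden (tons) for an
    emission (+) or removal (-) pulse."""
    return tons_co2 * AIRBORNE_FRACTION

print(atmospheric_change(2.0))   # emit 2 tons   -> +1.0 ton airborne
print(atmospheric_change(-2.0))  # remove 2 tons -> -1.0 ton airborne
```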
The carbon cycle has been close to equilibrium through the Holocene; we know this because we measure atmospheric CO2 concentrations in ice cores. But in the past few centuries CO2 has increased by 50%, and is now at the highest level in millions of years due to human emissions.
Starting 250 years ago, we began putting lots of carbon that had been buried underground for millions of years into the atmosphere. All in all, we've emitted nearly 2 trillion tons of CO2 from fossil fuels, which is more than the total mass of the biosphere or all human structures:
About a trillion of that has accumulated in the atmosphere, increasing CO2 concentrations to levels last seen millions of years ago. The remainder was absorbed by the biosphere and oceans. We can measure these sinks, and it's incontrovertible that they are indeed net carbon sinks.
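As a rough sanity check on those numbers, here's a quick sketch assuming the standard conversion of ~7.8 Gt CO2 per ppm of atmospheric CO2:

```python
# Rough sanity check: does ~1 trillion tons of airborne CO2 match the
# observed concentration rise? Assumes the standard conversion of
# ~7.8 Gt CO2 per ppm of atmospheric CO2.
GT_CO2_PER_PPM = 7.8

airborne_gt = 1000.0  # ~1 trillion tons of CO2 remaining in the atmosphere
ppm_rise = airborne_gt / GT_CO2_PER_PPM
print(f"implied rise: ~{ppm_rise:.0f} ppm")  # ~128 ppm
# Broadly consistent with the observed increase from ~280 ppm
# preindustrial to ~420 ppm today (the ~50% increase noted above).
```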