This work is distributed under the Creative Commons Attribution 4.0 License.
Opinion: The importance and future development of perturbed parameter ensembles in climate and atmospheric science
Abstract. A grand challenge in climate science is to translate advances in our fundamental understanding into reduced uncertainty in climate projections. Model uncertainty, characterized for example by the spread of simulations of future climate projections, has changed little over the past few decades despite major advances in model complexity and resolution and the growing number of intercomparison projects and observational datasets. Here we argue that the use of perturbed parameter ensembles (PPEs) would accelerate our understanding of uncertainty in its broadest sense and help identify strategies for reducing it. We make eleven recommendations for future research priorities, drawing on existing studies that use PPEs to guide model development and simplification, understand inter-model differences, more fully characterize the plausible spread in climate projections, formalize model calibration, define observational requirements, and investigate how interacting environmental conditions influence complex climate systems like cloud fields. These studies span climate, weather, atmospheric chemistry, clouds, aerosols and renewable energy, using models ranging from process-based high-resolution models to global-scale models. Although increases in model complexity, resolution and intercomparison projects consume most computing resources today, we argue that, in synergy with these efforts, PPEs are essential for fully characterizing model uncertainty and improving model reliability, and that they should be prioritized when allocating those resources.
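For readers unfamiliar with how a PPE differs from one-at-a-time parameter tuning, the following is a minimal, hypothetical sketch of a space-filling PPE design. It is not taken from the paper or any specific model; the parameter names, ranges and ensemble size are purely illustrative, and in practice each design row would configure one model simulation.

```python
# Hypothetical sketch: generating a small perturbed parameter ensemble (PPE) design.
# Parameter names and ranges are illustrative, not from the paper or any real model.
import numpy as np
from scipy.stats import qmc

# Uncertain model parameters and plausible ranges (illustrative only)
params = ["entrainment_rate", "autoconversion_threshold", "ice_fall_speed"]
lower = np.array([0.5, 1e-4, 0.5])
upper = np.array([2.0, 1e-3, 2.0])

# Space-filling Latin hypercube design: every ensemble member perturbs all
# parameters simultaneously, unlike one-at-a-time sensitivity tests.
sampler = qmc.LatinHypercube(d=len(params), seed=42)
unit_sample = sampler.random(n=20)              # 20 members in the unit cube
design = qmc.scale(unit_sample, lower, upper)   # rescale to parameter ranges

for i, member in enumerate(design):
    # In a real workflow, each row would be written into a model configuration
    # file and launched as one ensemble member.
    settings = ", ".join(f"{p}={v:.4g}" for p, v in zip(params, member))
    print(f"member {i:02d}: {settings}")
```

The ensemble output would then typically be used to train a statistical emulator or to compare the parametric spread against observations, which is the kind of workflow the paper's recommendations address.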
Competing interests: Ken Carslaw, Annika Oertel and Yun Qian are members of the ACP editorial board.
Publisher's note: Copernicus Publications remains neutral with regard to jurisdictional claims made in the text, published maps, institutional affiliations, or any other geographical representation in this paper. While Copernicus Publications makes every effort to include appropriate place names, the final responsibility lies with the authors. Views expressed in the text are those of the authors and do not necessarily reflect the views of the publisher.
Status: final response (author comments only)
- RC1: 'Comment on egusphere-2025-4341', Bjorn Stevens, 04 Oct 2025
- RC2: 'Comment on egusphere-2025-4341', Bjorn Stevens, 12 Oct 2025
An addendum to my review that hopefully makes my reservations about the programme a bit more tangible. Consider the paper by Zhu et al., https://agupubs.onlinelibrary.wiley.com/doi/10.1029/2021MS002776, where an enormous sensitivity is identified as a result of ostensible parameter sensitivities: a limiter is a parameter, but not one that is usually identified as such. Rather, it is one of the thousands of parameters, some of which matter, but which are only identified as such after the fact. Even then, however, it is not a parameter that can easily be mapped to another model. Having the paper address such concrete examples would strengthen it.
Citation: https://doi.org/10.5194/egusphere-2025-4341-RC2
- RC3: 'Comment on egusphere-2025-4341', Michael Schulz, 30 Nov 2025
The opinion paper by Carslaw et al. presents a well laid-out argument for leaving space for a PPE component in future model development. It presents a very interesting overview of PPE work in the field and I appreciate the effort of the authors to land on several recommendations for how to use PPEs. This paper serves its purpose to provide an “opinion” in the field of climate model development and beyond.
My suggestions for improving the paper are minor:
At the end of the abstract the authors mention that PPEs should be prioritized when allocating computing resources. I am not sure what "prioritize" means here. Giving first priority to PPEs is, in my opinion, too strong a wording, since other methods and workflows using climate models may claim priority for good reasons. Preparing a bug-free, multi-purpose ESM code may need considerable computing resources, which are often not appreciated by funding agencies either. And, while having great potential, PPEs as such do not remove structural error or provide scenario simulations for different futures. Not all problems require a PPE. See also another “opinion” paper by Jones et al., https://doi.org/10.5194/esd-15-1319-2024, on the use of ESMs and improved cooperation to develop them. A word of caution when mentioning priorities would be to the advantage of the paper.
Possibly also missing is a discussion of the challenges of PPEs. Why haven't PPEs been used more often? There are obstacles to this: the PPE implementation in models, efficient launch scripts, the large amount of data and its demanding handling, and the waste of resources on implausible model variants are all challenges. But I very much support the idea that model development teams should consider the use of PPEs in their model development workflow, as opposed to only one-at-a-time testing of parameter choices for tuning and model improvement.
Line 225 invites me to add another word of caution: “PPEs provide the only means to disentangle structural and parametric causes of model–observation biases”. Single-process investigation, varying one parameter at a time, has been used effectively in the past to include more correct and important processes in climate models. Evaluation against multiple observations has been shown to be useful for finding structural uncertainty of models, without a PPE.
Another “missing”:
The importance of parameterisation documentation. When exploring model diversity and comparing model sensitivities across models, understanding the code differences and details of parameterisation choices has been a long-standing challenge in all MIPs. It becomes even more important when doing multi-model PPEs. How to do such documentation efficiently is still a challenge. It might, though, be a positive side effect of organising multi-model PPEs that such documentation becomes clearer, more apparent and more accessible for understanding model differences. The authors state in the conclusion: “We started this opinion piece by pointing out that there are several essentially competing efforts in climate modelling – complexity, resolution and initial condition ensembles. To this list we add perturbed parameter ensembles.” I believe transparent model documentation should be added to this list, in particular when thinking also of the human resources needed to do the modelling.
Micro remark: A dot is missing after the first sentence in the abstract.
Citation: https://doi.org/10.5194/egusphere-2025-4341-RC3
On how I ticked the boxes above:
I signed the review.