the Creative Commons Attribution 4.0 License.
Brief communication: Sea-level projections, adaptation planning, and actionable science
Abstract. As climate scientists seek to deliver actionable science for adaptation planning, there are risks in using novel results to inform decision-making. Premature acceptance can lead to maladaptation, confusion, and practitioner “whiplash”. We propose that scientific claims should be considered actionable only after meeting a confidence threshold based on the strength of evidence as evaluated by a diverse group of scientific experts. We discuss an influential study that projected rapid sea-level rise from Antarctic ice-sheet retreat but in our view was not actionable. We recommend regular, transparent communications between scientists and practitioners to support the use of actionable science.
Status: open (until 26 May 2024)
CC1: '"Actionable" for whom, in what decision context?', Robert Kopp, 15 Mar 2024
I read this brief comment with interest, and found some core issues troubling.
Fundamentally, the authors discuss 'actionable' science, but they discuss it stripped of context. The American Heritage Dictionary defines 'action' as 'organized activity to accomplish an objective'. Science cannot be judged to be actionable, or not, outside the context of an organized activity and an objective. It makes little sense to talk about something being 'actionable' in general, outside of a specific decision context.
The authors neglect the extensive literature on decision science and risk analysis relevant to using sea-level projections in adaptation decision making. For a relatively recent review, see Keller, K., Helgeson, C., & Srikrishnan, V. (2021). Climate risk management. Annual Review of Earth and Planetary Sciences, 49, 95-116, https://www.annualreviews.org/doi/abs/10.1146/annurev-earth-080320-055847.
In the specific context of communicating sea-level uncertainty and ambiguity, the authors should also see Kopp, R. E., Oppenheimer, M., O’Reilly, J. L., Drijfhout, S. S., Edwards, T. L., Fox-Kemper, B., ... & Xiao, C. (2023). Communicating future sea-level rise uncertainty and ambiguity to assessment users. Nature climate change, 13(7), 648-660, https://www.nature.com/articles/s41558-023-01691-8. Given the direct relevance, this latter omission is particularly surprising.
Why do the organized activity and the objective matter?
Broadly, high-end sea-level rise scenarios, including low-confidence processes, are valuable in flexible, adaptive decision-making. This is shown by a number of papers, but perhaps most clearly and directly for this context in a preprint by Feng et al. (https://doi.org/10.22541/essoar.170914510.03388005/v1 ).
Among other analyses, Feng et al. compare idealized protection schemes for Manhattan under (1) a static optimal approach, where a single sea wall elevation must be picked based on available knowledge today, and (2) a variety of dynamic approaches, where sea wall height can be periodically adjusted based on new information. (I focus particularly on the 'reinforcement learning' approach described therein).
They consider two cases where projects are planned under inaccurate sea-level rise projections: (A) where planning takes place under the SSP5-8.5 low-confidence projections but the reality corresponds to SSP2-4.5 medium-confidence projections, and (B) where planning takes place under the SSP2-4.5 medium-confidence projections but the reality corresponds to the SSP5-8.5 low-confidence projections.
In the former case -- where high-end projections are used and reality underperforms -- the expected net present value cost is $2.3 billion, $1.0 billion more than with the correct (lower) distribution, if a static approach is taken. With a flexible approach, the expected net present value cost is $1.0 billion, just $0.1 billion more than if the correct distribution is chosen.
However, in the latter case -- where middle-of-the-road projections are used and reality overperforms -- the expected net present value cost is $15 billion, $12 billion more than with the correct (high-end) distribution if a static optimal approach is taken. With a flexible approach, the expected net present value cost is $3.9 billion, $0.9 billion more than if the high-end distribution had been used.
Costs associated with mis-estimating sea-level rise (additional expected NPV cost, $ billion):

| Mis-estimation | Static plan | Dynamic plan |
|---|---|---|
| Overestimated sea-level distribution (use SSP5-8.5 LC, get SSP2-4.5 MC) | $1.0 | $0.1 |
| Underestimated sea-level distribution (use SSP2-4.5 MC, get SSP5-8.5 LC) | $11.6 | $2.4 |

Thus, with a dynamic approach, using high-end projections that capture low-confidence processes makes a lot of economic sense. Such an approach cuts off the tail risk at relatively small additional cost. (In fact, the cost of a static optimal approach using the correct distribution in a middle-of-the-road world is more than the cost of using a dynamic approach with the overestimated, high-end distribution.)
However, with a static approach, the costs of getting the distribution wrong are more substantial (and an order of magnitude larger if the distribution is underestimated than if it is overestimated).
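The asymmetry in these regret figures can be sketched with a deliberately simple toy model. This is not Feng et al.'s actual analysis, and every number below is a hypothetical illustration: a static wall pays full flood damages if realized sea-level rise exceeds its design height, while an adjustable wall can be topped up later at a per-metre premium once observations reveal more height is needed.

```python
# Toy sketch (NOT Feng et al.'s model): why adjustable protection limits
# regret when the assumed sea-level distribution turns out to be wrong.
# All costs and heights are hypothetical illustrative numbers.

def static_cost(design_height, realized_slr, build_cost_per_m=1.0,
                flood_damage=20.0):
    """One-shot plan: pay for the wall now; pay damages if SLR exceeds it."""
    cost = build_cost_per_m * design_height
    if realized_slr > design_height:
        cost += flood_damage
    return cost

def dynamic_cost(initial_height, realized_slr, build_cost_per_m=1.0,
                 upgrade_premium=1.3):
    """Adjustable plan: start lower, then retrofit (at a per-metre premium)
    once new observations show that more height is needed."""
    cost = build_cost_per_m * initial_height
    if realized_slr > initial_height:
        # Retrofit to the realized level; upgrading later costs extra per metre.
        cost += upgrade_premium * build_cost_per_m * (realized_slr - initial_height)
    return cost

# Plan for a middle-of-the-road world (0.8 m) and see what happens
# if reality instead delivers a high-end outcome (1.5 m).
for realized in (0.8, 1.5):
    s = static_cost(design_height=0.8, realized_slr=realized)
    d = dynamic_cost(initial_height=0.8, realized_slr=realized)
    print(f"realized SLR {realized} m: static {s:.2f}, dynamic {d:.2f}")
```

Even with invented numbers, the qualitative pattern matches the table above: underestimating sea-level rise under a static plan incurs the full damage tail, while the adjustable plan caps the downside at a modest retrofit premium.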
In truth, I think the concern the authors address is not one with scientists offering practitioners low-confidence, high-end projections as part of the domain of plausible futures. It is with how these projections are then used.
As the Feng et al. analysis, and others, indicate, the most economical approach given substantial uncertainty and ambiguity is most often the dynamic one. Where a static approach must be used, whether due to inability to undertake a dynamic approach or to regulatory inflexibility, benefit-cost theory tells us what needs to be taken into account in order to determine the best option. This includes:
1) The benefit in terms of reduced risk associated with choosing different adaptation levels
2) The cost in terms of additional adaptive expenditures associated with choosing different adaptation levels
3) The discount rate used to trade off present adaptation costs and future harms
4) The risk aversion that determines how much weight is given to the high-end of the cost distribution
5) The ambiguity aversion that determines how much weight is given to different alternative probability distributions for sea level and thus for cost.
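As a hedged illustration of how ingredients (1) through (5) interact, and not any particular study's method, a static option can be scored by its up-front adaptation cost plus a risk-averse, discounted expected disutility of damages, blended across rival probability distributions according to an ambiguity weight. All parameter values below are invented for illustration.

```python
# Toy scoring of static adaptation options (all numbers hypothetical),
# combining: avoided damages, adaptation cost, a discount rate, risk
# aversion (tail weighting), and ambiguity aversion (weighting between
# rival sea-level/damage distributions). Lower score = better option.

def discounted(value, year, rate=0.03):
    """Present value of a cost incurred `year` years from now."""
    return value / (1 + rate) ** year

def expected_disutility(costs_probs, risk_aversion=2.0):
    """Power-law disutility: larger risk_aversion weights the tail more."""
    return sum(p * c ** risk_aversion for c, p in costs_probs)

def score(adapt_cost, damage_dists, ambiguity_weight=0.5, horizon=50):
    """Rank an adaptation level under two rival damage distributions.

    damage_dists: (medium, high_end), each a list of (damage, probability).
    ambiguity_weight: weight placed on the more pessimistic distribution."""
    medium, high_end = damage_dists
    eu_med = expected_disutility([(discounted(d, horizon), p) for d, p in medium])
    eu_high = expected_disutility([(discounted(d, horizon), p) for d, p in high_end])
    return adapt_cost + (1 - ambiguity_weight) * eu_med + ambiguity_weight * eu_high

# A taller wall costs more now but shrinks the damage tail in both worlds.
low_wall = score(1.0, ([(2, 0.9), (10, 0.1)], [(2, 0.5), (30, 0.5)]))
high_wall = score(3.0, ([(1, 0.95), (3, 0.05)], [(1, 0.7), (8, 0.3)]))
print(f"low wall {low_wall:.2f}  vs  high wall {high_wall:.2f}")
```

The design point is the one made in the text: whether the extra up-front cost of a higher adaptation level is worthwhile depends jointly on the discount rate, the risk-aversion exponent, and the ambiguity weight placed on the high-end distribution, not on the projections alone.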
Where the costs and benefits of adaptation are comparable, discomfort will arise if regulatory guidance specifies a single adaptation target stripped of context, because a user's risk and ambiguity aversion applies to both the costs and benefits of adaptation, not just the benefits.
I suspect that the authors' concern with the actionability of projections incorporating low-confidence processes is misaimed. Given appropriately flexible decision frameworks, as Feng et al. show, we are better off incorporating such high-end projections.
Regulations that rigidly prescribe the use of specific high-end projections in static contexts, however, run the risk of leading to sub-optimal outcomes. It may be appropriate for policy to set discount, risk aversion, and ambiguity aversion levels for specific contexts; this is a matter where different political philosophies will lead to different judgements. However, given these parameters, identifying the benefit-cost optimal outcome requires considering the net value of adaptation benefits and adaptation costs under these parameters. If costs and benefits are comparable, overly rigid targets might cut off the long tail of sea-level harms but create a long tail of adaptation cost overruns.
In short, the authors have chosen the wrong target. Scientists should strive to communicate not just projections that incorporate processes for which there is a high degree of evidence, but also processes that are of potentially great significance but less agreement and evidence -- as AR6 has done. It is, however, important that actions be guided by decision frameworks that correctly reflect the nature of the information provided.
Citation: https://doi.org/10.5194/egusphere-2024-534-CC1
CC2: 'Comment on egusphere-2024-534', Chris P. Weaver, 18 Mar 2024
I appreciate the opportunity to comment, since I would like to point out, and help correct, an error in the paper.
Specifically, on page 6, the authors have made the following statement about the influence of DeConto and Pollard (2016) on the U.S. interagency sea level rise scenarios report of Sweet et al. (2017) (my emphasis):
“To be consistent with “recent updates to the peer-reviewed scientific literature”, they issued an “Extreme” global mean sea-level projection of 2.5 m by 2100 for RCP8.5, exceeding the previous upper bound of 2.0 m based on Pfeffer et al. (2008). Their Extreme projection relied on a large AIS contribution, based primarily on DP16. [Footnote 4]
[Footnote 4] The 2.0 m upper bound of Pfeffer et al. (2008) assumed large contributions from both ice sheets: 0.54 m from the GrIS and 0.62 m from the AIS. In AR5, Church et al. (2013) estimated a likely upper bound of just 0.21 m for the GrIS, since process models do not support “the order of magnitude increase in flow” in Pfeffer et al. (2008). To reach an upper bound of 2.0 m or more, S17 therefore needed an increased AIS contribution of ∼1.0 m or more. To support this increase, they cited DP16 along with the expert-judgment assessment of Bamber and Aspinall (2013). However, the latter study gave a high-end (95th percentile) estimate of 0.84 m SLR from the two ice sheets, less than Pfeffer et al. (2008). Other studies cited by S17 did not give independent evidence of a large AIS contribution. Thus, both the “High” projection of 2.0 m and the “Extreme” projection of 2.5 m in S17 relied on DP16’s claim that the AIS could contribute at least a meter of SLR by 2100."
The highlighted statement is a misstatement of fact. The accompanying footnote is also erroneous, as well as the part of the statement, on page 7, that references DeConto and Pollard (2016) in the context of Sweet et al. (2017), i.e., “(2) S17 and other reports that were published in 2016–2020 and relied on DP16 for the AIS contribution.”
Here is the relevant paragraph from page 14 in Sweet et al. (2017):
“The growing evidence of accelerated ice loss from Antarctica and Greenland only strengthens an argument for considering worst-case scenarios in coastal risk management. Miller et al. (2013) and Kopp et al. (2014) discuss several lines of arguments that support a plausible worst-case GMSL rise scenario in the range of 2.0 m to 2.7 m by 2100: (1) The Pfeffer et al. (2008) worst-case scenario assumes a 30-cm GMSL contribution from thermal expansion. However, Sriver et al. (2012) find a physically plausible upper bound from thermal expansion exceeding 50 cm (an additional ~20-cm increase). (2) The ~60 cm maximum contribution by 2100 from Antarctica in Pfeffer et al. (2008) could be exceeded by ~30 cm, assuming the 95th percentile for Antarctic melt rate (~22 mm/year) of the Bamber and Aspinall (2013) expert elicitation study is achieved by 2100 through a linear growth in melt rate. (3) The Pfeffer et al. (2008) study did not include the possibility of a net decrease in land-water storage due to groundwater withdrawal; Church et al. (2013) find a likely land-water storage contribution to 21st century GMSL rise of -1cm to +11 cm. Thus, to ensure consistency with the growing number of studies supporting upper GMSL bounds exceeding Pfeffer et al. (2008)’s estimate of 2.0 m by 2100 (Sriver et al., 2012; Bamber and Aspinall, 2013; Miller et al., 2013; Rohling et al., 2013; Jevrejeva et al., 2014; Grinsted et al., 2015; Jackson and Jevrejeva, 2016; Kopp et al., 2014) and the potential for continued acceleration of mass loss and associated additional rise contributions now being modeled for Antarctica (e.g., DeConto and Pollard, 2016), this report recommends a revised worst-case (Extreme) GMSL rise scenario of 2.5 m by 2100.”
In developing the report, we wished to provide a number of scenarios that could be used to fully bracket the evidence base for physically possible 21st-century sea level rise, as well as to provide expert judgement about the central tendency/best-guess trajectory. These ranged from 0.3 m at the lowest of the lower bounds to 2.5 m at the uppermost of the upper bounds. Briefly, the motivation was to support as wide a range of decision contexts as existed at the time in coastal risk planning and management (e.g., see Hinkel et al., 2015, Nature Climate Change, and many others), including long-term adaptation pathways approaches and “stress test” type applications, both of which often use a “not-to-be-exceeded” upper bound metric of performance.
As described in the quoted paragraph from Sweet et al. (2017), above, we arrived at the 2.5 m upper bound by synthesizing a number of lines of evidence from numerous studies, as well as the IPCC AR5, to individually interrogate the physically possible ranges of the contributing components to global-mean sea level rise. This was new evidence, and/or new synthesis of that evidence, since Pfeffer et al. (2008), the study that helped define the physically possible upper bound for a preceding U.S. interagency sea level rise scenarios report (Parris et al., 2012).
All of these studies predated the publication of DeConto and Pollard (2016); we had already decided on the 2.5 m upper bound, and completed most of the work of developing the global and regional scenarios, before that paper was published. As just one example, Kopp et al. (2014) estimated 2.45 m as the 99.9 percentile outcome for global-mean sea level rise in 2100 under RCP8.5. Once DeConto and Pollard (2016) was published, we added it to our citation list as another piece of evidence, but the conclusions of that paper had no influence on our choice of 2.5 m as the upper bounding scenario. The successor report to Sweet et al. (2017), i.e., Sweet et al. (2022), stated this clearly, as well (e.g., see page 11): “In Sweet et al. (2017), these scenarios were developed to span a range of 21st-century GMSL rise from 0.3 m to 2.5 m. Sweet et al. (2017) built these scenarios upon the probabilistic emissions scenario–driven projections of Kopp et al. (2014).”
The bottom line is that, if DeConto and Pollard (2016) had never been published, we would have written exactly the same report at the time that we wrote it.
In closing, I wanted to note that, on the initiative of one of the authors of this brief communication (DB), he and a number of others of us (including myself and Kopp, as well as DeConto) spent substantial time in productive discussions of the very points I have just summarized, and related topics, in the broader context of the nuances of using cutting-edge sea level rise science to support decision-making. These extensive discussions following the publication of Sweet et al. (2017) resulted in an AGU presentation by DB (see https://par.nsf.gov/servlets/purl/10066643), and a written summary of our engagement (see https://acwi.gov/climate_wkg/minutes/final_agu_consensus_statement_probabilisitic_projections_dec_2017.pdf), both of which reflected a useful integration of our diversity of perspectives as scientists and practitioners.
Citation: https://doi.org/10.5194/egusphere-2024-534-CC2
RC1: 'Reply on CC2', Chris P. Weaver, 09 Apr 2024
[Converting my community comment into a formal review comment, at the Editor's request]
The authors should remove language stating or implying a reliance of Sweet et al. (2017) on DeConto and Pollard (2016). That would be a good first step in helping the paper be considered for publication. Note that I do not, in any way, have any objection to the authors disagreeing with the decision in Sweet et al. (2017) to use 2.5m globally by 2100 as the top-end, bounding scenario on other grounds. Such a disagreement would simply have to be justified in terms of the totality of references and lines of evidence summarized above, absent any reliance on DeConto and Pollard, as well as the stated purpose of the use of a limiting upper-bound scenario in that report - in other words, the choice to include 2.5m not because it is at all likely, but precisely because it is very, very unlikely.
Finally, while my main concern is helping the authors correct this particular error, I do also largely agree with the criticisms outlined in Community Comment 1 (CC1: '"Actionable" for whom, in what decision context?', Robert Kopp, 15 Mar 2024). It would be good to see the authors respond to and/or address those in their revision.
I appreciate the authors spending the time and effort to grapple with these issues in the literature. I continue to be very supportive of having these types of issues and ideas discussed, and I believe the continuation of the dialogue through this paper is valuable.
Citation: https://doi.org/10.5194/egusphere-2024-534-RC1
CC3: 'Comment on egusphere-2024-534 Defining the rules so we know when to break them', Rajashree Datta, 11 Apr 2024
Actionable science calls for a higher standard for communication of science, including rating findings for reliability (as suggested by the authors). This is both because (a) even a great single paper typically only addresses a portion of earth system components and because (b) guidelines actually help articulate exceptions.
On the complexity of scientific findings vs. adaptation
Presumably, we would not present decision-makers with non-peer-reviewed SLR estimates and expect them to judge their merit based on the specific decision context. If we accept the current social production of science, which is peer review (itself imperfect), then a higher standard for “actionable science” is simply a logical extension.
Keller et al., 2021 (mentioned by Dr. Kopp) specifically discusses the importance of “Linking the Required Disciplines” in the management of climate risk. As an example: climate change introduces both direct impacts of SLR on coastlines and enhanced weather extremes inland. The impact on weather extremes is not typically the purpose of any singular scientific paper strictly focused, for example, on the impacts of ice sheet loss on SLR. To focus on the potential opportunity cost: if efforts are focused on coastlines now in response to an extreme SLR scenario, what of the potential loss of funding for adapting to extreme weather scenarios inland? This problem is more acute in regions with fewer resources than the one discussed here (New York City), which may benefit from clearer communication of the scientific consensus.
On guidelines and exceptions
Importantly, the authors do not advocate limiting novel science, but rather avoiding its misrepresentation, i.e., “It is better to discuss such a claim, including the gaps in the evidence, than to disregard it.” In another comment, Dr. Kopp suggests (in summary) that it is critical to present the long tail and leave room for a dynamic response, even where evidence is lacking, based on the extent of potential risk (and the associated benefit of more extensive adaptation). I see no meaningful contradiction between the need for a guideline presented by the authors and the presence of exceptions. In fact, the “exceptionality” here is still defined in reference to some guideline and underlying rationale, thus underlining the need for the guideline.
The IPCC acts as a first, but possibly inadequate, level of synthesis. In fact, the purview of the IPCC has expanded over time precisely to accommodate evolving needs. A continued discussion of not just the “what” of actionable science, but also the “why” (the philosophical underpinning, as this paper explores) can inform precisely when it is important to break with consensus and to focus research into novel claims. This is particularly true because, while the presumed audience for IPCC predictions and adaptation recommendations is “decision-makers”, there is a far larger population which needs to understand the philosophy governing adaptation priorities (and has less time to examine footnotes in the IPCC and specific case studies): voters.
Citation: https://doi.org/10.5194/egusphere-2024-534-CC3