This preprint and the supplement are distributed under the Creative Commons Attribution 4.0 License.
Validating a microphysical prognostic stratospheric aerosol implementation in E3SMv2 using the Mount Pinatubo eruption
Abstract. This paper describes the addition of a stratospheric prognostic aerosol (SPA) capability – developed with the goal of accurately simulating aerosol formation following explosive volcanic eruptions – in the Department of Energy (DOE) Energy Exascale Earth System Model, version 2 (E3SMv2). The implementation includes changes to the 4-mode Modal Aerosol Module microphysics in the stratosphere to allow for larger particle growth and a more accurate stratospheric aerosol lifetime following the Mt. Pinatubo eruption. E3SMv2-SPA reasonably reproduces stratospheric aerosol lifetime, burden, and aerosol optical depth when compared to remote sensing observations and the interactive chemistry-climate model CESM2-WACCM. Global stratospheric aerosol size distributions capture the nucleation and growth of sulfate aerosol from SO2 injected by both major and minor volcanic eruptions from 1991 to 1993. Modeled aerosol effective radius is consistently lower than in satellite and in-situ measurements (maximum differences of ~30 %). Comparisons with in-situ size distribution samples indicate that this underestimation is due to both E3SMv2-SPA and CESM2-WACCM simulating overly small accumulation- and coarse-mode aerosol 6–18 months post-eruption, with E3SMv2-SPA simulating roughly 50 % of the observed coarse-mode geometric mean diameter 11 months post-eruption. Effective radii from the models and observations are used to calculate offline scattering and absorption efficiencies to explore the implications of smaller simulated aerosol size for the Mt. Pinatubo climate impacts. Scattering efficiencies at wavelengths of peak solar irradiance (~0.5 µm) are 10–80 % higher for daily samples in the models relative to observations through 1993, suggesting higher diffuse radiation at the surface and a larger cooling effect in the models. Absorption efficiencies at the peak wavelengths of outgoing terrestrial radiation (~10 µm) are 15–40 % lower for daily samples in the models relative to observations, suggesting an underestimation of stratospheric heating in the models. The similar performance of CESM2-WACCM and E3SMv2-SPA makes E3SMv2-SPA a viable alternative for simulating climate impacts from stratospheric sulfate aerosols.
Notice on discussion status
The requested preprint has a corresponding peer-reviewed final revised paper. You are encouraged to refer to the final revised version.
Interactive discussion
Status: closed
CEC1: 'Comment on egusphere-2023-3041', Juan Antonio Añel, 26 Jan 2024
Dear authors,
Unfortunately, after checking your manuscript, it has come to our attention that it does not comply with our "Code and Data Policy".
https://www.geoscientific-model-development.net/policies/code_and_data_policy.html
You have archived your code on GitHub. However, GitHub is not a suitable repository for scientific publication. GitHub itself instructs authors to use other alternatives for long-term archival and publishing, such as Zenodo. Therefore, please publish your code in one of the appropriate repositories and reply to this comment with the relevant information (link and DOI) as soon as possible, as it should be available before the Discussions stage. Otherwise, if you do not fix this problem, we will have to reject your manuscript for publication in our journal. I should note that your manuscript should not have been accepted in Discussions, given this lack of compliance with our policy. Therefore, the current situation with your manuscript is irregular.
Also, you must include in any potentially revised version of your manuscript a modified 'Code and Data Availability' section containing the DOI of the code (and another DOI for the dataset, if necessary).
Please remember to include a license for your code in the new repository. If you do not include a license, the code remains your property, and nobody else can use it. Therefore, when uploading the model's code to Zenodo, you may want to choose a free software/open-source (FLOSS) license. We recommend the GPLv3; you only need to include the file https://www.gnu.org/licenses/gpl-3.0.txt as LICENSE.txt with your code. You can also choose other options that Zenodo provides: GPLv2, Apache License, MIT License, etc.
Juan A. Añel
Geosci. Model Dev. Executive Editor
Citation: https://doi.org/10.5194/egusphere-2023-3041-CEC1
AC1: 'Reply on CEC1', Hunter Brown, 30 Jan 2024
Hello Juan,
Sorry about the incorrect referencing for the model code. I have uploaded our E3SM code to Zenodo at the following link (https://doi.org/10.5281/zenodo.10593881) and have modified the code availability statement to read as follows:
" 6. Code Availability
The model code base used to generate E3SMv2-SPA and E3SMv2-PA – along with information on how to access the publicly available CESM2-WACCM code base – can be found on Zenodo at https://doi.org/10.5281/zenodo.10593881. Plotting and processing scripts used in the analyses of this paper can be found on Figshare at https://doi.org/10.6084/m9.figshare.24844815.v1."
Thank you and let me know if you need anything else.
-Hunter
Citation: https://doi.org/10.5194/egusphere-2023-3041-AC1
CEC2: 'Reply on AC1', Juan Antonio Añel, 31 Jan 2024
Dear authors,
Thanks for publishing your code in Zenodo. However, the extension of the file you uploaded containing the code is wrong. You have named the file "sandia_e3sm_spa_code.gz", but it is not a gzip-compressed file, just a plain tarball. See below:
$ file sandia_e3sm_spa_code.gz
sandia_e3sm_spa_code.gz: POSIX tar archive
This can lead people who try to access the file to fail to open it, because their operating system or environment will not recognize the file correctly, producing an error when opening it. Therefore, please upload a new version of your file to the repository (and post the link here and update the Code Availability section accordingly), named with the .tar extension. Another option is to keep the name but upload a compressed ZIP file.
Many thanks,
Juan A. Añel
Geosci. Model Dev. Executive Editor
Citation: https://doi.org/10.5194/egusphere-2023-3041-CEC2
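For readers who want to repeat the check described above, a minimal sketch using only the Python standard library is given below (an illustration added for convenience, not part of the submission; it assumes it is run in the directory containing the uploaded file):

import gzip
import shutil
import tarfile

src = "sandia_e3sm_spa_code.gz"  # filename as uploaded to Zenodo

# A gzip file starts with the magic bytes 1f 8b; a plain tar archive does not.
with open(src, "rb") as f:
    is_gzip = f.read(2) == b"\x1f\x8b"
print(tarfile.is_tarfile(src), is_gzip)  # True, False -> an uncompressed tar mislabelled as .gz

# Option 1: keep the contents and simply correct the extension.
shutil.copy(src, "sandia_e3sm_spa_code.tar")

# Option 2: produce a genuinely gzip-compressed archive (.tar.gz).
with open(src, "rb") as f_in, gzip.open("sandia_e3sm_spa_code.tar.gz", "wb") as f_out:
    shutil.copyfileobj(f_in, f_out)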
AC2: 'Reply on CEC2', Hunter Brown, 01 Feb 2024
Sorry about that. I have changed it to a .tar file and have updated the code availability comment with the correct Zenodo version link:
" 6. Code Availability
The model code base used to generate E3SMv2-SPA and E3SMv2-PA – along with information on how to access the publicly available CESM2-WACCM code base – can be found on Zenodo at https://doi.org/10.5281/zenodo.10602682. Plotting and processing scripts used in the analyses of this paper can be found on Figshare at https://doi.org/10.6084/m9.figshare.24844815.v1."
Let me know if there are any additional issues.
Best, Hunter
Citation: https://doi.org/10.5194/egusphere-2023-3041-AC2
RC1: 'Comment on egusphere-2023-3041', Zachary McGraw, 18 Feb 2024
Brown et al present a detailed and well-written documentation of their modifications and validation of E3SM, which I believe following some mostly modest textual changes can be ready for publication. Replicating the 1991 Mt. Pinatubo eruption is an important test case for climate models having interactive stratospheric microphysics. These models exist partly to understand the aerosol development and evolution during observed eruptions, and more commonly as a means to evaluate cases that lack aerosol data that otherwise could be prescribed into simpler model versions (e.g. eruptions of the distant past, solar radiation modification). As Pinatubo is the most clearly observed case of stratospheric aerosols altering global climate, being able to reasonably replicate its aerosol layer is essential for establishing credence in further experiments. The presented E3SM version is not flawless here and the main code changes are reboots from a related model (Mills et al, 2016). But documenting and testing this version is important for the interpretation of future model uses, remaining biases are honestly presented, and this does replicate aerosol properties slightly above average compared to other models (cf ISA-MIP). I believe it’s important that the authors add discussion of how the reported issues can affect future uses of the model, and better delineate what is and is not verified by this evaluation, which is complicated by reliance on offline radiation calculations instead of full reliance on E3SM, as well as the use of nudged winds. I also think the authors can better explain the study’s purpose and clarify some methodological choices, but if the authors make a reasonable effort to address these comments (detailed below) I think this can be suitable for publication relatively quickly, and so have selected minor revisions.
Main comments
The study reports multiple reasons to expect this model version will overstate stratospheric aerosol impacts on climate but does not include a paragraph explaining the ramifications for future uses. I expect this validation study was made largely to be cited by future studies on non-Pinatubo experiments as a reason to have confidence in the model version, or could be even if not the intention. Biases are hence important to put into context. The two issues of overly small aerosols and ammonium sulfate optical constants (to represent sulfuric acid, which absorbs more LW) will both exaggerate the surface cooling, as well as the precipitation response to a cooler lower troposphere. For a case of roughly similar stratospheric mass as Pinatubo (potentially including SRM), the results suggest this model version would give cooling – which is not presented here – on the edge of what should be acceptable as ‘scientifically ready’ (I approximate a net forcing bias of ~25-50% based on these two biases). If this model version were used for an eruption multiple times the mass of Pinatubo – and maybe there is no intent for this, but an external user could presumably do so on their own – the net forcing bias would balloon, due to the shortwave and longwave effects becoming both very large and more closely offsetting one another. There’s a long tradition in the volcano and SRM communities of using models with strongly exaggerated aerosol forcings as the basis for arguments on dire consequences (or detectability) of stratospheric aerosols, so it’s important for caveats to be laid out at this early stage. Could the authors please add a paragraph (~5-8 sentences) to the Discussion section to guide future users on what their results imply, putting into a useful context some of the issues mentioned here? Also please make clear what is and is not verified in this study, as E3SM’s stratospheric winds and radiation scheme being sidelined in these experiments complicates the ability for studies on future experiments to point to this study as validation of model reliability. I feel the authors have been transparent on their specific results, which is commendable, but just need to tie things together for future users and readers of upcoming works that use this model version.
I also think this manuscript would be easier to appreciate if the authors add 3-5 sentences explaining the purpose of this study and model version in the Introduction, as currently this is extremely brief (lines 64-66). First off, please explain more clearly why the focus is on the Pinatubo test case and why interactive aerosol microphysics is useful to represent. This model version is surely not meant to just replicate Pinatubo’s impacts, for which interactive aerosols do not need to be simulated because there are satellite retrievals of extinction and size. I offer some reasoning on why interactive models and the Pinatubo test case are important in this review’s first paragraphs – maybe the authors’ reasons are different or they are unable to reveal specifics, but it would be good to see more explanation here. Second, is this version only for very specific stratospheric aerosol experiments by a small group, or can (and should?) anyone familiar with E3SM easily run the “SPA” version in the GitHub or make the modifications themselves for diverse stratospheric aerosol cases? Third, the authors made an effort by retuning to get the troposphere right, so I’d like to see some statement on whether the authors view this as satisfactory for a full experiment including tropospheric and stratospheric responses (e.g. historical runs), or if this is unknown as more validation would be needed. Anything the authors can contribute to give the reader a better sense of this model version’s reason for existence and its suitable uses would be appreciated.
The manuscript would be more useful if it included the magnitudes of shortwave, longwave, and net forcings, as well as stratospheric warming, all of which should be standard E3SM outputs, attainable as eruption years minus the pre-eruption period. So why not show these or any results that are a function of the model’s radiation code besides diffuse and direct radiation? Is this because the optical constants for sulfuric acid aren’t well represented, or E3SM’s radiation scheme isn’t yet set up to feed in interactive stratospheric composition? It’s understandable if only the aerosol properties are being verified within the scope of the present study, and possibly there are pertinent issues with the E3SM radiation scheme that are difficult to resolve. But to get no explanation is frustrating for the reader wanting to know whether or not this model can reliably replicate stratospheric sulfate’s climate impacts, and even more so for anyone trying to figure out if they want to use this model. As this is a GMD article on stratospheric aerosols in a climate model, can the authors at least be upfront in the manuscript (~2-3 sentences in Methods and/or Results) about why they don’t show the most climate-relevant outputs?
Specific comments
38. Can the authors please word the CESM2-WACCM part of this sentence a bit better, as it would seem obvious these would give similar results. CESM2-WACCM is an odd model choice for comparison, given it has much of the same code as E3SM so doesn’t serve as much of a benchmark.
50. It seems odd to tout this E3SM model as a useful alternative to an older model that uses much of the same code, including the same or very similar aerosol scheme. I would word this more logically or just focus on the E3SM-to-observation comparison here.
60. “net primary productivity of plants” or similar, as “productivity” alone is vague.
68. I would say you’re only “validating” against the observations, and separately that you’re comparing against CESM, as CESM is just a model whereas the observations are – despite their own flaws and uncertainties – the standard approximation of truth. Can the authors also please briefly explain their choice of CESM2-WACCM here? Showing the ISA-MIP Pinatubo models (Quaglia et al, 2023, already cited in the manuscript) would have given a better impression of this model’s performance against its peers. I understand that replicating an already verified model having many of the same features serves as a sanity check (and maybe some of the unique aspects of E3SM lead to improvement?), but this is worth a 1-line explanation in the text.
70. Saying “most climate models” use GloSSAC isn’t accurate. For some historical eruptions it’s an option, but then there’s the CMIP dataset cited elsewhere in this study (SAGE-3λ). And for eruptions in the distant past or hypothetical eruption cases, simplified forcing generators like Easy Volcanic Aerosol (Toohey et al., 2016) are now the standard option. I would just amend this into a more general statement.
Toohey, M., Stevens, B., Schmidt, H., & Timmreck, C. (2016). Easy Volcanic Aerosol (EVA v1. 0): an idealized forcing generator for climate simulations. Geoscientific Model Development, 9(11), 4049-4070.
82. “more complete approach”, maybe. But mostly these interactive models are used for cases where we lack suitable observations, or are isolating a particular microphysical effect (since we observe properties, but not processes). I think the need for good interactive aerosol models can be described better, which would also help tout why the rigorous technical work here is useful (see main comments).
107. Please summarize around this line the model version’s purpose (see main comments). Can anyone use this scheme? For what purposes is it suitable and for what experiments does this validation apply?
110-111. I’m not convinced that this study pays “more attention to […] global and regional climate impacts” than the cited Mills et al 2017 study. That study actually shows radiative forcings and surface temperature responses, while this one does not. I think there are novel aspects to this study – the radiation sections are for instance quite different from the two Mills et al studies – and that the authors should more accurately represent their uniqueness here.
158. It looks from Table S1 like the mode size cutoffs are exactly the same as in the Mills et al study, and the only difference is the dust and sea salt tuning. Can the authors please make this clear in the manuscript text?
180. It’s a bit odd the authors don’t show any maps of tropospheric aerosol to back these statements, though I’m willing to accept this is outside the range of this study. Can the authors at least make a statement if the model as modified here is ready for experiments where both the troposphere and stratosphere are important, or if further validation effort is needed then please say so. Is there any reason not to use this over the current E3SM, beside maybe the unavailability of long control runs? As is I feel anyone who reads this thinking they may want to use the model would be pretty lost.
204. We’re only seeing results with nudged winds, so have no idea whether to trust E3SM’s stratospheric dynamics. I get that this is outside the scope of this study, but have these been verified? If known, could the authors provide a line on how E3SM’s stratospheric winds perform and maybe a citation on this?
214. What about Cerro Hudson and the other non-Pinatubo eruptions mentioned later in the text? Please say at least that these are included based on the same SO2 dataset, if so.
219. I feel “E3SMv2-presc” would be more suitably given a fuller (1-2 line) mention in the text, as it’s odd to leave all but a brief mention of its existence to a table.
244. Please specify that it’s 75% H2SO4 + 25% H2O “by mass”, as by volume would be different.
278. As above, 75% H2SO4 “by mass”
290. Optional, but it may or may not be worth mentioning the instrument saturation issue that occurred during Pinatubo. This was for instance mentioned in the already cited Quaglia et al, 2023 study.
312. The ammonium sulfate assumption deserves more description. The validation here barely uses/tests E3SM’s actual radiation code (mostly relying on external Mie calculations instead), but this would be an issue for future uses of the model that do, so I think deserves more mention. First off, the imaginary refractive indices are higher for sulfuric acid than ammonium sulfate (see for example a comparison in Gosse et al, 1997). This would bias low the longwave effect, driving the model to cause too much surface cooling. In our own evaluations (not published), we found switching Pinatubo aerosol from ammonium sulfate to sulfuric acid optics increased the longwave forcing by ~50% and reduced the net forcing by 10-15%. Second I wonder if the ammonium sulfate assumption increases density and fallout of the aerosols, which would affect the aerosol properties shown here? Can the authors please comment on this and add a line or two to the text to guide anyone interested in using this model?
Gosse, S. F., Wang, M., Labrie, D., & Chylek, P. (1997). Imaginary part of the refractive index of sulfates and nitrates in the 0.7–2.6-µm spectral region. Applied optics, 36(16), 3622-3634.
323. Please add a line explaining the improvement from E3SM-PA to E3SM-SPA for sulfur burden.
388. Please clarify here whether Cerro Hudson and the other small eruptions are included in all simulations.
392. An important question is why the aerosol size is persistently too small in all models used here. It could be that none of the models include enhancement of coagulation by Van der Waals forces. This was reported to drive a sizable increase in aerosol size in a paper by English et al. (2013) that is already cited, and may be worth mentioning in this paragraph (and checking that it isn’t in CESM2-WACCM as used here). For reference, the equations needed to add this are presented in more detail in a study by Sekiya et al. (2016). It’s very possible there are other factors, and for one I wonder if the mass of ammonium sulfate (a molar mass ~35 % higher than that of H2SO4; see the worked check after the reference below) is connected to gravitational settling in a way that would make the coarse particles fall out faster than they should. I’m not sure this needs to be discussed within the manuscript (though it could be helpful to someone wanting to improve the model further), but I hope the authors can share a bit of thinking on this.
Sekiya, T., Sudo, K., & Nagai, T. (2016). Evolution of stratospheric sulfate aerosol from the 1991 Pinatubo eruption: Roles of aerosol microphysical processes. Journal of Geophysical Research: Atmospheres, 121(6), 2911-2938.
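For reference, the ~35 % figure mentioned in the comment above follows directly from the molar masses (a worked check added for clarity, not taken from the manuscript):

$$\frac{M_{(\mathrm{NH}_4)_2\mathrm{SO}_4}}{M_{\mathrm{H}_2\mathrm{SO}_4}} = \frac{132.14\ \mathrm{g\,mol^{-1}}}{98.08\ \mathrm{g\,mol^{-1}}} \approx 1.35,$$

so aerosol treated as dry ammonium sulfate carries roughly 35 % more mass per mole of sulfur than pure sulfuric acid.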
395. I wonder if neglecting volcanic ash also has a size influence through lofting, as Stenchikov et al showed in a more recent paper that including ash is the only way to get the aerosol plume to form at an appropriately high level of the stratosphere. This could conceivably slow coagulation by spreading the aerosol out vertically, which would keep aerosol smaller (though I haven’t seen this tested). Maybe not worth mentioning in the text, but if the authors do further tests it could be worth considering.
Stenchikov, G., Ukhov, A., Osipov, S., Ahmadov, R., Grell, G., Cady‐Pereira, K., ... & Iacono, M. (2021). How does a Pinatubo‐size volcanic cloud reach the middle stratosphere?. Journal of Geophysical Research: Atmospheres, 126(10), e2020JD033829.
511. Clarify that the curves are the size modes, please: “dN/dlog D size modes (curves)” or similar
516. Please explain in the first paragraph why you chose to present output from offline Mie code instead of standard model output involving E3SM’s radiation scheme. Are the radiative fundamentals worth an in depth dive here? Is this something novel compared to other model validations? I think this is acceptable, and the benefits of this work are worth advertising better. However, there’s certainly a drawback that we aren’t given enough information from the actual model to have confidence in its ability to produce reliable shortwave scattering (or other radiative effects), which is really what I’d expect in an interactive aerosol model validation study. So a brief explanation is expected.
516. Please also remind the reader that this output is from an offline Mie scattering routine and perhaps link them to Appendix B.
555. What are dotted vs solid curves in Fig. 7? Different modes?
559. Please add a line here to tell the reader why we should care that the model can replicate diffuse and direct radiation breakdown. The relevance for plants is listed extremely briefly in the Introduction, but should be here in slightly more detail (mentioning the influence of radiation type on shadow experienced by plants and photosynthesis, for instance).
564. So unlike all other radiation output in this study, here it actually uses E3SM output and is not just fitting aerosol properties into a radiative transfer model? Why not just show the actual shortwave, longwave, and net forcings, which are the main indicator of stratospheric aerosol impacts on climate? Wouldn’t this be worth being shown in this GMD study, even if there are some remaining issues?
560. The transition between sentences feels like an incomplete comparison. I would amend it to “More substantially, the forward scattered SW […]” or similar.
581. Certainly these quantities are linked, but I think AOD being a “good indicator” of diffuse radiation is unrealistic given the curves have different shapes and can peak months apart.
613. Could the authors please add how the weaker “wavelength dependence of Qs” relates to there being a weaker longwave absorption Reff sensitivity than for shortwave reflection? This is best seen with an x-axis of Reff for fixed wavelengths, as in Fig. 1a of Lacis 2015, but is directly related to Figs. 7 & 10 here via the size parameter. I think the authors’ method of going directly from aerosol properties to the fundamentals of radiative effects is informative, but as this is a GMD climate model validation I think they could better connect this to radiative forcing and climate response. I recommend they take a look at this short Lacis paper that very succinctly puts Qa and Qs into context.
Lacis, A. (2015). Volcanic aerosol radiative properties. Past Global Changes Magazine, 23, 51-51.
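As context for the Qs and Qa discussion above, the sketch below shows one way such efficiencies can be computed offline as a function of effective radius at the two wavelengths of interest (~0.5 and ~10 µm). It treats each effective radius as a single sphere, which is a simplification of the distribution-integrated calculation the paper describes; it assumes the third-party miepython package; and the complex refractive indices are illustrative placeholders rather than the values used by the authors:

import numpy as np
import miepython  # third-party Mie solver, assumed to be installed

# Illustrative refractive indices (convention n - ik, negative imaginary part = absorbing);
# placeholders only, not the sulfate values used in the manuscript.
m_assumed = {0.5: 1.43 - 1e-8j,   # nearly non-absorbing in the visible
             10.0: 1.8 - 0.5j}    # strongly absorbing near the terrestrial IR peak

r_eff_um = np.linspace(0.1, 1.0, 10)  # effective radii spanning post-Pinatubo estimates

for lam_um, m in m_assumed.items():
    x_eff = 2.0 * np.pi * r_eff_um / lam_um          # effective size parameter
    qext, qsca, qback, g = miepython.mie(m, x_eff)   # Mie efficiencies for each size parameter
    qabs = qext - qsca                               # absorption efficiency
    print(f"lambda = {lam_um} um: Qs = {np.round(qsca, 2)}, Qa = {np.round(qabs, 2)}")

Plotting Qs for the 0.5 µm case and Qa for the 10 µm case against effective radius connects the size-parameter argument above to the figures being discussed.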
613. It’s worth pointing out in the text that Qs and Qa apply to different frequencies and this should be noted when comparing Figs. 7 & 10 (citing the vertical lines). And it would be nice to get a 1-line explanation of the most clear difference between these figures: in Fig. 7 (Qs) differences are right-left, while in Fig. 10 (Qa) they are up-down.
626. Modeling studies can add H2Ov during an eruption to simulate (poorly constrained) direct volcanic emission of water vapor, which would hydrate the stratosphere. May or may not be worth mentioning here that this might have alleviated the issue.
626. (but really the Supplement) The 3 panels in Fig. S11 appear too close together, partly covering the panel titles (“global”, etc). I appreciate that the authors show this data.
637. As in the Fig. 7 caption, the Fig. 10 caption doesn’t say what the solid vs dotted curves are.
641. What is “normalized” here? Maybe this is the same normalization as earlier in the study, but please define it here or at least cite that it is as previously stated.
647. Since – as stated – E3SMv2 has no LWH, please remove it from the legend of Fig. 11. I found myself looking for it but it simply isn’t there.
641. Where do these longwave heating rates come from? The expectation would be that these are from E3SM itself but this does not appear so. Is it a simple equation involving the Qa’s from the previous section? It could be worth showing this, but more definitely there should at least be a small description.
649. Radiative heating rates cannot be directly observed, though there are observations of stratospheric temperature increases. The already cited Mills et al., 2016 study includes a comparison between modeled and radiosonde post-Pinatubo stratospheric temperatures. It may or may not make sense to cite this component of the Mills study here.
667. I wonder if the mass burden differences involve the type of sulfate in each model. Do both E3SMv2-SPA and CESM2-WACCM use ammonium sulfate? Could this bias the aerosols toward being too heavy?
689. I’d prefer if the wording were “this suggests that the models will overestimate” instead of “indicates” they “may” do so, as the results show pretty clearly to expect a bias in this direction.
695. Please add a paragraph on to what extent direct use of this model – or methods based on the aerosol properties the model simulates – could result in biased evaluations of climate responses, along with other statements that can aid interpretation of results from future uses of this scheme (see main comments). The authors could make a recommendation only to use E3SMv2-SPA for very similar experiments as performed here (Pinatubo-sized eruptions, nudged winds, little reliance on E3SM's radiation scheme), as certainly the more dissimilar the experiment the less the validation applies. But I expect there could be interest in further uses, so the authors would be well suited to preemptively give guidance (e.g. what uses are suitable, what biases are pertinent, are climate responses trustworthy).
703. Which refractive indices are “assumed across observations and models”? Are these ammonium sulfate or sulfuric acid? Is there a reference to cite?
765. Reff is area mean radius. Please reconcile this.
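For clarity, the standard definition referred to here (stated as context, not quoted from the manuscript) is the area-weighted mean radius, i.e. the ratio of the third to the second moment of the number size distribution:

$$r_\mathrm{eff} = \frac{\int_0^\infty r^{3}\, n(r)\, \mathrm{d}r}{\int_0^\infty r^{2}\, n(r)\, \mathrm{d}r}.$$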
780. The GitHub for E3SMv2-SPA doesn’t show any indication of being particularly for this stratosphere-optimized version. Is there a particular git branch that should be used?
Technical comments
38. The comma that precedes “CESM2-WACCM” seems unnecessary/odd.
40. “too small of accumulation […] mode” to “overly small accumulation […] mode”.
120. “Simlations” to “Simulations”
138. Totally optional, but maybe spelling out the experiment names could help the reader remember what’s what? I assume “PA” is “prognostic aerosol”?
154. Also optional, but as above, is SPA “stratosphere-optimized prognostic aerosol” experiment or something similar? Could be better to spell this out than letting the reader wonder.
178. “Dg,low” has an obvious meaning given “Dg,hi” is defined above, but this really should also be spelled out before use.
201. The line has some grammatical issues. I would switch “2022) where” to “2022, with”, and then in the following line switch “use” to “using”.
250. (and also 253, 315, etc) Please just ensure to fix all “SAGE-3l” mentions to “SAGE-3λ” by the time this is published.
301. “from the global to the microphysical” sounds like the authors are starting with global and ending with microphysical, where really everything’s jumbled together. Not critical, but maybe can be reworded to avoid this confusion (“across scales global to microphysical” or similar).
304. Reff should be defined within the text before being used here.
304. “small bias” to “bias toward small size” or similar.
381. “mid 1992” to “mid-1992”
390. “Identical” to “identical”
392. “the models” to “these models”
402. “Theselarger” to “These larger”
408. “1993-02” looks a bit awkward in the text (like 1993-2002). “February of 1993” looks nicer, though I’m fine either way as “1993-02” is what’s stated on the figure for brevity.
520. Please rectify that “xeff” is not defined in the text before its use here. It is defined in a figure caption later in the paper.
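Presumably this is the effective size parameter relating effective radius and wavelength; the standard definition, stated here as an assumption about the figure caption's usage, is

$$x_\mathrm{eff} = \frac{2\pi\, r_\mathrm{eff}}{\lambda}.$$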
558. Maybe switch “different” to “presented” or similar, as you already start the sentence with “the differences”.
614. n_Hess isn’t defined in the text, only in the Appendix. Please define it before use.
638. No space in “long wave” for consistency
676. Can the word “also” just be added to make clear that this is a different comparison than the instrument validation: “E3SMv2-SPA also has slightly smaller Reff […]”
1004. “teh” to “the”
1005. Excess space in “SO 2”
Citation: https://doi.org/10.5194/egusphere-2023-3041-RC1
AC4: 'Reply on RC1', Hunter Brown, 23 Apr 2024
RC2: 'Comment on egusphere-2023-3041', Anonymous Referee #2, 21 Feb 2024
Review of “Validating a microphysical prognostic stratospheric aerosol implementation in E3SMv2 using the Mount Pinatubo eruption” by Hunter Brown et al.
The authors present the development of a stratospheric prognostic aerosol (SPA) capability for the Energy Exascale Earth System Model, version 2 (E3SMv2) to simulate stratospheric aerosol formation in the aftermath of large explosive volcanic eruptions. Their implementation includes changes to the 4-mode Modal Aerosol Module microphysics to allow for larger particle growth and a more accurate stratospheric aerosol lifetime following the Mt. Pinatubo eruption. Brown et al. tested their model for the post-Pinatubo period against remote sensing and in situ observations and the interactive chemistry-climate model CESM2-WACCM. On the global scale, E3SMv2-SPA performs well compared to observational datasets and behaves similarly to CESM2-WACCM. They found that the modeled aerosol effective radius for both models is consistently lower than satellite and in-situ measurements (max differences of ~30%). Compared to observations, the models also show higher diffuse radiation at the surface, a larger cooling effect, and an underestimation of stratospheric heating.
Although the manuscript type is declared as a development and technical paper, the content should be placed in the general context of global stratospheric aerosol modelling; otherwise it should be published as a specific technical institute report. The introduction and discussion sections therefore need some substantial improvements. The motivation of the paper could be more clearly stated, and some of the results could be discussed in a broader context. I therefore recommend publication after major revisions; see below.
General comments
In the introduction important literature is missing. Several global stratospheric aerosol modelling studies have been published in recent years. An overview of the development and current state of stratospheric aerosol modelling can be found, for example, in Kremser et al. (2016) and in Timmreck et al. (2018). In addition, a number of comparative global aerosol modelling studies have been published, e.g. for background aerosol (Brodowsky et al., 2024), volcanic events (Marshall et al., 2018; Clyne et al., 2021; Quaglia et al., 2023) and artificial sulphur injections (Franke et al., 2021; Weisenstein et al., 2022). I was more than surprised that these studies were completely ignored by the authors. In particular, the results of the Pinatubo study by Quaglia et al. (2023) should be mentioned and discussed in the paper and not just used as a reference to observational data.
I wonder how model-specific your results are. How valuable are they to other stratospheric aerosol modellers? I am missing in the discussion section a dedicated paragraph on the strengths and weaknesses of the applied global aerosol models with respect to other global stratospheric aerosol models. Recent intercomparison studies of global aerosol models reveal several difficulties that the current generation of global aerosol models has to deal with. For example, the study by Quaglia et al. (2023), comparing the different model results with satellite observations after the eruption of Mt. Pinatubo, shows a stronger transport towards the NH extra-tropics, suggesting a much weaker subtropical barrier in all models. What does the spatial aerosol distribution in your model look like? It should be much better as you nudge the winds, so discrepancies can be traced back to other sources. This could be elaborated further with respect to free-running models. Nevertheless, it would be nice to also see a global distribution of your sulfate burden/AOD in the paper or in the supplement.
The motivation of the paper could be stressed a bit more. It is also not really clear to me how different your SPA version is from the MAM4 version in WACCM, apart from the model reversal and the simplified precursor chemistry.
The applied methodology is not sufficiently explained in the manuscript. I miss, for example, a detailed description of how you calculate a spatially averaged aerosol size distribution or effective radius, which is not straightforward. A "Methodology" subsection within section 2 would be helpful, with more details in the appendix. One possible route is sketched below.
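For modal schemes such as MAM4, one option (given here as a sketch of a likely approach rather than the authors' documented method) is to accumulate the second and third moments of each lognormal mode, weighted by mode number concentration, over the averaging region and take their ratio. For a single mode with geometric mean radius $r_g$ and geometric standard deviation $\sigma_g$, and for a sum over modes $i$:

$$r_\mathrm{eff} = r_g \exp\!\left(\tfrac{5}{2}\ln^{2}\sigma_g\right), \qquad r_\mathrm{eff}^{\,\mathrm{total}} = \frac{\sum_i N_i\, r_{g,i}^{3}\exp\!\left(\tfrac{9}{2}\ln^{2}\sigma_{g,i}\right)}{\sum_i N_i\, r_{g,i}^{2}\exp\!\left(2\ln^{2}\sigma_{g,i}\right)}.$$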
Specific comments
- Line 2: “…using observations after the Mt. Pinatubo eruption”
- Line 45: “Mt. Pinatubo” sometimes you use “Mt. Pinatubo” sometimes “Pinatubo” only, please be consistent
- Lines 49-51: The fact that a model produces similar results to another model does not per se make it a viable tool.
- Lines 95 ff: Concerning the advantages of sectional aerosol models, there is a recent paper by Tilmes et al. (2023) in GMD, where they describe a sectional aerosol microphysical model in CESM2 and compare it with the standard CESM2 version using the Modal Aerosol Module MAM4 for the Pinatubo episode. This paper should be cited and briefly discussed here as well.
- Line 148: What about sedimentation?
- Lines 172-174: Is there any reason why you did not take this process into account in your model?
- Lines 213-214: This is not what Kremser et al. (2016) wrote: “Recent modeling studies support lower stratospheric sulfur levels than those inferred from the TOMS and TOVS observations [Dhomse et al., 2014; Mills et al., 2016]. The difference between the initial and the persistent sulfur levels is important and generally supports a more complex development process following a major eruption than has been considered in the past.” (Kremser et al., 2016, page 12). Please cite them properly.
- Line 451 ff: Solar flux changes: you can also compare your model results to observations from the Earth Radiation Budget Experiment (ERBE) (Barkstrom, 1984; Barkstrom and Smith, 1986). This would be an additional approach.
- Line 494: Date of the Lascar eruption is not correct
- Line 571: I am wondering why you chose the following latitude bands and not (also) the location of Mauna Loa, where you have some data for a direct comparison.
- Lines 639-647: Does an integrated longwave heating rate really make sense here? Would it not be more useful to compare stratospheric temperature profiles, for which at least some observations are available, e.g. Free and Angell (2002) and Free and Lanzante (2009)?
- Lines 667-668: Here you can also refer to the work of Clyne et al. (2021) and Quaglia et al. (2023).
Figures:
- Figure 1: The figure caption is very short, lacks information, and is difficult to understand. For example, what does the grey shaded region indicate? What do the “mode sensitivity tests” mean? Are you referring to the global sulfate burden?
- Figure 2: Again, the gray shading?
- Figure 3: The citation of Quaglia et al. (2023) is misleading here, as it is a model intercomparison paper which uses the observational data for comparison and validation.
Table:
- Table 1: you can get rid of the third column and include the text in the table caption
Literature
- Barkstrom, B. R.: The Earth Radiation Budget Experiment (ERBE), Am. Meteorol. Soc., 65, 1170–1185, 1984.
- Barkstrom, B. R. and Smith, G. L.: The Earth Radiation Budget Experiment: Science and implementation, Rev. Geophys., 24, doi:10.1029/RG024i002p00379, 1986.
- Brodowsky, C., et al.: Analysis of the global atmospheric background sulfur budget in a multi-model framework, EGUsphere [preprint], https://doi.org/10.5194/egusphere-2023-1655, 2023.
- Clyne, M., et al.: Model physics and chemistry causing intermodel disagreement within the VolMIP-Tambora Interactive Stratospheric Aerosol ensemble, Atmos. Chem. Phys., 21, 3317–3343, https://doi.org/10.5194/acp-21-3317-2021, 2021.
- Free, M., and J. K. Angell, 2002: Effect of volcanoes on the vertical temperature profile in radiosonde data. J. Geophys. Res., 107,4101, doi:10.1029/2001JD001128
- Free, M., and J. Lanzante, 2009: Effect of Volcanic Eruptions on the Vertical Temperature Profile in Radiosonde Data and Climate Models. Climate, 22, 2925–2939, https://doi.org/10.1175/2008JCLI2562.1.
- Kremser, S., et al.: Stratospheric aerosol – Observations, processes, and impact on climate, Rev. Geophys., 54, 1–58,https://doi.org/10.1002/2015RG000511, 2016.
- Marshall, L., et al.: Multi-model comparison of the volcanic sulfate deposition from the 1815 eruption of Mt. Tambora, Atmos. Chem. Phys., 18, 2307–2328, https://doi.org/10.5194/acp-18-2307-2018, 2018.
- Quaglia, I., et al.: Interactive stratospheric aerosol models' response to different amounts and altitudes of SO2 injection during the 1991 Pinatubo eruption, Atmos. Chem. Phys., 23, 921–948, https://doi.org/10.5194/acp-23-921-2023, 2023.
- Tilmes, S., et al.: Description and performance of a sectional aerosol microphysical model in the Community Earth System Model (CESM2), Geosci. Model Dev., 16, 6087–6125, https://doi.org/10.5194/gmd-16-6087-2023, 2023.
- Timmreck, C., et al.: The Interactive Stratospheric Aerosol Model Intercomparison Project (ISA-MIP): Motivation and experimental design, Geosci. Model Dev., 11, 2581–2608, https://doi.org/10.5194/gmd-11-2581-2018, 2018.
- Toohey, M., Krüger, K., Niemeier, U., and Timmreck, C.: The influence of eruption season on the global aerosol evolution and radiative impact of tropical volcanic eruptions, Atmos. Chem. Phys.,11, 12351–12367, doi:10.5194/acp-11-12351-2011, 2011.
- Weisenstein, D. K., Visioni, D., Franke, H., Niemeier, U., Vattioni, S., Chiodo, G., Peter, T., and Keith, D. W.: An interactive stratospheric aerosol model intercomparison of solar geoengineering by stratospheric injection of SO2 or accumulation-mode sulfuric acid aerosols, Atmos. Chem. Phys., 22, 2955–2973, https://doi.org/10.5194/acp-22-2955-2022, 2022.
Citation: https://doi.org/10.5194/egusphere-2023-3041-RC2
AC3: 'Reply on RC2', Hunter Brown, 23 Apr 2024
Interactive discussion
Status: closed
-
CEC1: 'Comment on egusphere-2023-3041', Juan Antonio Añel, 26 Jan 2024
Dear authors,
Unfortunately, after checking your manuscript, it has come to our attention that it does not comply with our "Code and Data Policy".
https://www.geoscientific-model-development.net/policies/code_and_data_policy.html
You have archived your code on GitHub. However, GitHub is not a suitable repository for scientific publication. GitHub itself instructs authors to use other alternatives for long-term archival and publishing, such as Zenodo. Therefore, please, publish your code in one of the appropriate repositories, and reply to this comment with the relevant information (link and DOI) as soon as possible, as it should be available before the Discussions stage.In this way, if you do not fix this problem, we will have to reject your manuscript for publication in our journal. I should note that, actually, your manuscript should not have been accepted in Discussions, given this lack of compliance with our policy. Therefore, the current situation with your manuscript is irregular.
Also, you must include in a potentially reviewed version of your manuscript the modified 'Code and Data Availability' section, the DOI of the code (and another DOI for the dataset if necessary).
Please, remember including a license for your code in the new repository. If you do not include a license, the code continues to be your property, and nobody can use it. Therefore, when uploading the model's code to Zenodo, you could want to choose a free software/open-source (FLOSS) license. We recommend the GPLv3. You only need to include the file 'https://www.gnu.org/licenses/gpl-3.0.txt' as LICENSE.txt with your code. Also, you can choose other options that Zenodo provides: GPLv2, Apache License, MIT License, etc.
Juan A. Añel
Geosci. Model Dev. Executive EditorCitation: https://doi.org/10.5194/egusphere-2023-3041-CEC1 -
AC1: 'Reply on CEC1', Hunter Brown, 30 Jan 2024
Hello Juan,
Sorry about the incorrect referencing for the model code. I have uploaded our E3SM code to zenodo at the following link (https://doi.org/10.5281/zenodo.10593881) and have modified the code availability statement to read as follows:
" 6. Code Availability
The model code based used to generate E3SMv2-SPA and E3SMv2-PA – along with information for how to access the publicly available CESM2-WACCM code base – can be found on Zenodo at https://doi.org/10.5281/zenodo.10593881. Plotting and processing scripts used in the analyses of this paper can be found on Figshare at https://doi.org/10.6084/m9.figshare.24844815.v1."
Thank you and let me know if you need anything else.
-Hunter
Citation: https://doi.org/10.5194/egusphere-2023-3041-AC1 -
CEC2: 'Reply on AC1', Juan Antonio Añel, 31 Jan 2024
Dear authors,
Thanks for publishing your code in Zenodo. However, the extension of the file you uploaded containing the code is wrong. You have named the file "sandia_e3sm_spa_code.gz". However, the extension of the file is wrong, as it is not a file compressed with the ZIP algorithm but a simple tarball file. See below:
$ file sandia_e3sm_spa_code.gz
sandia_e3sm_spa_code.gz: POSIX tar archive
This can make some people try to access the file and can not do it because their operating system or environment will not be able to recognize the file correctly, leading to an error when opening it. Therefore, please, upload a new version of your file to the repository (and post here the link and update the Code Availability section accordingly), which should be named with the .tar extension. Another option is to keep the name but upload a compressed ZIP file.Many thanks,
Juan A. Añel
Geosci. Model Dev. Executive Editor
Citation: https://doi.org/10.5194/egusphere-2023-3041-CEC2 -
AC2: 'Reply on CEC2', Hunter Brown, 01 Feb 2024
Sorry about that. I have changed it to a .tar file and have updated the code availability comment with the correct zenodo version link:
" 6. Code Availability
The model code based used to generate E3SMv2-SPA and E3SMv2-PA – along with information for how to access the publicly available CESM2-WACCM code base – can be found on Zenodo at https://doi.org/10.5281/zenodo.10602682. Plotting and processing scripts used in the analyses of this paper can be found on Figshare at https://doi.org/10.6084/m9.figshare.24844815.v1."
Let me know if there are any additional issues.
Best, Hunter
Citation: https://doi.org/10.5194/egusphere-2023-3041-AC2
-
AC2: 'Reply on CEC2', Hunter Brown, 01 Feb 2024
-
CEC2: 'Reply on AC1', Juan Antonio Añel, 31 Jan 2024
-
AC1: 'Reply on CEC1', Hunter Brown, 30 Jan 2024
-
RC1: 'Comment on egusphere-2023-3041', Zachary McGraw, 18 Feb 2024
Brown et al present a detailed and well-written documentation of their modifications and validation of E3SM, which I believe following some mostly modest textual changes can be ready for publication. Replicating the 1991 Mt. Pinatubo eruption is an important test case for climate models having interactive stratospheric microphysics. These models exist partly to understand the aerosol development and evolution during observed eruptions, and more commonly as a means to evaluate cases that lack aerosol data that otherwise could be prescribed into simpler model versions (e.g. eruptions of the distant past, solar radiation modification). As Pinatubo is the most clearly observed case of stratospheric aerosols altering global climate, being able to reasonably replicate its aerosol layer is essential for establishing credence in further experiments. The presented E3SM version is not flawless here and the main code changes are reboots from a related model (Mills et al, 2016). But documenting and testing this version is important for the interpretation of future model uses, remaining biases are honestly presented, and this does replicate aerosol properties slightly above average compared to other models (cf ISA-MIP). I believe it’s important that the authors add discussion of how the reported issues can affect future uses of the model, and better delineate what is and is not verified by this evaluation, which is complicated by reliance on offline radiation calculations instead of full reliance on E3SM, as well as the use of nudged winds. I also think the authors can better explain the study’s purpose and clarify some methodological choices, but if the authors make a reasonable effort to address these comments (detailed below) I think this can be suitable for publication relatively quickly, and so have selected minor revisions.
Main comments
The study reports multiple reasons to expect this model version will overstate stratospheric aerosol impacts on climate but does not include a paragraph explaining the ramifications for future uses. I expect this validation study was made largely to be cited by future studies on non-Pinatubo experiments as a reason to have confidence in the model version, or could be even if not the intention. Biases are hence important to put into context. The two issues of overly small aerosols and ammonium sulfate optical constants (to represent sulfuric acid, which absorbs more LW) will both exaggerate the surface cooling, as well as the precipitation response to a cooler lower troposphere. For a case of roughly similar stratospheric mass as Pinatubo (potentially including SRM), the results suggest this model version would give cooling – which is not presented here – on the edge of what should be acceptable as ‘scientifically ready’ (I approximate a net forcing bias of ~25-50% based on these two biases). If this model version were used for an eruption multiple times the mass of Pinatubo – and maybe there is no intent for this, but an external user could presumably do so on their own – the net forcing bias would balloon, due to the shortwave and longwave effects becoming both very large and more closely offsetting one another. There’s a long tradition in the volcano and SRM communities of using models with strongly exaggerated aerosol forcings as the basis for arguments on dire consequences (or detectability) of stratospheric aerosols, so it’s important for caveats to be laid out at this early stage. Could the authors please add a paragraph (~5-8 sentences) to the Discussion section to guide future users on what their results imply, putting into a useful context some of the issues mentioned here? Also please make clear what is and is not verified in this study, as E3SM’s stratospheric winds and radiation scheme being sidelined in these experiments complicates the ability for studies on future experiments to point to this study as validation of model reliability. I feel the authors have been transparent on their specific results, which is commendable, but just need to tie things together for future users and readers of upcoming works that use this model version.
I also think this manuscript would be easier to appreciate if the authors add 3-5 sentences explaining the purpose of this study and model version in the Introduction, as currently this is extremely brief (lines 64-66). First off, to please explain more clearly why the focus is on the Pinatubo test case and why interactive aerosol microphysics is useful to represent. This model version is surely not meant to just replicate Pinatubo’s impacts, for which interactive aerosols do not need to be simulated because there are satellite retrievals of extinction and size retrievals. I offer some reasoning on why interactive models and the Pinatubo test case are important in this review’s first paragraphs – maybe the authors’ reasons are different or they are unable to reveal specifics, but it would be good to see more explanation here. Second, is this version only for very specific stratospheric aerosol experiments by a small group, or can (and should?) anyone familiar with E3SM easily run the “SPA” version in the GitHub or make the modifications themselves for diverse stratospheric aerosol cases? Third, the authors made an effort by retuning to get the troposphere right, so I’d like to see some statement on whether the authors view this as satisfactory for a full experiment including troposphere and stratospheric responses (e.g. historical runs), or if this is unknown as more validation would be needed. Anything the authors can contribute to give the reader a better sense of this model version’s reason for existence and its suitable uses.
The manuscript would be more useful if it included the magnitudes of shortwave, longwave, and net forcings, as well as stratospheric warming, all of which should be standard E3SM outputs, attainable as eruption years minus pre-eruption period. So why not show these or any results that are a function of the model’s radiation code beside diffuse and direct radiation? Is this because the optical constants for sulfuric acid aren’t well represented, or E3SM’s radiation scheme isn’t yet set up to feed in interactive stratospheric composition? It’s understandable if only the aerosol properties are being verified within the scope of the present study, and possibly there are pertinent issues with the E3SM radiation scheme that are difficult to resolve. But to get no explanation is frustrating for the reader wanting to know whether or not this model can reliably replicate stratospheric sulfate’s climate impacts, and even more so for anyone trying to figure out if they want to use this model. As this is a GMD article on stratospheric aerosols in a climate model, can the authors at least be upfront in the manuscript (~2-3 sentences in Methods and/or Results) about why they don’t show the most climate-relevant outputs?
Specific comments
38. Can the authors please word the CESM2-WACCM part of this sentence a bit better, as it would seem obvious these would give similar results. CESM2-WACCM is an odd model choice for comparison, given it has much of the same code as E3SM so doesn’t serve as much of a benchmark.
50. It seems odd to tout this E3SM model as a useful alternative to an older model that uses much of the same code, including the same or very similar aerosol scheme. I would word this more logically or just focus on the E3SM-to-observation comparison here.
60. “net primary productivity of plants” or similar, as “productivity” alone is vague.
68. I would say you’re only “validating” against the observations, and separately that you’re comparing against CESM, as CESM is just a model whereas the observations are – despite their own flaws and uncertainties – the standard approximation of truth. Can the authors also please briefly explain their choice of CESM2-WACCM here? Showing the ISA-MIP Pinatubo models (Quaglia et al, 2023, already cited in the manuscript) would have given a better impression of this model’s performance against its peers. I understand that replicating an already verified model having many of the same features serves as a sanity check (and maybe some of the unique aspects of E3SM lead to improvement?), but this is worth a 1-line explanation in the text.
70. Saying “most climate models” use GloSSAC isn’t accurate. For some historical eruptions it’s an option, but then there’s the CMIP dataset cited elsewhere in this study (SAGE-3λ). And for eruptions in the distant past or hypothetical eruption cases, simplified forcing generators like Easy Volcanic Aerosol (Toohey et al., 2016) are now the standard option. I would just amend this into a more general statement.
Toohey, M., Stevens, B., Schmidt, H., & Timmreck, C. (2016). Easy Volcanic Aerosol (EVA v1. 0): an idealized forcing generator for climate simulations. Geoscientific Model Development, 9(11), 4049-4070.
82. “more complete approach”, maybe. But mostly these interactive models are used for cases where we lack suitable observations, or are isolating a particular microphysical effect (since we observe properties, but not processes). I think the need for good interactive aerosol models can be described better, which would also help tout why the rigorous technical work here is useful (see main comments).
107. Please summarize around this line the model version’s purpose (see main comments). Can anyone use this scheme? For what purposes is it suitable and for what experiments does this validation apply?
110-111. I’m not convinced that this study pays “more attention to […] global and regional climate impacts” than the cited Mills et al 2017 study. That study actually shows radiative forcings and surface temperature responses, while this one does not. I think there are novel aspects to this study – the radiation sections are for instance quite different from the two Mills et al studies – and that the authors should more accurately represent their uniqueness here.
158. It looks from Table S1 like the mode size cutoffs are exactly the same as in the Mills et al study, and the only difference is the dust and sea salt tuning. Can the authors please make this clear in the manuscript text?
180. It’s a bit odd the authors don’t show any maps of tropospheric aerosol to back these statements, though I’m willing to accept this is outside the range of this study. Can the authors at least make a statement if the model as modified here is ready for experiments where both the troposphere and stratosphere are important, or if further validation effort is needed then please say so. Is there any reason not to use this over the current E3SM, beside maybe the unavailability of long control runs? As is I feel anyone who reads this thinking they may want to use the model would be pretty lost.
204. We’re only seeing results with nudged winds, so have no idea whether to trust E3SM’s stratospheric dynamics. I get that this is outside the scope of this study, but have these been verified? If known, could the authors provide a line on how E3SM’s stratospheric winds perform and maybe a citation on this?
214. What about Cerro Hudson and the other non-Pinatubo eruptions mentioned later in the text? Please say at least that these are included based on the same SO2 dataset, if so.
219. I feel “E3SMv2-presc” would more suitably be given a fuller (1-2 line) mention in the text, as it’s odd to leave all but a brief mention of its existence to a table.
244. Please specify that it’s 75% H2SO4 + 25% H2O “by mass”, as by volume would be different.
278. As above, 75% H2SO4 “by mass”
290. Optional, but it may or may not be worth mentioning the instrument saturation issue that occurred during Pinatubo. This was for instance mentioned in the already cited Quaglia et al, 2023 study.
312. The ammonium sulfate assumption deserves more description. The validation here barely uses/tests E3SM’s actual radiation code (mostly relying on external Mie calculations instead), but this would be an issue for future uses of the model that do, so I think it deserves more mention. First off, the imaginary refractive indices are higher for sulfuric acid than ammonium sulfate (see for example a comparison in Gosse et al., 1997). This would bias low the longwave effect, driving the model to cause too much surface cooling. In our own evaluations (not published), we found switching Pinatubo aerosol from ammonium sulfate to sulfuric acid optics increased the longwave forcing by ~50% and reduced the net forcing by 10-15%. Second, I wonder if the ammonium sulfate assumption increases density and fallout of the aerosols, which would affect the aerosol properties shown here? Can the authors please comment on this and add a line or two to the text to guide anyone interested in using this model?
Gosse, S. F., Wang, M., Labrie, D., & Chylek, P. (1997). Imaginary part of the refractive index of sulfates and nitrates in the 0.7–2.6-µm spectral region. Applied optics, 36(16), 3622-3634.
323. Please add a line explaining the improvement from E3SM-PA to E3SM-SPA for sulfur burden.
388. Please clarify here whether Cerro Hudson and the other small eruptions are included in all simulations.
392. An important question is why the aerosol size is persistently too small in all models used here. It could be that none of the models include enhancement of coagulation by Van der Waals forces. This was reported to drive a sizable increase in aerosol size in a paper by English et al. (2013) that is already cited, and may be worth mentioning in this paragraph (and checking that it isn’t in CESM2-WACCM as used here). For reference, the equations needed to add this are presented in more detail in a study by Sekiya et al. (2016). It’s very possible there are other factors, and for one I wonder if the mass of ammonium sulfate (35% higher than H2SO4’s true mass; a quick arithmetic check is sketched below) is connected to gravitational settling in a way that would make the coarse particles fall out faster than they should. I’m not sure this needs to be discussed within the manuscript (though it could be helpful to someone wanting to improve the model further), but I hope the authors can share a bit of thinking on this.
Sekiya, T., Sudo, K., & Nagai, T. (2016). Evolution of stratospheric sulfate aerosol from the 1991 Pinatubo eruption: Roles of aerosol microphysical processes. Journal of Geophysical Research: Atmospheres, 121(6), 2911-2938.
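For concreteness, the ~35% figure quoted above is simply the molar-mass ratio (my own arithmetic, not taken from the manuscript):

M_{(NH_4)_2 SO_4} / M_{H_2SO_4} = 132.14 / 98.08 \approx 1.35,

i.e. carrying the condensed sulfate as ammonium sulfate adds roughly 35% mass per mole of sulfur relative to pure H2SO4, which is why I raise the question of whether gravitational settling of the coarse mode could be affected.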
395. I wonder if neglecting volcanic ash also has a size influence through lofting, as Stenchikov et al. showed in a more recent paper that including ash is the only way to get the aerosol plume to form at an appropriately high level of the stratosphere. This could conceivably slow coagulation by spreading the aerosol out vertically, which would keep aerosol smaller (though I haven’t seen this tested). Maybe not worth mentioning in the text, but if the authors do further tests it could be worth considering.
Stenchikov, G., Ukhov, A., Osipov, S., Ahmadov, R., Grell, G., Cady‐Pereira, K., ... & Iacono, M. (2021). How does a Pinatubo‐size volcanic cloud reach the middle stratosphere?. Journal of Geophysical Research: Atmospheres, 126(10), e2020JD033829.
511. Clarify that the curves are the size modes, please: “dN/dlog D size modes (curves)” or similar
516. Please explain in the first paragraph why you chose to present output from offline Mie code instead of standard model output involving E3SM’s radiation scheme. Are the radiative fundamentals worth an in-depth dive here? Is this something novel compared to other model validations? I think this is acceptable, and the benefits of this work are worth advertising better. However, there’s certainly a drawback that we aren’t given enough information from the actual model to have confidence in its ability to produce reliable shortwave scattering (or other radiative effects), which is really what I’d expect in an interactive aerosol model validation study. So a brief explanation is expected.
516. Please also remind the reader that this output is from an offline Mie scattering routine and perhaps link them to Appendix B.
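For readers unfamiliar with what such an offline routine involves, a minimal sketch of this type of calculation is given below. This is my own illustration, not the authors’ Appendix B code; it assumes the third-party Python package miepython and uses an illustrative refractive index for sulfate near 0.5 µm, simply mapping effective radius to scattering and absorption efficiencies.

import numpy as np
import miepython  # third-party Mie solver; not the authors' Appendix B routine

# Illustrative (placeholder) sulfate refractive index near 0.5 um
m = 1.43 - 1e-8j                          # imaginary part entered as negative, per miepython's convention
wavelength_um = 0.5                       # near the peak of solar irradiance
r_eff_um = np.linspace(0.05, 1.0, 50)     # effective radii to scan

x = 2.0 * np.pi * r_eff_um / wavelength_um    # size parameter
qext, qsca, qback, g = miepython.mie(m, x)    # Mie efficiencies at each size parameter
qabs = qext - qsca                            # absorption efficiency

for r, qs, qa in zip(r_eff_um[::10], qsca[::10], qabs[::10]):
    print(f"r_eff = {r:.2f} um   Qs = {qs:.3f}   Qa = {qa:.2e}")

Even a toy calculation like this illustrates how strongly Qs depends on r_eff at visible wavelengths, which is why a one-line reminder and a pointer to Appendix B here would help readers.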
555. What are dotted vs solid curves in Fig. 7? Different modes?
559. Please add a line here to tell the reader why we should care that the model can replicate diffuse and direct radiation breakdown. The relevance for plants is listed extremely briefly in the Introduction, but should be here in slightly more detail (mentioning the influence of radiation type on shadow experienced by plants and photosynthesis, for instance).
564. So unlike all other radiation output in this study, here it actually uses E3SM output and is not just fitting aerosol properties into a radiative transfer model? Why not just show the actual shortwave, longwave, and net forcings, which are the main indicator of stratospheric aerosol impacts on climate? Wouldn’t this be worth being shown in this GMD study, even if there are some remaining issues?
560. The transition between sentences feels like an incomplete comparison. I would amend it to “More substantially, the forward scattered SW […]” or similar.
581. Certainly these quantities are linked, but I think AOD being a “good indicator” of diffuse radiation is unrealistic given the curves have different shapes and can peak months apart.
613. Could the authors please add how the weaker “wavelength dependence of Qs” relates to there being a weaker longwave absorption Reff sensitivity than for shortwave reflection? This is best seen with an x-axis of Reff for fixed wavelengths, as in Fig. 1a of Lacis 2015, but is directly related to Figs. 7 & 10 here via the size parameter. I think the authors’ method of going directly from aerosol properties to the fundamentals of radiative effects is informative, but as this is a GMD climate model validation I think they could better connect this to radiative forcing and climate response. I recommend they take a look at this short Lacis paper that very succinctly puts Qa and Qs into context.
Lacis, A. (2015). Volcanic aerosol radiative properties. Past Global Changes Magazine, 23, 51-51.
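For readers connecting Figs. 7 and 10, the common thread is the dimensionless size parameter, on which both efficiencies depend (along with the refractive index m):

x_{\mathrm{eff}} = 2\pi r_{\mathrm{eff}} / \lambda,

so going from ~0.5 µm (solar peak) to ~10 µm (terrestrial peak) at fixed r_eff shrinks x by a factor of ~20, which is why a given bias in simulated size shows up so differently in shortwave scattering versus longwave absorption.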
613. It’s worth pointing out in the text that Qs and Qa apply to different frequencies and this should be noted when comparing Figs. 7 & 10 (citing the vertical lines). And it would be nice to get a 1-line explanation of the most clear difference between these figures: in Fig. 7 (Qs) differences are right-left, while in Fig. 10 (Qa) they are up-down.
626. Modeling studies can add H2Ov during an eruption to simulate (poorly constrained) direct volcanic emission of water vapor, which would hydrate the stratosphere. May or may not be worth mentioning here that this might have alleviated the issue.
626. (but really the Supplement) The 3 panels in Fig. S11 appear too close together, partly covering the panel titles (“global”, etc). I appreciate that the authors show this data.
637. As in the Fig. 7 caption, the Fig. 10 caption doesn’t say what the solid vs dotted curves are.
641. What is “normalized” here? Maybe this is the same normalization as earlier in the study, but please define it here or at least cite that it is as previously stated.
647. Since – as stated – E3SMv2 has no LWH, please remove it from the legend of Fig. 11. I found myself looking for it but it simply isn’t there.
641. Where do these longwave heating rates come from? The expectation would be that these are from E3SM itself but this does not appear so. Is it a simple equation involving the Qa’s from the previous section? It could be worth showing this, but more definitely there should at least be a small description.
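If these heating rates were derived offline rather than taken from E3SM, I would expect something based on the vertical divergence of the aerosol-perturbed longwave flux; stated here only as the expected form, not as what the authors necessarily did:

\partial T / \partial t = (g / c_p)\, \partial F_{\mathrm{net}} / \partial p,

with F_net the net upward longwave flux, g gravity, and c_p the specific heat of air at constant pressure. Whichever path was taken, a sentence (and, if offline, the equation used) would resolve the ambiguity.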
649. Radiative heating rates cannot be directly observed, though there are observations of stratospheric temperature increases. The already cited Mills et al., 2016 study includes a comparison between modeled and radiosonde post-Pinatubo stratospheric temperatures. It may or may not make sense to cite this component of the Mills study here.
667. I wonder if the mass burden differences involve the type of sulfate in each model. Do both E3SMv2-SPA and CESM2-WACCM use ammonium sulfate? Could this bias the aerosols heavy?
689. I’d prefer if the wording were “this suggests that the models will overestimate” instead of “indicates” they “may” do so, as the results show pretty clearly to expect a bias in this direction.
695. Please add a paragraph on to what extent direct use of this model – or methods based on the aerosol properties the model simulates – could result in biased evaluations of climate responses, along with other statements that can aid interpretation of results from future uses of this scheme (see main comments). The authors could make a recommendation only to use E3SMv2-SPA for very similar experiments as performed here (Pinatubo-sized eruptions, nudged winds, little reliance on E3SM's radiation scheme), as certainly the more dissimilar the experiment the less the validation applies. But I expect there could be interest in further uses, so the authors would be well suited to preemptively give guidance (e.g. what uses are suitable, what biases are pertinent, are climate responses trustworthy).
703. Which refractive indices are “assumed across observations and models”? Are these ammonium sulfate or sulfuric acid? Is there a reference to cite?
765. Reff is the area-weighted mean radius; please reconcile this.
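To be explicit about the convention I have in mind, the effective radius is the ratio of the third to the second moment of the size distribution, i.e. the surface-area-weighted mean radius:

r_{\mathrm{eff}} = \int r^3 n(r)\,dr \; / \; \int r^2 n(r)\,dr,

and the text at this line should be made consistent with that definition.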
780. The GitHub for E3SMv2-SPA doesn’t show any indication of being particularly for this stratosphere-optimized version. Is there a particular git branch that should be used?
Technical comments
38. The comma that precedes “CESM2-WACCM” seems unnecessary/odd.
40. “too small of accumulation […] mode” to “overly small accumulation […] mode”.
120. “Simlations” to “Simulations”
138. Totally optional, but maybe spelling out the experiment names could help the reader remember what’s what? I assume “PA” is “prognostic aerosol”?
154. Also optional, but as above, is SPA “stratosphere-optimized prognostic aerosol” experiment or something similar? Could be better to spell this out than letting the reader wonder.
178. “Dg,low” has an obvious meaning given “Dg,hi” is defined above, but this really should also be spelled out before use.
201. The line has some grammatical issues. I would switch “2022) where” to “2022, with”, and then in the following line switch “use” to “using”.
250. (and also 253, 315, etc) Please just ensure to fix all “SAGE-3l” mentions to “SAGE-3λ” by the time this is published.
301. “from the global to the microphysical” sounds like the authors are starting with global and ending with microphysical, where really everything’s jumbled together. Not critical, but maybe can be reworded to avoid this confusion (“across scales global to microphysical” or similar).
304. Reff should be defined within the text before being used here.
304. “small bias” to “bias toward small size” or similar.
381. “mid 1992” to “mid-1992”
390. “Identical” to “identical”
392. “the models” to “these models”
402. “Theselarger” to “These larger”
408. “1993-02” looks a bit awkward in the text (like 1993-2002). “February of 1993” looks nicer, though I’m fine either way as “1993-02” is what’s stated on the figure for brevity.
520. Please rectify that “xeff” is not defined in the text before its use here. It is defined in a figure caption later in the paper.
558. Maybe switch “different” to “presented” or similar, as you already start the sentence with “the differences”.
614. n_Hess isn’t defined in the text, only in the Appendix. Please define it before use.
638. No space in “long wave” for consistency
676. Can the word “also” just be added to make clear that this is a different comparison than the instrument validation: “E3SMv2-SPA also has slightly smaller Reff […]”
1004. “teh” to “the”
1005. Excess space in “SO 2”
Citation: https://doi.org/10.5194/egusphere-2023-3041-RC1
- AC4: 'Reply on RC1', Hunter Brown, 23 Apr 2024
RC2: 'Comment on egusphere-2023-3041', Anonymous Referee #2, 21 Feb 2024
Review of “Validating a microphysical prognostic stratospheric aerosol implementation in E3SMv2 using the Mount Pinatubo eruption” by Hunter Brown et al.
The authors present the development of a stratospheric prognostic aerosol (SPA) capability for the Energy Exascale Earth System Model, version 2 (E3SMv2) to simulate stratospheric aerosol formation in the aftermath of large explosive volcanic eruptions. Their implementation includes changes to the 4-mode Modal Aerosol Module microphysics to allow for larger particle growth and a more accurate stratospheric aerosol lifetime following the Mt. Pinatubo eruption. Brown et al. tested their model for the post-Pinatubo period against remote sensing and in situ observations and against the interactive chemistry-climate model CESM2-WACCM. On the global scale, E3SMv2-SPA performs well compared to observational datasets and behaves similarly to CESM2-WACCM. They found that the modeled aerosol effective radius in both models is consistently lower than satellite and in-situ measurements (max differences of ~30%). Compared to observations, the models also show higher diffuse radiation at the surface, a larger cooling effect, and an underestimation of stratospheric heating.
Although the manuscript type is declared as a development and technical paper, the content should be placed in the general context of global stratospheric aerosol modelling; otherwise it should be published as a specific technical institute report. The introduction and discussion sections therefore need some substantial improvements. The motivation of the paper could be more clearly stated, and some of the results could be discussed in a broader context. I therefore recommend publication after major revisions, see below.
General comments
In the introduction, important literature is missing. Several global stratospheric aerosol modelling studies have been published in recent years. An overview of the development and current state of stratospheric aerosol modelling can be found, for example, in Kremser et al. (2016) and in Timmreck et al. (2018). In addition, a number of comparative global aerosol modelling studies have been published, e.g. for background aerosol (Brodowsky et al., 2024), volcanic events (Marshall et al., 2018; Clyne et al., 2021; Quaglia et al., 2023) and artificial sulphur injections (Franke et al., 2021; Weisenstein et al., 2022). I was more than surprised that these studies were completely ignored by the authors. In particular, the results of the Pinatubo study by Quaglia et al. (2023) should be mentioned and discussed in the paper and not just used as a reference to observational data.
I wonder how model-specific your results are. How valuable are they to other stratospheric aerosol modellers? The discussion section is missing a dedicated paragraph on the strengths and weaknesses of the applied global aerosol models with respect to other global stratospheric aerosol models. Recent intercomparison studies of global aerosol models reveal several difficulties that the current generation of global aerosol models has to deal with. For example, the study by Quaglia et al. (2023), comparing different model results with satellite observations after the eruption of Mt. Pinatubo, shows a stronger transport towards the NH extra-tropics, suggesting a much weaker subtropical barrier in all models. What does the spatial aerosol distribution in your model look like? It should be much better, as you nudge the winds, so discrepancies can be traced back to other sources. This could be elaborated further with respect to free-running models. Nevertheless, it would be nice to also see a global distribution of your sulfate burden/AOD in the paper or in the supplement.
The motivation of the paper could be stressed a bit more. It is also not really clear to me how different your SPA version is from the MAM4 version in WACCM, except for the model reversal and simplified precursor chemistry.
The applied methodology is not sufficiently explained in the manuscript. I miss, for example, a detailed description of how you calculate a spatially averaged aerosol size distribution or effective radius, which is not straightforward. A "Methodology" subsection in Section 2 would be helpful, with more details in the appendix.
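To illustrate why the averaging is not straightforward (my own sketch with made-up numbers, not the authors’ method): averaging the gridpoint effective radii is generally not the same as forming the effective radius of the spatially averaged distribution, so the manuscript should state which is done.

import numpy as np

# Made-up per-gridpoint moments of the size distribution (units arbitrary)
area_weight = np.array([0.2, 0.3, 0.5])   # e.g. grid-cell area fractions, summing to 1
moment2 = np.array([1.0, 4.0, 0.5])       # integral of r^2 n(r) dr in each cell
moment3 = np.array([0.3, 2.4, 0.1])       # integral of r^3 n(r) dr in each cell

# Option A: area-weighted mean of the local effective radii
reff_mean_of_ratios = np.sum(area_weight * moment3 / moment2)

# Option B: effective radius of the area-averaged distribution
reff_ratio_of_means = np.sum(area_weight * moment3) / np.sum(area_weight * moment2)

print(reff_mean_of_ratios, reff_ratio_of_means)   # 0.34 vs ~0.50: the two differ

A short "Methodology" subsection stating which of these (or another weighting, e.g. by aerosol surface area or extinction) is used for the averaged effective radii and size distributions would remove this ambiguity.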
Specific comments
- Line 2: “…using observations after the Mt. Pinatubo eruption”
- Line 45: “Mt. Pinatubo”: sometimes you use “Mt. Pinatubo” and sometimes only “Pinatubo”; please be consistent
- Lines 49-51: The fact that a model produces similar results to another model does not per se make it a viable tool
- Lines 95 ff: Concerning the advantages of sectional aerosol models, there is a recent paper by Tilmes et al. (2023) in GMD describing a sectional aerosol microphysical model in CESM2 and comparing it with the standard CESM2 version using the Modal Aerosol Module MAM4 for the Pinatubo episode. This paper should be cited and briefly discussed here as well.
- Line 148: What about sedimentation?
- Lines 172-174: Any reason why you did not take this process into account in your model?
- Lines 213-214: This is not what Kremser et al. (2016) wrote: “Recent modeling studies support lower stratospheric sulfur levels than those inferred from the TOMS and TOVS observations [Dhomse et al., 2014; Mills et al., 2016]. The difference between the initial and the persistent sulfur levels is important and generally supports a more complex development process following a major eruption than has been considered in the past.” (Kremser et al., 2016, page 12). Please cite them properly.
- Line 451 ff: Solar flux changes: you can also compare your model results to observations from the Earth Radiation Budget Experiment (ERBE) (Barkstrom, 1984; Barkstrom and Smith, 1986). This would be an additional approach.
- Line 494: The date of the Lascar eruption is not correct.
- Line 571: I am wondering why you chose the following latitude bands and not (also) the location of Mauna Loa, where you have some data for a direct comparison.
- Lines 639-647: Does an integrated longwave heating rate really make sense here? Would it not be more useful to compare stratospheric temperature profiles, for which at least some observations are available, e.g. Free and Angell (2002) and Free and Lanzante (2009)?
- Lines 667-668: Here you can also refer to the work of Clyne et al. (2021) and Quaglia et al. (2023)
Figures:
- Figure 1: The figure caption is very short, lacks information, and is difficult to understand. For example, what does the grey shaded region indicate? What does “mode sensitivity tests” mean? Are you referring to the global sulfate burden?
- Figure 2: Again, what does the grey shading indicate?
- Figure 3: The citation of Quaglia et al. (2023) is misleading here, as it is a model intercomparison paper which uses the observational data for comparison and validation.
Table:
- Table 1: you could remove the third column and include its text in the table caption
Literature
- Barkstrom, B. R.: The Earth Radiation Budget Experiment (ERBE), Bull. Am. Meteorol. Soc., 65, 1170–1185, 1984.
- Barkstrom, B. R. and Smith, G. L.: The Earth Radiation Budget Experiment: Science and implementation, Rev. Geophys., 24, doi:10.1029/RG024i002p00379, 1986.
- Brodowsky, C., et al.: Analysis of the global atmospheric background sulfur budget in a multi-model framework, EGUsphere [preprint], https://doi.org/10.5194/egusphere-2023-1655, 2023.
- Clyne, M., et al.: Model physics and chemistry causing intermodel disagreement within the VolMIP-Tambora Interactive Stratospheric Aerosol ensemble, Atmos. Chem. Phys., 21, 3317–3343, https://doi.org/10.5194/acp-21-3317-2021, 2021.
- Free, M. and Angell, J. K.: Effect of volcanoes on the vertical temperature profile in radiosonde data, J. Geophys. Res., 107, 4101, doi:10.1029/2001JD001128, 2002.
- Free, M. and Lanzante, J.: Effect of volcanic eruptions on the vertical temperature profile in radiosonde data and climate models, J. Climate, 22, 2925–2939, https://doi.org/10.1175/2008JCLI2562.1, 2009.
- Kremser, S., et al.: Stratospheric aerosol – Observations, processes, and impact on climate, Rev. Geophys., 54, 1–58, https://doi.org/10.1002/2015RG000511, 2016.
- Marshall, L., et al.: Multi-model comparison of the volcanic sulfate deposition from the 1815 eruption of Mt. Tambora, Atmos. Chem. Phys., 18, 2307–2328, https://doi.org/10.5194/acp-18-2307-2018, 2018.
- Quaglia, I., et al.: Interactive stratospheric aerosol models' response to different amounts and altitudes of SO2 injection during the 1991 Pinatubo eruption, Atmos. Chem. Phys., 23, 921–948, https://doi.org/10.5194/acp-23-921-2023, 2023.
- Tilmes, S., et al.: Description and performance of a sectional aerosol microphysical model in the Community Earth System Model (CESM2), Geosci. Model Dev., 16, 6087–6125, https://doi.org/10.5194/gmd-16-6087-2023, 2023.
- Timmreck, C., et al.: The Interactive Stratospheric Aerosol Model Intercomparison Project (ISA-MIP): Motivation and experimental design, Geosci. Model Dev., 11, 2581–2608, https://doi.org/10.5194/gmd-11-2581-2018, 2018.
- Toohey, M., Krüger, K., Niemeier, U., and Timmreck, C.: The influence of eruption season on the global aerosol evolution and radiative impact of tropical volcanic eruptions, Atmos. Chem. Phys., 11, 12351–12367, doi:10.5194/acp-11-12351-2011, 2011.
- Weisenstein, D. K., Visioni, D., Franke, H., Niemeier, U., Vattioni, S., Chiodo, G., Peter, T., and Keith, D. W.: An interactive stratospheric aerosol model intercomparison of solar geoengineering by stratospheric injection of SO2 or accumulation-mode sulfuric acid aerosols, Atmos. Chem. Phys., 22, 2955–2973, https://doi.org/10.5194/acp-22-2955-2022, 2022.
Citation: https://doi.org/10.5194/egusphere-2023-3041-RC2
- AC3: 'Reply on RC2', Hunter Brown, 23 Apr 2024