the Creative Commons Attribution 4.0 License.
The ACCESS-AM2 climate model strongly underestimates aerosol concentration in the Southern Ocean, but improving it could be problematic for the modelled climate system
Abstract. The interaction of natural marine aerosol with clouds and radiation is a significant source of climate model uncertainty. The Southern Ocean represents a key area to understand these interactions, and a region where significant model biases exist. Here we provide an evaluation of the Australian Community Climate and Earth System Simulator atmosphere model which includes a double-moment aerosol scheme. We evaluate against condensation nuclei (N10) and cloud condensation nuclei (CCN) from seven ship campaigns and three terrestrial locations, spanning the years 2015–2019. We find that N10 is heavily underestimated in the model across all regions and seasons by more than 50 % and in some cases by over 80 % at higher latitudes. CCN is also strongly underestimated over marine and Antarctic regions, often by more than 50 %. We then perform seven sensitivity tests to explore different aerosol configurations. We find that updating the dimethyl sulfide climatology and turning on the primary marine organic aerosol flux marginally improves marine CCN by between 4–9 %. N10 however was reduced by between 3–9 %, resulting in worse model performance. The Southern Ocean radiative bias is also reduced by this combination of changes, with limited adverse effects. We also test altering the sea spray flux to use wind gust instead of mean wind speed, which significantly improved CCN in the marine regions, but resulted in detrimental impacts on the radiation budget. Our results indicate significant problems in the model’s microphysical processes and with over tuning. We suggest this needs to be addressed in a holistic way.
Status: final response (author comments only)
RC1: 'Comment on egusphere-2024-3125', Anonymous Referee #1, 14 Nov 2024
The authors test the ability of an aerosol scheme, within an atmospheric model, to simulate Southern Ocean aerosol concentrations and find it lacking in skill, compared to measurements from multiple in-situ campaigns. Implementation of several structural model changes leads to minimal increases in model skill, or regional/seasonal skill increases that are offset by skill reductions in other regions/seasons. The authors suggest a more holistic approach to model development is needed that accounts for multiple aerosol and possibly microphysical process representations simultaneously. Additionally, the authors highlight the need for more aerosol size distribution measurements with which to evaluate the effects of model developments on Southern Ocean aerosol and clouds.
Although the research is informative about potential model development priorities, and could thus make an important contribution to the literature, some aspects of how the article is written need to be addressed before publication:
Title and sub-titles:
The 2nd part of the title is not clearly supported by the results and should be changed to better reflect the paper content.
Sub-title of section 3.4.2 is incorrect. The content describes data processing rather than statistical methods.
Vague language:
Language is imprecise in parts of the article, leaving the reader to guess the authors' intended meaning. The discussion section stands out as particularly vague. One key example: in the first paragraph of section 6, the phrase ‘as a whole’ is unexplained, yet this seems to be the primary recommendation of the paper. The authors need to take the time and space to frame their hypothesised model development framework in more detail and with greater clarity to be convincing.
Other examples of vagueness in language that need to be addressed include:
In section 3.1, the description of the model indicates the ACCESS model was run in atmosphere-only mode. So, the model is essentially UM10.6 GA7.1 with GLOMAP-mode, using CMIP6 and CEDS emissions, nudged towards ERA5 data. The coupled aspect of the model seems irrelevant, yet ACCESS is framed as the model being evaluated. What is ACCESS actually adding to the simulations?
Lines 198 to 202: The meaning of this paragraph is hard to unravel. Clearer language is needed.
Line 304: GLOMAP-mode provides these values, though they may not have been selected by the authors. This description needs to clearly state this was a choice, rather than a model deficiency.
Figure 3 caption: 25th and 75th percentiles of what?
Line 340: Standard deviation of what? Calculated from which data?
Line 463: The meaning of the final sentence is obscure and unreferenced.
Constraint/constrained is used incorrectly in the article. I think the authors mean ‘restricted to’, ‘in’, or ‘limited to’. Constraint has a specific meaning related to model uncertainty.
There is extensive use of acronyms, which may be considered appropriate for some readers, but reduces readability. In particular, readability is reduced by using acronyms for observation stations.
Presentation of results with more obvious scientific reasoning:
Much of the manuscript needs to be rewritten to highlight key discoveries to the reader. The authors should consider where meaning is assumed and could be clarified. Additionally, the text often contains only statements about model-to-observation comparisons, without interpretation of meaning. Occasionally, statements conflict with results, which suggests they've not been considered deeply.
Some essential changes include:
Section 3.4: The first paragraph here is unnecessary. Nothing of value is added, so it should be removed.
Line 335: First sentence is confusing.
Paragraph starting line 360: In this location, the NPF scheme test is the only sensitivity test to shift the model from biased low to biased high. Some interpretation of results is needed here. This result implies the persistent model bias might be partially overcome by implementing a more sophisticated NPF parametrization. Also, the simplicity of the NPF parametrization needs to be mentioned in this section to help the reader understand why the improvements are spatially restricted.
Section 4 has many long-winded descriptions that do not lead to insights or statements of how the results affect model interpretation.
Section 4.3: There is no mention of the BL NPF sensitivity test here, even though it is the only test to reduce activation ratios.
Line 563: The meaning of the first sentence is incongruous with the results.
Missing detail and context:
As mentioned above, some sections are heavy with text, whilst others lack detail and critical information.
For example, where the Humphries data set is introduced, no context is provided for why it might be better, or more useful, than previous data sets. Furthermore, some sense of the motivation for including the specific sensitivity tests chosen would be extremely useful in the first paragraph of section 3.2. Other specific examples of missing detail/context include:
Line 207: Why are time-varying DMS datasets preferable? Need to say what value is added.
Line 222: ‘underway’ needs a description
Line 317: Why isn’t the assumption made that the gridbox containing the observation would be the best comparison to make? There is no explanation for why the authors are even considering using a gridbox to the south-west of the station.
Line 379: The authors state ‘This is a key area of development for GLOMAP-mode’. This statement needs to be put in context. The suggested model developments are only important if the priority is a model with increased skill at simulating aerosol concentrations over remote polar regions. The authors have assumed this is the case, with an implied further assumption that aerosols in these regions are more climatically important than aerosols elsewhere.
Section 4.1: Some brief description of the overall under-prediction of CCN concentration and seasonal cycle amplitude, and what this implies should be given up front. Currently, this message is hidden amongst discussion of individual simulations.
Old model version:
The authors have evaluated the impact of structural changes to model parametrizations, using a relatively old version of the GLOMAP-mode aerosol scheme, without reference to published model changes that would affect results. Results in this article need to be discussed with reference to later model versions and with some consideration of how recent model developments may impact results presented here.
For example, no reference is made to the inclusion and evaluation of primary marine organic aerosol in later model versions. Additionally, sea salt density has been updated, as has deposition velocities via land surface representations, both of which would affect the sensitivity test results.
Figures:
Font size in figures is sometimes too small. Additionally, thicker lines with better color contrast, or some other method, are needed to distinguish between simulations.
On line 511, the authors state they have evaluated other cloud properties, which is essential to make a complete analysis of the impact of the sensitivity tests on aerosol, clouds and aerosol-cloud interactions. Equivalent figures should be included in a supplement, so the reader can interpret the wider effects themselves.
References:
Some additional references that have been overlooked include:
Schutgens et al. (2017) doi.org/10.5194/acp-17-9761-2017 in section 3.4.1
Additional literature evaluating more sophisticated NPF schemes and the climatic effects of those schemes.
The final sentence in section 4.2
Model structural changes implemented after this model version, particularly where they may affect interpretation of results here (e.g. https://gmd.copernicus.org/articles/13/6383/2020/)
Spelling and syntax:
- Line 101: ‘(GLOMAP)’ over-used
- Line 149: ‘volcinic-sourced’
- Equation 7: Numerator should be ‘CHL’
- Line 261: Remove ‘/,’
- Line 290: SYO not defined
- Line 310: inline
- Section 3.4.2: This could easily be a single paragraph
- Line 353: missing ‘is’
- Line 605: bis
Citation: https://doi.org/10.5194/egusphere-2024-3125-RC1
RC2: 'Comment on egusphere-2024-3125', Anonymous Referee #2, 16 Nov 2024
General comments
This manuscript compares simulated Southern Ocean (SO) aerosols in the ACCESS-AM2 Earth System Model’s (ESM’s) atmospheric component to in situ SO aerosol observations. While ESMs’ representation of SO aerosols is highly uncertain and important for future climate prediction, historically, limited in situ observations of aerosols in the SO have made evaluating ESMs’ SO aerosol representations difficult. This work represents a substantial, novel contribution to scientific progress within the scope of ACP because it presents a first ESM-observation comparison in the SO across latitude and season. Before this manuscript can be accepted for publication, additional context needs to be provided on the significance of differences between the simulated and observed aerosol properties (described in detail in major issue 1 below).
Major issue 1: additional context needed on significance of model-observation differences
At present, the manuscript text presents a large number of percentage differences between observed and simulated SO aerosol properties. The only context for the significance of these properties comes from the figure shading around 25th-75th percentiles for the control simulation (not for the other seven simulations) and observations. From only this information, it is very challenging for the reader to evaluate whether these percentage differences are significant and to assess if the first part of the manuscript title (“The ACCESS-AM2 climate model strongly underestimates aerosol concentration in the Southern Ocean”) is justified. I outline below a few different sources of uncertainty that are missing or only mentioned qualitatively in the present text.
- Point measurements vs grid cell means: The manuscript does not appear to address at all that it is comparing observed point (or near point when integration time is a few minutes) measurements to grid cell mean simulated measurements. This could induce large biases depending on the shape of the sub-grid scale distribution of aerosols, especially for the coastal sites. Some context on the sub-grid scale distribution could be gained from plotting distributions of measurements near in time and space and assessing skewness.
- Instrument and instrument simulator uncertainty: The manuscript is missing context on instrument uncertainty. Further, for the CCN comparison, the manuscript should provide uncertainty estimates due to interchanging a super-saturation-based threshold for the observations and a size-based threshold for the model. This translation will be uncertain due to assumptions needed to translate particle activation supersaturation to particle activation size (as is somewhat mentioned presently but not made quantitative) and due to imprecision in supersaturation control in the instruments. Ideally this uncertainty could be propagated through to the reported model results, but if this is too difficult, sensitivity tests of different radius thresholds should be reported. This latter test seems straightforward from the code in process_aer_along_ship.ipynb. Additionally, some of the campaigns (e.g., MICRE) have both 0.5% supersaturation aerosol count and aerosol size distribution measurements. A comparison of the two CCN methods can be made directly on the observations and its error assessed.
- Variability of non-control model runs: The manuscript should comment in some form on the variability (25th-75th percentile) of the non-control simulations. Ideally this could be visualized in Figures 3-4 (though this might be too visually confusing) or the supplemental.
- Exhaust corrections: Some context on the magnitude of these corrections on the observed N10 and CCN at different sites/ships should be given as well as the uncertainty introduced by these corrections.
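To make the point-versus-grid-cell-mean concern above concrete, here is a minimal sketch of how sub-grid skewness could be assessed. The lognormal sample is an invented stand-in for pooled N10 point measurements within one grid cell; no manuscript data is used.

```python
# Hypothetical sketch: pool point measurements that fall within one model
# grid cell and check the skewness of their distribution. The lognormal
# sample below is an invented stand-in for N10 observations.
import numpy as np

rng = np.random.default_rng(42)
n10_obs = rng.lognormal(mean=5.0, sigma=0.8, size=500)  # stand-in data

cell_mean = n10_obs.mean()
cell_median = np.median(n10_obs)

# Fisher-Pearson skewness computed directly (avoids a scipy dependency)
dev = n10_obs - cell_mean
g1 = (dev**3).mean() / (dev**2).mean()**1.5

# Strong right skew (g1 >> 0) means the grid-cell mean sits well above the
# typical point value, biasing point-to-grid-mean comparisons.
print(f"mean={cell_mean:.1f}  median={cell_median:.1f}  skewness={g1:.2f}")
```

For a right-skewed sub-grid distribution, the mean exceeds the median, so a point observation compared against a grid-cell mean will tend to look biased even if the model is unbiased at the cell scale.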
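The radius-threshold sensitivity test suggested above could be screened cheaply offline. A hedged sketch follows, assuming lognormal aerosol modes; `number_above_radius` and all modal parameters are invented illustrations, not taken from the authors' code or from GLOMAP-mode output.

```python
# Illustrative sketch (not the authors' code): screening how sensitive a
# size-based CCN proxy is to the chosen dry-radius activation threshold.
# All modal parameters below are invented stand-ins, not GLOMAP-mode output.
from math import erf, log, sqrt

def number_above_radius(N, r_med, sigma_g, r_cut):
    """Number concentration of a lognormal mode with dry radius > r_cut."""
    z = (log(r_cut) - log(r_med)) / (sqrt(2.0) * log(sigma_g))
    return 0.5 * N * (1.0 - erf(z))

# Invented modes: (total number / cm^3, median radius / um, geometric std dev)
modes = [(300.0, 0.02, 1.6),   # Aitken-like
         (150.0, 0.08, 1.5),   # accumulation-like
         (5.0,   0.40, 2.0)]   # coarse-like

for r_cut in (0.02, 0.025, 0.03, 0.04):  # candidate thresholds in um
    ccn = sum(number_above_radius(N, r, s, r_cut) for N, r, s in modes)
    print(f"r_cut = {r_cut:.3f} um -> CCN proxy = {ccn:.0f} cm^-3")
```

Sweeping the cutoff this way would quantify how much of the reported model-observation difference could be absorbed by the threshold choice alone.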
Specific comments
Abstract lines 13-14: “Our results indicate significant problems in the model’s microphysical processes and with over tuning.” Given how the manuscript is currently written, I believe this sentence should be rephrased to reflect that this is the authors’ opinion. This statement is one possible interpretation of the manuscript’s results. I see only one paragraph of the present manuscript referencing the tuning issue and only tangentially. (See Lines 606-611 for more detail.)
Abstract line 14: “We suggest this needs to be addressed in a holistic way.” I only see part of one sentence explicitly supporting this abstract statement: “… [this] points to a need to consider model development in this space as an entire system rather than individual components.” I think this should have slightly more discussion to be mentioned in the abstract.
Lines 105-106: “By performing these evaluations [of N10 and CCN], the model biases associated with aerosol-cloud-radiation interactions around the Southern Ocean and Antarctic can be better understood and the degree of uncertainty reduced.” Another sentence or two connecting why these aerosol properties specifically (especially N10) would accomplish this would make the introduction a lot stronger. (Just a suggestion.)
Section 3.2: The current model configurations tested (Table 1) all focus on aerosol sources. What about aerosol sinks? Even if it is out of scope to test these components of the aerosol scheme, too, the text should at least mention how the model represents aerosol sinks (e.g., coagulation, rain out) and discuss how these processes’ parameterization might contribute to model-observation discrepancies.
Lines 299-301: “Some evaluation of the model[‘]s meteorology has been carried out, but is not shown in this work. It was found to be satisfactory, which is in line with our expectations due to nudging.” Can this be presented in a supplemental? With such minimal description, it is not clear what is “satisfactory.”
Line 316-317: “At KCG, we have used the exact model gridbox that the station is located in, as choosing a gridbox to the south-west of the station resulted in poorer performance.” This sentence seems to suggest at other locations the exact model gridbox was not used? If this is true, this conflicts with my understanding of Section 3.4.2 for a stationary location.
Section 3.4.2: Do ships cross multiple grid cells in a day? (Is the daily average lat-lon location problematic?)
Figures 3, 4: It is unclear exactly what “the monthly and annual median concentrations of N10/CCN” in the captions means, especially when compared to the methods description. Does this mean “monthly and annual median of mean daily concentrations of N10/CCN” or “median of monthly and annual concentrations of N10/CCN”? I assume the former from the methods, but this is not clear from the text. I am not sure whether the colors used are colorblind safe.
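The two readings of the caption distinguished above can be made concrete with a small pandas sketch. The daily N10 series here is invented; nothing comes from the manuscript.

```python
# Hypothetical sketch of the two aggregation orders the caption could mean.
# The daily series is invented; nothing here comes from the manuscript.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
idx = pd.date_range("2016-01-01", "2016-12-31", freq="D")
n10 = pd.Series(rng.lognormal(5.0, 0.6, size=len(idx)), index=idx)

# Reading 1: for each month, the median of the daily values (12 numbers)
monthly_median_of_daily = n10.resample("MS").median()

# Reading 2: the median of the monthly mean concentrations (one number)
median_of_monthly_means = n10.resample("MS").mean().median()

print(monthly_median_of_daily.round(1))
print(round(median_of_monthly_means, 1))
```

Because medians and means do not commute across aggregation levels, the two readings generally give different numbers, so the caption wording matters.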
Figures 6, 8: It would be helpful to compare the different model configurations’ TOA SW radiative flux directly to the CERES product instead of plotting line contours where the control bias changes signs. Even just adding one panel with the zonal means would be helpful to support the manuscript’s conclusions about which configurations improve the model relative to observations. I find it very difficult to get this important information out of the current plots. (Just a suggestion.)
Figure 7: Since LWP isn’t compared to an observational product and is only mentioned as a key controller of TOA SW flux (already shown), this figure and its associated text don’t seem necessary to the main manuscript text and could go in a supplement.
Lines 511-512: “We note that we have evaluated other cloud properties, and the aerosol direct effect via clear sky radiation, but for brevity will not discuss them here.” This seems unnecessary to mention if these evaluations aren’t even roughly summarized and aren’t included in a supplemental.
Lines 606-611: This paragraph seems to motivate the second half of the title and the end of the abstract. It is not clear to me that these are unique conclusions from the methods presented and analysis shown. There is some discussion of why the experiments are probing what they are probing, but their implementations still contain such large uncertainties that it is unclear if scattered improvements in the simulated-observed CCN comparison demonstrates “improvement of the physical representation” of aerosols. This is why I suggest toning down the abstract above. Since the title includes “could,” I think it is acceptably couched.
Technical comments
Title: The antecedent for “it” is not clear from the title alone. The antecedent could be either “ACCESS-AM2 climate model” or “aerosol concentrations.”
All of the manuscript’s equations are presented at the end of paragraphs and are not interwoven into the text.
Citation: https://doi.org/10.5194/egusphere-2024-3125-RC2
RC3: 'Comment on egusphere-2024-3125', Anonymous Referee #3, 27 Nov 2024
Review: egusphere-2024-3125
Referee overview: This manuscript evaluates N10 and CCN simulated by the ACCESS model against observational data sets from the Southern Ocean to understand model biases in the region. The study shows that, in common with other global climate and Earth system models, ACCESS underestimates CCN. The collated observational data in the study is an extremely useful resource for wider model evaluation in the community.
The study also tests several changes to aerosol processes that are particularly relevant for aerosol emission and aerosol microphysical processes in the Southern Ocean. The study finds that model bias in N10/CCN can be reduced by implementing some of these changes, but overall the model’s under-prediction of N10/CCN remains large, especially in the summer season. It was interesting to see that the changes that reduced model bias in N10/CCN did not necessarily reduce model bias in shortwave radiation at the top of atmosphere.
The manuscript addresses an important problem in global climate modelling – that of model under-prediction of outgoing SW radiation at the top of the atmosphere over the Southern Ocean - while highlighting the difficulties in trying to address it. The manuscript is well written and I recommend that it is published providing the following general and technical comments are addressed.
General comments
- Section 3.2.3
I found Eqn 5 a little difficult to make sense of. The sentences “We note that the wind speed function is used here to represent surface tension of the sea surface microlayer (surface accumulation of organics). Higher wind speeds break this layer up, resulting in fewer organics being lofted into the atmosphere.” are useful, but could some extra information be added to explain how the organic mass fraction responds to ocean chlorophyll-a and sea spray particle diameter?
- Figure 1
It is useful to see the climatologies plotted, however, it is not easy to see the differences between them. I appreciate that log scales are necessary, but would the authors consider using a more differentiated colour scale. Or perhaps using one of the data sets (maybe Kettle) as the “reference” and plotting the others as differences relative to the reference.
- Section 3.2.4
Could the authors please clarify whether the OM2 DMS simulation used the daily DMS values derived from the OM2 simulations or if a monthly mean of those daily values was used.
- Section 3.4, L298-300
Is there a reference, i.e. another piece of literature (not necessarily part of this work), where the performance of meteorology in the nudged model was assessed?
- Section 3.4.1, L319-322
“We also recognise that we have not performed a similar baseline filtering to the model (in part due to lack of radon in the model), but have applied the same baseline filtering to the model as what was developed for the observations.”
The above is not clear to me. I don’t follow what the baseline filtering is, or the role Radon is playing.
- Figures 3, 4, 5
Is there a reason for the offset horizontal lines in the subfigures? For example, in Figure 3a annual N10 is shown as horizontally offset coloured lines for the model experiments on the right hand side of the plot. Similarly in Figures d-g the lines representing seasonal mean N10 for the different model experiments are horizontally offset.
- Section 4.1, L342
I don’t think it’s clear from the available data that the model mis-represents the seasonal minima at MI in May. The winter minimum N10 at MI appears to be more variable than at kennaook and Syowa and the model does not capture this. However, due to limited number of years of observations and model grid cell to point observation comparison, I think that it is difficult to conclude that the model mis-represents a seasonal minimum at MI in May.
- Section 4.1, L345
I’d argue that the model and observations both show less variation in winter. The values (for N10 etc.) are smaller in winter compared with summer for both model and observations, so expressing the variation as a fraction or percentage might better show whether the variation was really much smaller in the observations.
- Section 4.1, L349
I agree that at Syowa the model does seem to simulate the minimum too late compared with the obs. However, the model minimum looks to extend from Jun-Aug, while the observed minimum looks to be June (rather than May as stated).
- Section 4.1, L365-367
Does ACCESS use VOC emissions ancillary files or calculate VOC emissions online? Can the authors say if there were large VOC emissions over the first grid box? If there weren’t large land-based VOC emissions over the first grid box, this might strengthen the argument that there are issues with the marine biogenics. Although I take the authors' point that shifting the grid box did not help.
- Section 4.1, L380-384
Plots of aerosol size distribution would help diagnose how the model (re)distributes aerosol in the simulations.
- Section 4.2, L423
“…which could indeed be driven by sea spray, long range transport of aerosol”
=> Add “or” before “long range transport of aerosol”
- Section 4.2, L430-437
I agree that turning on the BL NPF has a small effect, but also worth noting that it’s the only experiment that reduces CCN in MI and Syowa.
- Section 4.2, L438-437
The contours in Figs 6-8 are quite hard to see. Could the authors consider changing the colour of these? Something like cyan might stand out more.
References
Please update Bhatti et al., 2023 to the final version of the accepted manuscript.
Other points
I agree that missing sources and mis-represented microphysics (probably aerosol and cloud) are large contributors to bias in cloud and RF over the Southern Ocean. Is there any reason to suspect loss processes (deposition) might be overestimated? Can the authors comment on how ‘missing’ marine sources of VOCs (and possibly secondary organic aerosol, e.g. https://doi.org/10.1016/j.scitotenv.2021.145054) could affect clouds and climate over the Southern Ocean?
To help visualise the percentage changes in N10 etc across the model experiments the authors could consider including a matrix/table of percentage change colour coded to show an increase or decrease for the parameter (e.g. N10).
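A minimal text-only sketch of the suggested matrix follows; all percentage changes are invented placeholders, and a real figure could map the sign marker onto a diverging colour scale instead.

```python
# Illustrative sketch (invented numbers) of the suggested summary matrix:
# percent change per experiment, with a sign marker that a real figure
# could map onto a diverging colour scale.
pct_change = {  # {experiment: (delta N10 %, delta CCN %)} -- invented values
    "DMS update": (-5.0, 6.1),
    "PMOA on":    (3.2, 4.4),
    "BL NPF":     (-8.7, -1.2),
}

def mark(v):
    """Format a percentage with an explicit increase/decrease marker."""
    return f"{v:+6.1f}% {'+' if v > 0 else '-'}"

print(f"{'experiment':<12} {'N10':>10} {'CCN':>10}")
for exp, (d_n10, d_ccn) in pct_change.items():
    print(f"{exp:<12} {mark(d_n10):>10} {mark(d_ccn):>10}")
```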
Technical comments
- Introduction, L22
Change “Aerosol affect….” to “Aerosol affects…..”
- Section 3.1.1, L141; Section 3.4.1, Line 310
A grammatical point, but I felt that “inline” should be “in line” in the text.
- Section 3.3.1, L221
Not sure what ‘underway’ means here.
- Section 3.3.3, L221
Change CO2 to CO₂ (the ‘2’ should be subscript)
- Section 3.3.4, Heading and L265 (and Figure 3, 4, 5)
I’m guessing that “kennaook” does use a lower case k, but in that case the legend in Fig 2 is the odd one out.
- Figure 3, 4, captions
“The monthly and annual median concentrations of N10 for at…”
=> Delete “for” or “at”
- Fig 3-5, Captions L2:
“For all, the 25th and 75th percentiles…”
Suggest adding “For all subfigures…” for clarity.
- Section 4.1, L360
I suggest a new sub-section here to report model simulations.
- Section 4.1
“For DJF, the BL NPF simulation is now overestimates the observations by 33%.”
=> Please correct this.
- Section 5, L518
Three best simulations rather than 4?
- Section 5, Figure 7
Units on colour bar are g m-2. Units in caption are kg m-2.
- Section 7, L639
Please change earth => Earth
Citation: https://doi.org/10.5194/egusphere-2024-3125-RC3
Data sets
ACCESS-AM2 ship track data Sonya Fiddes https://doi.org/10.5281/zenodo.13864183
Interactive computing environment
ACCESS_aerosol_eval Sonya Fiddes https://github.com/sfiddes/ACCESS_aerosol_eval