This work is distributed under the Creative Commons Attribution 4.0 License.
Compressing high-resolution data through latent representation encoding for downscaling a large-scale AI weather forecast model
Abstract. The rapid advancement of artificial intelligence (AI) in weather research has been driven by the ability to learn from large, high-dimensional datasets. However, this progress also poses significant challenges, particularly the substantial cost of processing extensive data and the limitations of computational resources. Inspired by the Neural Image Compression (NIC) task in computer vision, this study seeks to compress weather data to address these challenges and enhance the efficiency of downstream applications. Specifically, we propose a variational autoencoder (VAE) framework tailored for compressing high-resolution datasets, in particular the High Resolution China Meteorological Administration Land Data Assimilation System (HRCLDAS) with a spatial resolution of 1 km. Our framework reduced the storage size of three years of HRCLDAS data from 8.61 TB to just 204 GB, while preserving essential information. In addition, we demonstrated the utility of the compressed data through a downscaling task, where a model trained on the compressed dataset achieved accuracy comparable to that of a model trained on the original data. These results highlight the effectiveness and potential of compressed data for future weather research.
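For illustration, the following is a minimal sketch (assuming PyTorch) of the kind of convolutional VAE that can encode gridded weather fields into a compact latent representation. The layer sizes, channel counts, downsampling factor, and loss weighting are assumptions for illustration only, not the architecture described in the paper.

```python
# Minimal sketch of a convolutional VAE-style compressor for gridded weather
# fields (e.g. t2m, u10, v10). All layer sizes, channel counts, and the 4x
# spatial downsampling are illustrative assumptions, not the paper's setup.
import torch
import torch.nn as nn


class WeatherVAE(nn.Module):
    def __init__(self, in_channels=3, latent_channels=8):
        super().__init__()
        # Encoder: two stride-2 convolutions -> 4x spatial downsampling.
        self.encoder = nn.Sequential(
            nn.Conv2d(in_channels, 64, kernel_size=3, stride=2, padding=1),
            nn.GELU(),
            nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=1),
            nn.GELU(),
        )
        # Separate heads predict the mean and log-variance of the latent.
        self.to_mu = nn.Conv2d(128, latent_channels, kernel_size=1)
        self.to_logvar = nn.Conv2d(128, latent_channels, kernel_size=1)
        # Decoder mirrors the encoder with transposed convolutions.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(latent_channels, 128, kernel_size=4, stride=2, padding=1),
            nn.GELU(),
            nn.ConvTranspose2d(128, 64, kernel_size=4, stride=2, padding=1),
            nn.GELU(),
            nn.Conv2d(64, in_channels, kernel_size=3, padding=1),
        )

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        # Reparameterisation trick: z = mu + sigma * eps during training.
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        return self.decoder(z), mu, logvar


if __name__ == "__main__":
    model = WeatherVAE()
    x = torch.randn(1, 3, 256, 256)          # normalised weather tile
    x_hat, mu, logvar = model(x)
    # Storing the latent mean instead of x is what yields the compression:
    # here 3x256x256 values shrink to 8x64x64 latent values.
    recon = nn.functional.mse_loss(x_hat, x)
    kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    loss = recon + 1e-4 * kl                  # beta weight is an assumption
    print(x_hat.shape, mu.shape, float(loss))
```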
Status: open (until 17 Jan 2025)
RC1: 'Comment on egusphere-2024-3183', Anonymous Referee #1, 15 Nov 2024
This paper introduces a VAE-based data compression method for high-resolution weather data and shows its potential application in training a downscaling model on the latent representation. The method in this paper is hardly novel, since both VAE and UNet are commonly used in related fields. The claimed 43x compression ratio comes purely from the downsampling CNN in the VAE. Usually a neural image compression method would use vector quantization and/or entropy encoding in combination with a VAE. Interestingly, none of the neural image compression methods is used as a baseline for compression. In fact, there is no baseline in the compression part. The authors are advised to use at least one established compression method as a baseline (some can be found in this repo: https://interdigitalinc.github.io/CompressAI/).
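For reference, below is a minimal sketch of how a pretrained CompressAI model could serve as such a baseline. The choice of bmshj2018_factorized, the quality level, and the packing of three normalised weather fields into a three-channel tensor are assumptions for illustration, not a prescription from the review.

```python
# Sketch of a possible compression baseline using CompressAI
# (pip install compressai). Model choice, quality level, and the packing of
# three weather fields (t2m, u10, v10) into an RGB-like tensor are assumptions.
import torch
from compressai.zoo import bmshj2018_factorized

net = bmshj2018_factorized(quality=4, pretrained=True).eval()  # downloads weights on first use
net.update()  # build the entropy-coder tables needed for compress/decompress

# Pretend tile: three fields normalised to [0, 1] on a 256 x 256 grid.
x = torch.rand(1, 3, 256, 256)

with torch.no_grad():
    enc = net.compress(x)                                   # entropy-coded bitstrings
    x_hat = net.decompress(enc["strings"], enc["shape"])["x_hat"]

num_pixels = x.size(0) * x.size(2) * x.size(3)
num_bytes = sum(len(s[0]) for s in enc["strings"])          # batch size 1
bpp = 8.0 * num_bytes / num_pixels                          # bits per grid point
rmse = torch.sqrt(torch.mean((x - x_hat) ** 2))
print(f"baseline: {bpp:.3f} bpp, RMSE {rmse:.4f} (normalised units)")
```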
Minor points
- The HRCLDAS data is not openly available, thus it is not possible to reproduce the results.
- It would be nice to have a power spectrum plot for the compression part (like Fig. 6); see the sketch after this list.
- The evaluation only considers t2m, u10 and v10. Including other variables, especially in Table 3, would be better.
- ERA5 should be much larger than 226 TB (as claimed). The pressure-level data is at least 2 PB and the model-level data is at least 5 PB.
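As a concrete illustration of the suggested power spectrum diagnostic, below is a minimal sketch (assuming NumPy) of a radially averaged power spectrum comparison between an original and a reconstructed field; the arrays, grid, and binning are placeholders.

```python
# Sketch of a radially averaged (isotropic) power spectrum, the kind of
# diagnostic suggested for the compression results. Inputs are placeholders;
# grid spacing and binning are assumptions.
import numpy as np

def radial_power_spectrum(field):
    """Return wavenumber bins and azimuthally averaged power of a 2-D field."""
    ny, nx = field.shape
    power = np.abs(np.fft.fftshift(np.fft.fft2(field))) ** 2
    ky = np.fft.fftshift(np.fft.fftfreq(ny)) * ny
    kx = np.fft.fftshift(np.fft.fftfreq(nx)) * nx
    kr = np.sqrt(kx[None, :] ** 2 + ky[:, None] ** 2)
    bins = np.arange(0.5, min(nx, ny) // 2, 1.0)
    k = 0.5 * (bins[:-1] + bins[1:])
    spectrum = np.array([power[(kr >= lo) & (kr < hi)].mean()
                         for lo, hi in zip(bins[:-1], bins[1:])])
    return k, spectrum

# Compare original vs reconstructed field (placeholder random data here).
original = np.random.rand(256, 256)
reconstructed = original + 0.01 * np.random.randn(256, 256)
k, ps_orig = radial_power_spectrum(original)
_, ps_rec = radial_power_spectrum(reconstructed)
print("high-wavenumber power ratio:", ps_rec[-10:].mean() / ps_orig[-10:].mean())
```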
Citation: https://doi.org/10.5194/egusphere-2024-3183-RC1
CEC1: 'Comment on egusphere-2024-3183 - No compliance with the policy of the journal', Juan Antonio Añel, 02 Dec 2024
Dear authors,
Unfortunately, after checking your manuscript, it has come to our attention that it does not comply with our "Code and Data Policy".
https://www.geoscientific-model-development.net/policies/code_and_data_policy.html
To assure the replicability of your submitted work, you must publish in a permanent repository all the data that you use to train your model and the output data obtained with it. In the case of your work, this includes the HRCLDAS and the FuXi-2.0 data.
I should note that, given this lack of compliance with our policy, your manuscript should not have been accepted for Discussions, and therefore the current situation with your manuscript is irregular. Please publish the requested data in one of the appropriate repositories and reply to this comment with the relevant information (link and a permanent identifier for it, e.g. a DOI) as soon as possible, as we can not accept manuscripts in Discussions that do not comply with our policy.
Also, you must include in a potentially revised manuscript a modified 'Code and Data Availability' section containing the DOIs of the new repositories.
I have to note that if you do not fix this problem as soon as possible, we will have to reject your manuscript for publication in our journal.
Juan A. Añel
Geosci. Model Dev. Executive Editor
Citation: https://doi.org/10.5194/egusphere-2024-3183-CEC1
AC1: 'Reply on CEC1', Bing Gong, 16 Dec 2024
Dear Editor,
Thank you so much for your feedback. We are now preparing our data and code for publication and will address this issue as soon as possible.
Citation: https://doi.org/10.5194/egusphere-2024-3183-AC1
AC2: 'Reply on CEC1', Bing Gong, 20 Dec 2024
Dear Editor,
Due to confidentiality agreements and data privacy concerns, we are unable to publish the full original dataset. However, we have provided a subset of processed data, in compliance with ethical and legal guidelines. The exact version of the code and the data samples associated with this paper are archived on Zenodo at https://doi.org/10.5281/zenodo.14537263 (Liu et al., 2024) under an MIT license (http://opensource.org/licenses/mit-license.php, last access: 20 Dec 2024). Further guidelines to run the code, train the models, and generate the results presented in this paper are provided in the README.md file of the code repository.
We will add the above to our revised 'Code and Data Availability' section.
Thank you so much.
Best,
Bing
Citation: https://doi.org/10.5194/egusphere-2024-3183-AC2
Viewed
- HTML: 230
- PDF: 73
- XML: 17
- Total: 320
- BibTeX: 5
- EndNote: 4