Abstract. The Climate Change Adaptation Digital Twin (Climate DT), developed as part of the European Commission’s Destination Earth (DestinE) initiative, is a pioneering effort to build an operational climate information system in support of adaptation. This system produces global climate simulations with local granularity, providing information at scales that matter for decision-making. The Climate DT delivers multi-decadal climate simulations at spatial resolutions of 5–10 km, with hourly outputs, offering globally consistent, frequently updated data. The km-scale simulations address some limitations of current climate models, improving local granularity and reducing longstanding biases, supporting more equitable (understood as accessible and relevant across regions) and credible climate information. The Climate DT is built on cutting-edge infrastructure, expert collaboration, and digital innovation. It supports real-time, on-demand responses to policy questions, with quantified uncertainty. It fosters interactivity by allowing users to influence simulation design, model outputs, and applications through co-design. AI-based tools, including emulators and chatbots, are being developed to enhance flexible scenario exploration and ease climate information access. Sector-specific applications are embedded in the system to generate tailored climate-impact indicators, with examples for energy, water, and forest management. The applications have been co-designed with informed users. An important innovation is the use of high-resolution storylines. These are physically consistent simulations of extreme events under different climate conditions that provide contextual insights to support concrete adaptation decisions. A unified workflow across platforms orchestrates all components, ensuring automation, containerisation for portability, and traceability. Unified data management ensures consistency and eases the use of the data. Data are delivered on standard grids (HEALPix) at high frequency (hourly) and follow a strict governance policy. Streaming enables real-time data use and unlocks access to the unprecedented volume of data produced by the high-resolution simulations. Monitoring tools provide real-time quality control of both data and models, as well as diagnostics during Climate DT operation. The compute-intensive system is powered by world-class supercomputing capabilities through a strategic partnership with the European High Performance Computing Joint Undertaking (EuroHPC). Despite high computational demands, the Climate DT sets a new benchmark for delivering equitable, credible, and actionable climate information. In this way, it complements existing initiatives like CMIP, CORDEX, and national and European climate services, and aligns with global climate science goals for climate adaptation.
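For readers unfamiliar with the HEALPix output convention mentioned in the abstract, the sketch below illustrates how a field stored on a HEALPix grid can be sampled at a point of interest with the healpy library. It is a minimal, illustrative example only: the resolution parameter, variable names, and dummy data are assumptions, and it does not represent the Climate DT's actual data-access or streaming interface.

```python
# Minimal sketch: sampling a HEALPix-gridded field at a location with healpy.
# Assumptions: the field is already in memory as a NumPy array; nside, the
# variable name (t2m), and the dummy values are illustrative only.
import numpy as np
import healpy as hp

nside = 64                    # coarse resolution for the sketch; operational output uses much finer grids
npix = hp.nside2npix(nside)   # number of HEALPix pixels covering the globe

# Hypothetical hourly 2 m temperature for one day: shape (hours, pixels)
t2m = 280.0 + np.random.randn(24, npix)

# Find the pixel containing a point of interest (longitude/latitude in degrees)
lon, lat = 2.17, 41.39        # example coordinates (Barcelona)
ipix = hp.ang2pix(nside, lon, lat, nest=True, lonlat=True)

# Extract the local hourly series and compute a simple daily-mean indicator
local_series = t2m[:, ipix]
print(f"Pixel {ipix}: daily-mean 2 m temperature = {local_series.mean():.2f} K")
```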
Received: 15 May 2025 – Discussion started: 13 Aug 2025
Publisher's note: Copernicus Publications remains neutral with regard to jurisdictional claims made in the text, published maps, institutional affiliations, or any other geographical representation in this paper. While Copernicus Publications makes every effort to include appropriate place names, the final responsibility lies with the authors. Views expressed in the text are those of the authors and do not necessarily reflect the views of the publisher.
This paper presents an impressive undertaking to address the significant gaps in the ability to provide information from climate models that is user-relevant and accessible. The extent of the technical challenges that have been addressed by this project to fill these needs is incredible. I see this as a framework that could potentially be expanded to S2S (subseasonal-to-seasonal) predictions as well. There are a few minor suggestions that I have for making this paper more accessible to a broader audience.
1. There are a lot of acronyms, project names, and technical jargon throughout this paper. Any effort to reduce this would make the paper more accessible.
2. It comes across as if this system can do everything in the climate information space. As with anything that is trying to accommodate many users and meet a wide range of needs, I expect there are some limitations and challenges to doing this. Any effort to discuss these would be helpful.
3. I realize this paper and journal are focused on geoscience model development. I also think the user-focused approach and capabilities are such an important part of this work that they could be expanded a little more here. It seems like a complicated system. How does a user get training to set up and use the system for their specific needs and get involved in co-production efforts?
The Climate Change Adaptation Digital Twin (Climate DT) pioneers the operationalisation of climate projections. The system produces global simulations with local granularity for adaptation decision-making. Applications are embedded to generate tailored indicators. A unified workflow orchestrates all components across several supercomputers. Data management ensures consistency and streaming enables real-time use. It is a complementary innovation to initiatives like CMIP, CORDEX, and climate services.