This work is distributed under the Creative Commons Attribution 4.0 License.
Short communication: Learning How Landscapes Evolve with Neural Operators
Abstract. The use of Fourier Neural Operators (FNOs) to learn how landscapes evolve is introduced. The approach makes use of recent developments in deep learning to learn the processes involved in evolving landscapes (e.g. erosion). An example is provided in which FNOs are developed using input-output pairs (elevations at different times) in synthetic landscapes generated using the stream power model (SPM). The SPM takes the form of a non-linear partial differential equation that advects slopes headwards. The results indicate that the learned operators can reliably and very rapidly predict subsequent landscape evolution at large scales. These results suggest that FNOs could be used to rapidly predict landscape evolution without recourse to the (slow) computation of flow routing and time stepping needed when generating numerical solutions to the SPM. More broadly they suggest that neural operators could be used to learn the processes that evolve actual and analogue landscapes.
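As background for the discussion below, a commonly used detachment-limited form of the stream power model is

$$\frac{\partial z}{\partial t} = U - K A^{m} \left|\nabla z\right|^{n},$$

where $z$ is elevation, $U$ an uplift rate, $A$ upstream drainage area, $|\nabla z|$ slope, and $K$, $m$, $n$ erosional parameters. This form is given here only for orientation; the manuscript's own formulation is its equation (1), and whether it includes an uplift term is not reproduced here. The non-linear dependence on slope is what makes the equation advect slopes headwards.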
Status: final response (author comments only)
RC1: 'Comment on egusphere-2025-307', Christoph Glotzbach, 13 Mar 2025
Review for Roberts submitted to EGU Earth Surface Dynamics
The manuscript applies a deep learning method called Neural Operators, or more specifically Fourier Neural Operators, to investigate whether they can accurately predict landscape evolution in models produced with the stream power model. The results indicate that the first-order landscape form (e.g. hypsometry) can be reliably and rapidly predicted, while fine-scale features (e.g. river valleys) are not. Although the results are promising, more work is required to evaluate the applicability of the approach, and I would like the author to state the limitations and the need for further detailed analyses in the abstract and conclusion.
Detailed comments:
Line 1-8: You may want to add the limitations at the end of the abstract, e.g. that it has not yet been explored whether the learned operators can be applied to other parameterisations.
Line 74: Please clarify how a time step of even 1 Myr is possible, given that the condition in Eq. 2 is not met.
Line 117: I guess that by 'epoch' you mean time steps. It might be good to use either 'time step' or 'epoch' consistently throughout the paper.
Line 118: Please define ‘learning rate’ in the previous paragraph.
Line 119: Can you explain in more detail what you mean by training and testing sets?
Line 144: Since one of the starting objectives is to find a faster alternative to conventional LEMs, it is quite important to state whether the trained operators can be used to run other scenarios, e.g. with different boundary conditions and parameterisations. Not much is gained if training needs to be repeated for each individual scenario solved, for instance in an inverse approach. Please state your experience.
Line 148-150: This might be a good application and would save not computational time but rather time in the lab. It might be worth mentioning in the conclusion/abstract.
Line 159: Please mention that the computational time can only be reduced in case the produced operators are applicable/scalable to other model parameterisations.
Equations:
Eq. 3: You may want to add spatial coordinates x,y to G, Z.
Figures:
Fig. 1 Caption: Change to ‘…used to train the…’
Fig. 1 Caption: Add the time step used; I guess 1 Ma.
Citation: https://doi.org/10.5194/egusphere-2025-307-RC1
RC2: 'Comment on egusphere-2025-307', Anonymous Referee #2, 16 Mar 2025
This manuscript briefly illustrates how a method from the deep learning community might be used as a computationally efficient emulator for numerical solutions to fluvial landscape evolution equations. Having such an emulator, especially one that is independent of grid resolution or particular parameter values, would be a big advantage in applications where a great many model solutions are needed (for example, in studies that use optimization to infer parameter values or boundary conditions from digital elevation data).
A strength of the manuscript is that it introduces readers to a promising technique from a different field, and demonstrates that it has some potential for geomorphology. Another strength is that the author's code and related files are available online for anyone to try out.
One limitation of the manuscript is that it does not provide details of the methodology, referring readers instead to a conference paper by Li et al. (2022). That paper introduces a deep learning approach for approximating the solution of PDEs, but it is geared toward an audience versed in the relevant applied mathematics; to understand it fully would probably require a fairly major effort for most geoscientists. Given that the author has put in the work to understand and apply the method of Li et al. (2022), it would add value to the manuscript if it were to provide a more extensive and practical translation for eSurf readers: a geoscience-friendly explanation of this novel method from the deep learning world.
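For readers who want a concrete picture, the core building block of an FNO is a 'Fourier layer': transform the field to Fourier space, keep a fixed block of low-frequency modes, multiply them by learned complex weights, and transform back. A minimal, untrained sketch in Python/NumPy is given below; it is not the author's implementation (that is available at the Zenodo link further down), and the names and mode count are purely illustrative.

```python
import numpy as np

def fourier_layer(z, weights, n_modes=12):
    """One untrained 'Fourier layer': FFT the field, re-weight a fixed block of
    low-frequency modes with (normally learned) complex weights, discard the
    rest, and transform back to physical space."""
    zh = np.fft.rfft2(z)                          # spectrum of the elevation grid
    out = np.zeros_like(zh)
    out[:n_modes, :n_modes] = zh[:n_modes, :n_modes] * weights
    return np.fft.irfft2(out, s=z.shape)          # back to an elevation-like field

# Toy usage with random stand-ins for a trained model's weights
rng = np.random.default_rng(0)
z = rng.standard_normal((128, 128))               # a synthetic elevation snapshot
w = rng.standard_normal((12, 12)) + 1j * rng.standard_normal((12, 12))
z_out = fourier_layer(z, w)
```

In a full FNO several such layers are stacked, each combined with a pointwise linear transform and a non-linearity, and the complex weights are fitted to the input-output elevation pairs by gradient descent; the truncation to low modes is also why the operator behaves like a low-pass filter.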
The manuscript demonstrates nicely that the FNO can capture the broad-wavelength pattern of terrain evolution. One shortcoming, which the manuscript acknowledges, is that the FNO loses the details of the valley networks. It would be interesting to see an interpretation of why this is the case. The manuscript notes that the Fourier operator acts as a low-pass filter. Could this be why the FNO models do not capture valley network features? It would be interesting also to know whether the FNO models can produce topography that drains (i.e., does not contain spurious internally drained basins).
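On the drainage question, one simple diagnostic (an illustrative sketch, not something taken from the manuscript) is to count interior cells that lie below all eight neighbours, i.e. pits that could not drain under D8-style routing:

```python
import numpy as np

def count_pits(z):
    """Count interior cells lower than all eight neighbours, i.e. spurious
    internally drained cells that cannot drain under D8-style routing."""
    interior = z[1:-1, 1:-1]
    is_pit = np.ones(interior.shape, dtype=bool)
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            if di == 0 and dj == 0:
                continue
            neighbour = z[1 + di:z.shape[0] - 1 + di, 1 + dj:z.shape[1] - 1 + dj]
            is_pit &= interior < neighbour
    return int(is_pit.sum())
```

Comparing such a count between FNO predictions and the reference SPM solutions would quantify whether the predicted topography drains.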
One question I had in reading the manuscript was whether the FNO might have better success if it were trained not just on topography but also on contributing drainage area. The text after line 90 refers to the calculation of A, which is presumably done by a routing algorithm rather than by a traditional solution to a PDE. Bonetti et al. (2020) pointed out that the calculation of A (in the form of specific contributing area, a) can be cast as the solution to the PDE:
$$-\nabla \cdot \left( a \, \frac{\nabla z}{\left|\nabla z\right|} \right) = 1$$
Viewed from this perspective, the numerical model in Figure 1 can be thought of as solving TWO coupled PDEs: the erosion law of equation (1), and the equation that governs A (or a). These two PDEs in combination produce the time-evolving fields of z and A. Perhaps the loss of valley features reflects training the FNO models only on z and not on a? Admittedly, testing this would probably involve considerably more work and additional material in the manuscript, which might be beyond the scope. But it does seem worth considering, either for this piece or for a possible follow-up one.
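If the author does try this, one hypothetical way to set it up (names and shapes are illustrative, not from the manuscript's code) is to give the operator two input channels, elevation and a transform of drainage area, and keep the target as the later elevation:

```python
import numpy as np

def make_training_pair(z_t, A_t, z_next):
    """Hypothetical two-channel training pair: inputs are elevation and
    log drainage area at time t (A computed by whatever routing scheme the
    model uses); the target is elevation at the next output time."""
    x = np.stack([z_t, np.log1p(A_t)], axis=0)   # shape (2, ny, nx)
    y = z_next[np.newaxis, ...]                  # shape (1, ny, nx)
    return x, y
```

The log transform is only a convenience to tame the large dynamic range of A; whether the extra channel actually restores the valley-network detail would need to be tested.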
The manuscript notes that 'A useful benefit of the neural operator approach is that, once the learning is done, future function spaces can be predicted very rapidly'. One minor suggestion is to explain what 'function spaces' means (though it's easy enough to guess that it refers to future values of the dependent variable z). More broadly, the Li paper argues that an advantage of the FNO approach is that it is not restricted to a particular numerical discretization or a particular set of parameter values. In other words, the FNO 'learns' the PDE itself somehow. If this is right, then one suggestion for the manuscript is to demonstrate this advantage by trying out models that use a different discretization or a different value of v from the training data.
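The discretisation point is easy to illustrate: because the learned weights act only on a fixed block of low-frequency modes, the same weights can in principle be applied to grids of different resolution. A tiny check of that fact, purely illustrative and independent of the manuscript's code:

```python
import numpy as np

# The retained low-frequency block has the same shape whatever the grid size,
# so weights learned on one resolution can, in principle, be applied to another.
n_modes = 12
for n in (64, 128, 256):
    spectrum = np.fft.rfft2(np.zeros((n, n)))
    print(n, spectrum[:n_modes, :n_modes].shape)   # always (12, 12)
```

Whether predictions made this way remain accurate for the SPM, or for a different value of v, is exactly the kind of test suggested above.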
Citation: https://doi.org/10.5194/egusphere-2025-307-RC2
Model code and software
Learning How Landscapes Evolve with Neural Operators Gareth G. Roberts https://doi.org/10.5281/zenodo.14616760