AngleCam V2: Predicting leaf inclination angles across taxa from daytime and nighttime photos
Abstract. Understanding how plants capture light and maintain their energy balance is crucial for predicting how ecosystems respond to environmental changes. By monitoring leaf inclination angle distributions (LIADs), we can gain insights into plant behaviour that directly influences ecosystem functioning. LIADs affect radiative transfer processes and reflectance signals, which are essential components of satellite-based vegetation monitoring. Despite their importance, scalable methods for continuously observing these dynamics across different plant species throughout day-night cycles are limited.
We present AngleCam V2, a deep learning model that estimates LIADs from both RGB and near-infrared (NIR) night-vision imagery. We compiled a dataset of over 4,500 images across 200 globally distributed species to facilitate generalization across taxa. Moreover, we developed a method to simulate pseudo-NIR imagery from RGB imagery, enabling efficient training of a deep learning model for tracking LIADs across day and night. The model is based on a vision transformer architecture with mixed-modality training on the RGB and synthetic NIR images.
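The abstract does not specify how pseudo-NIR imagery is simulated from RGB. As a minimal sketch of the general idea only, and not the authors' actual transform, one could exploit the fact that healthy foliage reflects strongly in the NIR while absorbing red light, mapping vegetation pixels (high green, low red) to bright values; the weights below are purely illustrative:

```python
import numpy as np

def pseudo_nir(rgb: np.ndarray) -> np.ndarray:
    """Simulate a NIR-like grayscale image from an RGB image.

    Hypothetical channel mixing (NOT the AngleCam V2 method): healthy
    leaves reflect strongly in NIR, so pixels with high green and low
    red reflectance are mapped to bright values.
    """
    x = rgb.astype(np.float32) / 255.0
    r, g, b = x[..., 0], x[..., 1], x[..., 2]
    # Emphasize green (vegetation brightness), penalize red absorption
    nir = np.clip(0.2 + 1.2 * g - 0.4 * r + 0.1 * b, 0.0, 1.0)
    return (nir * 255).astype(np.uint8)

# A leaf-like pixel maps brighter than a neutral gray (soil-like) pixel
leaf = np.array([[[30, 200, 40]]], dtype=np.uint8)
soil = np.array([[[120, 120, 120]]], dtype=np.uint8)
```

Such a synthetic channel lets a single model see "night-like" views of every daytime training image, which is presumably what makes the mixed-modality training data-efficient.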
AngleCam V2 achieved substantial improvements in generalization compared to AngleCam V1 (R² = 0.62 vs 0.12 on the same holdout dataset). Phylogenetic analysis across 100 genera revealed no systematic taxonomic bias in prediction errors. Testing against leaf angle dynamics obtained from multitemporal terrestrial laser scanning demonstrated the reliable tracking of diurnal leaf movements (R² = 0.61–0.75) and the successful detection of water limitation-induced changes over a 14-day monitoring period.
This method enables continuous monitoring of leaf angle dynamics with conventional cameras, opening applications in ecosystem monitoring networks, plant stress detection, the interpretation of satellite vegetation signals, and citizen science platforms for a global-scale understanding of plant structural responses.
Leaf angle distribution is an important canopy structural variable, but its temporal and spatial dynamics are far less well understood than those of LAI, primarily because no reliable method yet exists to measure it at large spatial scales. This work developed AngleCam V2, which uses a deep learning model to estimate leaf inclination angle distributions from RGB and night-vision NIR imagery. By expanding the training data across species, canopy structures, and lighting conditions, and by integrating synthetic near-infrared augmentation, the developed model shows improved generalization and temporal robustness, including nighttime compatibility, which is promising for large-scale applications.

The current version is well written and clearly presents the advantages of the new version of AngleCam. My only concern is the accuracy of the training data, in terms of the reliability of the visual interpretation, the use of only 20 leaf samples, and their representativeness of the entire canopy.
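The sampling concern above can be made concrete with a quick back-of-the-envelope simulation. Assuming, purely for illustration, that the canopy-wide angle distribution follows a beta(2, 2) distribution scaled to 0–90° (a common parametric choice for LIADs, not a claim about the paper's data), one can check how much the mean of 20 sampled leaves typically deviates from the true canopy mean:

```python
import random

random.seed(0)

def sample_leaf_angle() -> float:
    """Draw one leaf inclination angle (degrees) from beta(2, 2) * 90."""
    return random.betavariate(2, 2) * 90.0

TRUE_MEAN = 45.0  # mean of beta(2, 2) is 0.5 -> 45 degrees

def mean_of_sample(n: int) -> float:
    """Mean inclination angle of n randomly sampled leaves."""
    return sum(sample_leaf_angle() for _ in range(n)) / n

# Repeat the 20-leaf labelling protocol many times and record how far
# each sample mean lands from the canopy-wide mean.
errors = [abs(mean_of_sample(20) - TRUE_MEAN) for _ in range(1000)]
typical_error = sum(errors) / len(errors)
```

Under these assumptions the typical error of a 20-leaf sample mean is on the order of a few degrees, which suggests the sample size may be adequate for mean angles but could be limiting for resolving the full distribution shape.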
Major concerns:
Minor concerns: