Preprints
https://doi.org/10.5194/egusphere-2025-662
24 Feb 2025

From Ground Photos to Aerial Insights: Automating Citizen Science Labeling for Tree Species Segmentation in UAV Images

Salim Soltani, Lauren E. Gillespie, Moises Exposito-Alonso, Olga Ferlian, Nico Eisenhauer, Hannes Feilhauer, and Teja Kattenborn

Abstract. Spatially accurate information on plant species is essential for biodiversity monitoring applications such as vegetation mapping. Unoccupied Aerial Vehicle (UAV)-based remote sensing combined with supervised Convolutional Neural Network (CNN)-based segmentation methods has enabled accurate segmentation of plant species. However, labeling training data for supervised CNN methods in vegetation monitoring is a resource-intensive task, particularly for large-scale remote sensing datasets. This study presents an automated workflow that integrates the Segment Anything Model (SAM) with Gradient-weighted Class Activation Mapping (Grad-CAM) to generate segmentation masks for citizen science plant photographs, reducing the effort required for manual annotation. We evaluated the workflow by using the generated masks to train CNN-based segmentation models to segment 10 broadleaf tree species in UAV images. The results demonstrate that segmentation models can be trained directly on citizen science-sourced plant photographs, with mask generation automated and no extensive manual labeling required. Despite the inherent complexity of segmenting broadleaf tree species, the models achieved acceptable overall performance. Towards efficiently monitoring vegetation dynamics across space and time, this study highlights the potential of integrating foundation models, citizen science data, and remote sensing into automated vegetation mapping workflows, providing a scalable and cost-effective solution for biodiversity monitoring.
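The core idea of the workflow — using a class-activation heatmap to localize the target species in a photo and then prompting SAM with that location — can be illustrated with a minimal sketch. This is not the authors' code: the function names, array shapes, and the single-point-prompt strategy are illustrative assumptions, and the heavy model calls (CNN backbone, SAM) are replaced here by plain NumPy arrays.

```python
import numpy as np

def grad_cam(activations: np.ndarray, gradients: np.ndarray) -> np.ndarray:
    """Grad-CAM on one feature map stack of shape (C, H, W):
    weight each channel's activation map by its mean gradient,
    sum over channels, apply ReLU, and normalize to [0, 1]."""
    weights = gradients.mean(axis=(1, 2))            # (C,) channel importance
    cam = np.einsum("c,chw->hw", weights, activations)
    cam = np.maximum(cam, 0)                         # keep positive evidence only
    if cam.max() > 0:
        cam /= cam.max()
    return cam

def cam_to_point_prompt(cam: np.ndarray) -> tuple[int, int]:
    """Take the heatmap maximum as a foreground point prompt (x, y),
    which could then be passed to a SAM predictor."""
    y, x = np.unravel_index(np.argmax(cam), cam.shape)
    return int(x), int(y)

# Toy stand-in for real CNN features: one channel peaking at row 2, col 3.
acts = np.zeros((1, 4, 5)); acts[0, 2, 3] = 1.0
grads = np.ones((1, 4, 5))
cam = grad_cam(acts, grads)
print(cam_to_point_prompt(cam))   # → (3, 2)
```

In the actual pipeline, `activations` and `gradients` would come from the species-classification CNN's last convolutional layer, and the resulting prompt would seed SAM's mask prediction for the photographed plant.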

Publisher's note: Copernicus Publications remains neutral with regard to jurisdictional claims made in the text, published maps, institutional affiliations, or any other geographical representation in this paper. While Copernicus Publications makes every effort to include appropriate place names, the final responsibility lies with the authors. Views expressed in the text are those of the authors and do not necessarily reflect the views of the publisher.

Journal article(s) based on this preprint

06 Nov 2025 | Highlight paper
Automated mask generation in citizen science smartphone photos and their value for mapping plant species in drone imagery
Salim Soltani, Lauren E. Gillespie, Moises Exposito-Alonso, Olga Ferlian, Nico Eisenhauer, Hannes Feilhauer, and Teja Kattenborn
Biogeosciences, 22, 6545–6561, https://doi.org/10.5194/bg-22-6545-2025, 2025

Interactive discussion

Status: closed

Comment types: AC – author | RC – referee | CC – community | EC – editor | CEC – chief editor
  • RC1: 'Comment on egusphere-2025-662', Anonymous Referee #1, 18 Mar 2025
    • AC1: 'Response to Reviewer 1 Comments', Salim Soltani, 09 May 2025
  • RC2: 'Comment on egusphere-2025-662', Anonymous Referee #2, 19 Apr 2025
    • AC2: 'Response to Reviewer 2 Comments', Salim Soltani, 09 May 2025


Peer review completion

AR: Author's response | RR: Referee report | ED: Editor decision | EF: Editorial file upload
ED: Reconsider after major revisions (18 May 2025) by Andrew Feldman
AR by Salim Soltani on behalf of the Authors (22 Jun 2025)
ED: Referee Nomination & Report Request started (29 Jun 2025) by Andrew Feldman
RR by Anonymous Referee #1 (18 Aug 2025)
RR by Anonymous Referee #2 (28 Aug 2025)
ED: Publish subject to minor revisions (review by editor) (02 Sep 2025) by Andrew Feldman
AR by Salim Soltani on behalf of the Authors (05 Sep 2025)
ED: Publish as is (18 Sep 2025) by Andrew Feldman
AR by Salim Soltani on behalf of the Authors (25 Sep 2025)


Viewed

Total article views: 1,038 (including HTML, PDF, and XML)
  • HTML: 854
  • PDF: 153
  • XML: 31
  • Total: 1,038
  • BibTeX: 23
  • EndNote: 40
Views and downloads (calculated since 24 Feb 2025)

Viewed (geographical distribution)

Total article views: 1,070 (including HTML, PDF, and XML), of which 1,070 have a geography defined and 0 are of unknown origin.
Latest update: 06 Nov 2025
Download

The requested preprint has a corresponding peer-reviewed final revised paper. You are encouraged to refer to the final revised version.

Short summary
We introduce an automated approach for generating segmentation masks for citizen science plant photos, making them usable by computer vision models. This framework effectively transforms citizen science data into a rich training resource for plant species segmentation in aerial imagery. Using automatically labeled photos, we train segmentation models for mapping tree species in drone imagery, showcasing their potential for forestry, agriculture, and biodiversity monitoring.