Preprints
https://doi.org/10.5194/egusphere-2026-2075
22 Apr 2026
Status: this preprint is open for discussion and under review for Biogeosciences (BG).

Technical note: a Recognition-assisted Camera for Automated Microscopy (RaCAM)

Martin Tetard and Ross Marchant

Abstract. Automated microscopy workflows, including image acquisition, processing, and recognition using artificial intelligence (AI), are attracting growing interest from the scientific community in biogeosciences, as more and more research institutes are actively building image datasets to train convolutional neural networks (CNNs) to identify microscopic objects.

Here, we present a new, affordable, AI-assisted, Raspberry Pi-powered camera with the first built-in, fully automated microscopy workflow (including automated image acquisition, processing, and recognition) that can fit any microscope equipped with a regular C-mount (or CS-mount) camera thread. The camera combines an integrated single-board computer (Raspberry Pi 5) and a high-resolution camera sensor (12.3 MP), attached together using a 3D-printable adaptor. Using new open-source software (the RaCAM user interface), written in Python and also freely downloadable, the camera can automatically acquire field-of-view images, segment each visible object of interest, and identify the objects using trained CNN ONNX models within a few seconds, as part of a fully automated workflow.
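The segmentation step of such a workflow can be illustrated with a minimal sketch (a hypothetical reimplementation, not the authors' RaCAM code): threshold a grayscale field-of-view image, group bright pixels into connected components, and return one bounding box per detected object. Each crop could then be passed to a trained CNN exported as an ONNX model, e.g. via `onnxruntime.InferenceSession`.

```python
import numpy as np
from collections import deque

def segment_objects(image, threshold=128, min_area=4):
    """Return bounding boxes (top, left, bottom, right) of bright objects.

    Hypothetical example: real workflows would typically use a library
    segmenter (e.g. OpenCV connected components) and feed each crop to
    a CNN classifier loaded from an ONNX file.
    """
    mask = image > threshold
    visited = np.zeros_like(mask, dtype=bool)
    boxes = []
    h, w = mask.shape
    for sy in range(h):
        for sx in range(w):
            if mask[sy, sx] and not visited[sy, sx]:
                # Breadth-first flood fill over 4-connected neighbours.
                queue = deque([(sy, sx)])
                visited[sy, sx] = True
                pixels = []
                while queue:
                    y, x = queue.popleft()
                    pixels.append((y, x))
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not visited[ny, nx]:
                            visited[ny, nx] = True
                            queue.append((ny, nx))
                if len(pixels) >= min_area:  # drop noise specks
                    ys, xs = zip(*pixels)
                    boxes.append((min(ys), min(xs), max(ys) + 1, max(xs) + 1))
    return boxes

# Synthetic field of view with two bright "microfossils".
fov = np.zeros((64, 64), dtype=np.uint8)
fov[10:18, 10:18] = 255
fov[40:50, 30:44] = 200
print(segment_objects(fov))  # one bounding box per object
```

The same pattern generalises to any picture captured by the device, which is why the workflow is not limited to microscopy imagery.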

The camera is also suited to field tasks such as core description, biostratigraphy, or even palaeoenvironmental reconstructions based on microfossil census data or morphometry, as it can operate without a separate computer and run directly from a power bank. Finally, as the RaCAM workflow relies on images captured directly by the camera, applications can also extend beyond the microscopy and micropaleontology research fields, since virtually any picture acquired with this device can be processed by the automated workflow.

Publisher's note: Copernicus Publications remains neutral with regard to jurisdictional claims made in the text, published maps, institutional affiliations, or any other geographical representation in this paper. While Copernicus Publications makes every effort to include appropriate place names, the final responsibility lies with the authors. Views expressed in the text are those of the authors and do not necessarily reflect the views of the publisher.

Status: open (until 11 Jun 2026)


Viewed

Total article views: 217 (including HTML, PDF, and XML)
  • HTML: 162
  • PDF: 39
  • XML: 16
  • Total: 217
  • BibTeX: 14
  • EndNote: 14
Cumulative views and downloads (calculated since 22 Apr 2026)

Viewed (geographical distribution)

Total article views: 217 (including HTML, PDF, and XML), of which 217 with geography defined and 0 with unknown origin.
Latest update: 13 May 2026
Short summary
We designed an affordable, AI-assisted camera that can be screwed on top of most microscopes (or used as a regular camera) and allows an automated workflow, including image capture, processing, and identification of detected objects using artificial neural networks, to be performed. As this camera runs on a micro-computer and can be powered by a regular power bank, it is ideal for performing microscopy and computing tasks directly in the field without the need for an expert to be deployed.