<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v3.0 20080202//EN" "https://jats.nlm.nih.gov/nlm-dtd/publishing/3.0/journalpublishing3.dtd">
<article xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" article-type="methods-article" specific-use="SMUR" dtd-version="3.0" xml:lang="en">
<front>
<journal-meta>
<journal-id journal-id-type="publisher">EGUsphere</journal-id>
<journal-title-group>
<journal-title>EGUsphere</journal-title>
<abbrev-journal-title abbrev-type="publisher">EGUsphere</abbrev-journal-title>
<abbrev-journal-title abbrev-type="nlm-ta">EGUsphere</abbrev-journal-title>
</journal-title-group>
<issn pub-type="epub"></issn>
<publisher><publisher-name>Copernicus Publications</publisher-name>
<publisher-loc>Göttingen, Germany</publisher-loc>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="doi">10.5194/egusphere-2026-2075</article-id>
<title-group>
<article-title>Technical note: A Recognition-assisted Camera for Automated Microscopy (RaCAM)</article-title>
</title-group>
<contrib-group><contrib contrib-type="author" xlink:type="simple"><name name-style="western"><surname>Tetard</surname>
<given-names>Martin</given-names>
</name>
<ext-link ext-link-type="uri" xlink:href="https://orcid.org/0000-0003-0487-1949">https://orcid.org/0000-0003-0487-1949</ext-link>
<xref ref-type="aff" rid="aff1">
<sup>1</sup>
</xref>
</contrib>
<contrib contrib-type="author" xlink:type="simple"><name name-style="western"><surname>Marchant</surname>
<given-names>Ross</given-names>
</name>
<ext-link ext-link-type="uri" xlink:href="https://orcid.org/0000-0003-2248-0378">https://orcid.org/0000-0003-2248-0378</ext-link>
<xref ref-type="aff" rid="aff2">
<sup>2</sup>
</xref>
</contrib>
</contrib-group><aff id="aff1">
<label>1</label>
<addr-line>ESNZ, Lower Hutt, New Zealand</addr-line>
</aff>
<aff id="aff2">
<label>2</label>
<addr-line>InFarm, Goondiwindi, Australia</addr-line>
</aff>
<pub-date pub-type="epub">
<day>22</day>
<month>04</month>
<year>2026</year>
</pub-date>
<volume>2026</volume>
<fpage>1</fpage>
<lpage>18</lpage>
<permissions>
<copyright-statement>Copyright: &#x000a9; 2026 Martin Tetard</copyright-statement>
<copyright-year>2026</copyright-year>
<license license-type="open-access">
<license-p>This work is licensed under the Creative Commons Attribution 4.0 International License. To view a copy of this license, visit <ext-link ext-link-type="uri" xlink:href="https://creativecommons.org/licenses/by/4.0/">https://creativecommons.org/licenses/by/4.0/</ext-link></license-p>
</license>
</permissions>
<self-uri xlink:href="https://egusphere.copernicus.org/preprints/2026/egusphere-2026-2075/">This article is available from https://egusphere.copernicus.org/preprints/2026/egusphere-2026-2075/</self-uri>
<self-uri xlink:href="https://egusphere.copernicus.org/preprints/2026/egusphere-2026-2075/egusphere-2026-2075.pdf">The full text article is available as a PDF file from https://egusphere.copernicus.org/preprints/2026/egusphere-2026-2075/egusphere-2026-2075.pdf</self-uri>
<abstract>
<p>Automated microscopy workflows, including image acquisition, processing, and recognition using artificial intelligence (AI), are attracting growing interest from the biogeosciences community, as more and more research institutes are building image datasets to train convolutional neural networks (CNNs) to identify microscopic objects.</p>
<p>Here, we present a new, affordable, AI-assisted, Raspberry Pi-powered camera with the first built-in, fully automated microscopy workflow (including automated image acquisition, processing, and recognition) that fits any microscope equipped with a regular C-mount (or CS-mount) camera thread. The camera combines an integrated single-board computer (Raspberry Pi 5) and a high-resolution camera sensor (12.3 MP), attached together with a 3D-printable adaptor. Using new, freely downloadable, open-source software (the RaCAM user interface), written in Python, the camera can automatically acquire field-of-view images, segment each visible object of interest, and identify the objects using trained CNN ONNX models within a few seconds, as part of a single automated workflow.</p>
<p>The camera is also suited to field tasks such as core description, biostratigraphy, and even palaeoenvironmental reconstructions based on microfossil census data or morphometry, as it operates without a separate computer and can run directly from a power bank. Finally, because the RaCAM workflow relies on images captured directly by the camera, applications can also extend beyond microscopy and micropaleontology: virtually any picture acquired with this device can be processed by the automated workflow.</p>
</abstract>
<counts><page-count count="18"/></counts>
<funding-group>
<award-group id="gs1">
<funding-source>National Science Foundation</funding-source>
<award-id>2035029</award-id>
<award-id>2034719</award-id>
<award-id>2034883</award-id>
<award-id>2034990</award-id>
<award-id>2034999</award-id>
<award-id>2035035</award-id>
<award-id>2035138</award-id>
</award-group>
<award-group id="gs2">
<funding-source>Deutsche Forschungsgemeinschaft</funding-source>
<award-id>KU 4292/1-1</award-id>
<award-id>MU 3670/3-1</award-id>
<award-id>KL 3314/4-1</award-id>
</award-group>
<award-group id="gs3">
<funding-source>Antarctica New Zealand</funding-source>
<award-id>K862A</award-id>
</award-group>
</funding-group>
</article-meta>
</front>
<body/>
<back>
</back>
</article>