Discrimination of Hover Fly Species and Sexes by Wing Interference Signals
Meng Li, Anna Runemark, Julio Hernandez et al.

Advanced Science, Journal year: 2023, Issue 10(34)

Published: Oct. 17, 2023

Abstract Remote automated surveillance of insect abundance and diversity is poised to revolutionize insect decline studies. The study reveals that spectral analysis of thin‐film wing interference signals (WISs) can discriminate free‐flying insects beyond what can be accomplished by machine vision. Detectable by photonic sensors, WISs are robust indicators enabling species and sex identification. The first quantitative survey of wing thickness modulation, obtained through shortwave‐infrared hyperspectral imaging of 600 wings from 30 hover fly species, is presented. The fringy reflectance of a WIS is explained by four optical parameters, including membrane thickness. Using a Naïve Bayes classifier with five parameters that can be retrieved remotely, 91% accuracy is achieved in the identification of species and sexes. WIS‐based classification is therefore a potent tool for remote insect surveillance.
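The classification idea in this abstract can be illustrated with a minimal Gaussian Naïve Bayes sketch. This is not the authors' model or data: the five features below are synthetic stand-ins for the remotely retrievable wing parameters, and the two groups are hypothetical classes.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_gnb(X, y):
    """Estimate per-class feature means, variances, and class priors."""
    stats = {}
    for c in np.unique(y):
        Xc = X[y == c]
        stats[c] = (Xc.mean(axis=0), Xc.var(axis=0) + 1e-9, len(Xc) / len(X))
    return stats

def predict_gnb(stats, X):
    """Pick the class maximizing the log joint under feature independence."""
    scores = []
    for mu, var, prior in stats.values():
        log_lik = -0.5 * (np.log(2 * np.pi * var) + (X - mu) ** 2 / var)
        scores.append(log_lik.sum(axis=1) + np.log(prior))
    labels = list(stats.keys())
    return np.array([labels[i] for i in np.argmax(np.stack(scores, axis=1), axis=1)])

# Two synthetic "classes" separated in a 5-D parameter space.
X = np.vstack([rng.normal(0.0, 1.0, size=(200, 5)),
               rng.normal(2.0, 1.0, size=(200, 5))])
y = np.array([0] * 200 + [1] * 200)

model = fit_gnb(X, y)
acc = (predict_gnb(model, X) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

With well-separated synthetic classes the sketch classifies nearly perfectly; the paper's 91% figure of course reflects real, overlapping wing measurements.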

Language: English

Utilising affordable smartphones and open-source time-lapse photography for pollinator image collection and annotation
Valentin Ştefan, Aspen Workman, Jared C. Cobain et al.

Journal of Pollination Ecology, Journal year: 2025, Issue 37, pp. 1 - 21

Published: Jan. 10, 2025

Monitoring plant-pollinator interactions is crucial for understanding the factors influencing these relationships across space and time. Traditional methods in pollination ecology are resource-intensive, while time-lapse photography offers a potential non-destructive and automated complementary technique. However, accurate identification of pollinators at finer taxonomic levels (i.e., genus or species) requires sufficiently high image quality. This study assessed the feasibility of using an affordable smartphone setup to capture images of arthropods visiting flowers and evaluated whether the images offered sufficient resolution for arthropod identification by taxonomists. Smartphones were positioned above target flowers from various plant species in urban green areas around Leipzig and Halle, Germany. We present the proportions of identifications (instances) at different taxonomic levels (order, family, genus, species) based on visible features as interpreted by taxonomists, and we document limitations stemming from, e.g., fixed camera positioning preventing the capture of distinguishing features despite high resolution, or low image quality. Recommendations are provided to address these challenges. Our results indicate that 89.81% of all Hymenoptera instances were identified to family level, 84.56% of pollinator instances to genus level, but only 25.35% to species level. Taxonomists were less able to identify Dipterans at finer taxonomic levels, with nearly 50% of instances not identifiable to family level and only 26.18% and 15.19% identifiable to genus and species levels, respectively. This was due to their small size and the more challenging features needed for identification (e.g., wing veins). Advancing smartphone technology, along with its accessibility, affordability, and user-friendliness, makes it a promising option for coarse-level pollinator monitoring.

Language: English

Cited

2

Hierarchical classification of insects with multitask learning and anomaly detection
Kim Bjerge, Quentin Geissmann, Jamie Alison et al.

Ecological Informatics, Journal year: 2023, Issue 77, pp. 102278 - 102278

Published: Aug. 28, 2023

Cameras and computer vision are revolutionising the study of insects, creating new research opportunities within agriculture, epidemiology, evolution, ecology, and biodiversity monitoring. However, the diversity of insects and the close resemblances of many species present a major challenge for image-based species-level classification. Here, we present an algorithm to hierarchically classify insects from images, leveraging a simple taxonomy to (1) classify specimens across multiple taxonomic ranks simultaneously and (2) identify the lowest taxonomic rank at which reliable classification can be reached. Specifically, we propose multitask learning with a loss function incorporating class dependency between each rank, and anomaly detection based on outlier analysis to quantify uncertainty. First, we compile a dataset of 41,731 images, combining time-lapse images of floral scenes with images from the Global Biodiversity Information Facility (GBIF). Second, we adapt state-of-the-art convolutional neural networks, ResNet and EfficientNet, for the hierarchical classification of insects belonging to three orders, five families and nine species. Third, we assess model generalization on 11 species unseen by the trained models. The algorithm is used to predict the higher ranks of species that were not in the training set. We found that incorporating class dependency into our loss increased accuracy across ranks. As expected, models correctly classified insects at higher ranks, while classification was more uncertain at lower ranks. Anomaly detection effectively flagged novel taxa that were visually distinct from the training data, while novel taxa were consistently mistaken for visually similar species. Above all, we have demonstrated a practical approach to quantify uncertainty during automated in situ classification of live insects. Our method is versatile, forming a valuable step towards high-level automated insect monitoring.
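The loss design described above can be sketched in a few lines. This is an illustrative reconstruction, not the authors' implementation: it sums cross-entropy over two ranks and adds a consistency term tying each species prediction to its parent family through a hypothetical child-to-parent mapping (`SPECIES_TO_FAMILY` is made up for the example).

```python
import numpy as np

SPECIES_TO_FAMILY = np.array([0, 0, 1, 1, 2])  # hypothetical taxonomy: 5 species, 3 families

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def hierarchical_loss(species_logits, family_logits, species_y, family_y):
    """Cross-entropy at each rank plus a class-dependency consistency term."""
    p_species = softmax(species_logits)
    p_family = softmax(family_logits)
    # Family probability implied by the species head (sum over children)
    # should agree with the family head's own prediction.
    implied_family = np.zeros_like(p_family)
    for s, f in enumerate(SPECIES_TO_FAMILY):
        implied_family[:, f] += p_species[:, s]
    n = len(species_y)
    ce_species = -np.log(p_species[np.arange(n), species_y] + 1e-12).mean()
    ce_family = -np.log(p_family[np.arange(n), family_y] + 1e-12).mean()
    consistency = np.abs(implied_family - p_family).sum(axis=1).mean()
    return ce_species + ce_family + consistency

# One sample whose species and family heads both confidently agree on class 0.
logits_s = np.array([[4.0, 0.1, 0.1, 0.1, 0.1]])
logits_f = np.array([[4.0, 0.1, 0.1]])
loss = hierarchical_loss(logits_s, logits_f, np.array([0]), np.array([0]))
print(f"loss: {loss:.3f}")
```

When the two heads agree and are confident, all three terms are small; the consistency term grows if, say, the species head favours a species outside the predicted family.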

Language: English

Cited

27

Insect detect: An open-source DIY camera trap for automated insect monitoring
Maximilian Sittinger, Johannes Uhler, M. A. Pink et al.

PLoS ONE, Journal year: 2024, Issue 19(4), pp. e0295474 - e0295474

Published: April 3, 2024

Insect monitoring is essential to design effective conservation strategies, which are indispensable to mitigate worldwide insect declines and biodiversity loss. For this purpose, traditional monitoring methods are widely established and can provide data with a high taxonomic resolution. However, the processing of captured insect samples is often time-consuming and expensive, which limits the number of potential replicates. Automated monitoring methods can facilitate data collection at a higher spatiotemporal resolution with comparatively lower effort and cost. Here, we present the Insect Detect DIY (do-it-yourself) camera trap for non-invasive automated monitoring of flower-visiting insects, based on low-cost off-the-shelf hardware components combined with open-source software. Custom trained deep learning models detect and track insects landing on an artificial flower platform in real time on-device, and subsequently classify the cropped detections on a local computer. Field deployment of the solar-powered camera trap confirmed its resistance to high temperatures and humidity, which enables autonomous deployment during a whole season. On-device detection and tracking can estimate insect activity/abundance after metadata post-processing. Our classification model achieved a high top-1 accuracy on the test dataset and generalized well to real-world images. The camera trap software is highly customizable and can be adapted to different use cases. With custom trained detection and classification models, as well as accessible programming, many possible applications surpassing our proposed method can be realized.
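The detect-and-track step described above can be illustrated with a toy nearest-centroid tracker. This is a hedged sketch of the general idea (the trap's actual software is more sophisticated): detections in consecutive frames are matched to existing tracks by centroid distance, so one insect seen in many frames counts once toward an activity estimate.

```python
import math

def track(frames, max_dist=50.0):
    """frames: list of per-frame lists of (x, y) detection centroids.
    Returns the number of distinct tracks (a rough activity/abundance proxy)."""
    tracks = []      # last known centroid of each active track
    n_tracks = 0
    for dets in frames:
        unmatched = list(range(len(tracks)))
        new_tracks = []
        for (x, y) in dets:
            best, best_d = None, max_dist
            for i in unmatched:
                tx, ty = tracks[i]
                d = math.hypot(x - tx, y - ty)
                if d < best_d:
                    best, best_d = i, d
            if best is not None:
                unmatched.remove(best)   # continues an existing track
            else:
                n_tracks += 1            # a new insect entered the scene
            new_tracks.append((x, y))
        tracks = new_tracks
    return n_tracks

# One insect drifting across frames plus one brief second visitor -> 2 tracks.
frames = [[(10, 10)], [(14, 12), (100, 100)], [(18, 15)]]
print(track(frames))  # prints 2
```

Real pipelines add IoU-based matching, track timeouts, and metadata post-processing, but the counting logic is the same in spirit.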

Language: English

Cited

12

Overcoming biodiversity blindness: Secondary data in primary citizen science observations
Nadja Pernat, Susan Canavan, Marina Golivets et al.

Ecological Solutions and Evidence, Journal year: 2024, Issue 5(1)

Published: Jan. 1, 2024

Abstract In the face of the global biodiversity crisis, collecting comprehensive data and making the best use of existing data are becoming increasingly important to understand the patterns and drivers of environmental and biological phenomena at different scales. Here we address the concept of secondary data, which refers to additional information unintentionally captured in species records, especially in multimedia‐based citizen science reports. We argue that secondary data can provide a wealth of ecologically relevant information, the utilisation of which can enhance our understanding of traits and interactions among individual organisms, population dynamics, and ecology in general. We explore the possibilities offered by secondary data and describe their main types and sources. An overview of research in this field provides a synthesis of the results already achieved using different approaches to data extraction. Finally, we discuss challenges to the widespread use of secondary data, such as biases, licensing issues, metadata standards, and a lack of awareness of this trove due to missing common terminology, as well as possible solutions to overcome these barriers. Although the exploration of secondary data is only emerging, the many opportunities identified show how they can enrich biodiversity monitoring.

Language: English

Cited

11

Emerging technologies for fast determination of nutritional quality and safety of insects for food and feed: A review

Frank Ssemakula, Sarah Nawoya, Catherine Nkirote Kunyanga et al.

Published: Jan. 1, 2025

Language: English

Cited

1

Harnessing Artificial Intelligence, Machine Learning and Deep Learning for Sustainable Forestry Management and Conservation: Transformative Potential and Future Perspectives

T. J. Wang, Yiping Zuo, Teja Manda et al.

Plants, Journal year: 2025, Issue 14(7), pp. 998 - 998

Published: March 22, 2025

Plants serve as the basis for ecosystems and provide a wide range of essential ecological, environmental, and economic benefits. However, forest plants and other systems are constantly threatened by degradation and extinction, mainly due to misuse and exhaustion. Therefore, sustainable forest management (SFM) is paramount, especially in the wake of global climate change challenges. SFM ensures the continued provision of forests for both present and future generations. In practice, SFM faces challenges in balancing the use and conservation of forests. This review discusses the transformative potential of artificial intelligence (AI), machine learning (ML) and deep learning (DL) technologies in forest management. It summarizes current research and technological improvements implemented using AI, discussing applications such as predictive analytics and modeling techniques that enable accurate forecasting of forest dynamics, carbon sequestration, species distribution, and ecosystem conditions. Additionally, it explores how AI-powered decision support systems facilitate adaptive management strategies by integrating real-time data in the form of images or videos. The manuscript also highlights the limitations of AI, ML and DL in addressing forest management challenges, providing acceptable solutions to these problems. It concludes with perspectives on the immense potential of these technologies for modernizing SFM. Nonetheless, a great deal of work has already shed much light on this topic, and this review bridges the remaining knowledge gap.

Language: English

Cited

1

YOLO object detection models can locate and classify broad groups of flower-visiting arthropods in images
Thomas Stark, Valentin Ştefan, Michael Wurm et al.

Scientific Reports, Journal year: 2023, Issue 13(1)

Published: Sep. 29, 2023

Development of image recognition AI algorithms for flower-visiting arthropods has the potential to revolutionize the way we monitor pollinators. Ecologists need light-weight models that can be deployed in a field setting and can classify arthropods with high accuracy. We tested the performance of three deep learning models, YOLOv5nano, YOLOv5small, and YOLOv7tiny, at object classification in real time on eight groups of arthropods using open-source image data. These data contained four orders of insects that are known to perform the majority of pollination services in Europe (Hymenoptera, Diptera, Coleoptera, Lepidoptera) as well as other arthropod groups seen on flowers but not typically considered pollinators (e.g., spiders-Araneae). All models had high accuracy, ranging from 93 to 97%. Intersection over union (IoU) performance depended on the relative area of the bounding box: the models performed best when a single arthropod comprised a large portion of the image and worst when multiple small arthropods appeared together in one image. The models could accurately distinguish flies in the family Syrphidae from the Hymenoptera that they mimic. These results reveal the capability of existing YOLO models to contribute to pollinator monitoring.
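The IoU effect noted above is easy to demonstrate numerically. The sketch below is purely illustrative (it shows why IoU is area-sensitive, not how these authors computed their scores): the same 5-pixel localization error costs far more IoU on a small box than on a large one.

```python
def iou(a, b):
    """Intersection over union of axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

# Identical 5-pixel offset between prediction and ground truth:
large = iou((0, 0, 200, 200), (5, 5, 205, 205))   # big arthropod in frame
small = iou((0, 0, 20, 20), (5, 5, 25, 25))       # small arthropod in frame
print(f"large box IoU: {large:.2f}, small box IoU: {small:.2f}")
```

The large box keeps an IoU near 0.91 while the small one drops below 0.40, mirroring the reported pattern of weaker localization for multiple small arthropods.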

Language: English

Cited

19

Towards global insect biomonitoring with frugal methods
Mikkel Brydegaard, Ronniel Pedales, Vivian Feng et al.

Philosophical Transactions of the Royal Society B Biological Sciences, Journal year: 2024, Issue 379(1904)

Published: May 5, 2024

None of the global targets for protecting nature are currently met, although humanity is critically dependent on biodiversity. A significant issue is the lack of data for the most biodiverse regions of the planet, where the use of frugal methods for biomonitoring would be particularly important because the available funding for monitoring is insufficient, especially in low-income countries. We here discuss how three approaches to insect biomonitoring (computer vision, lidar, and DNA sequences) could be made more frugal and urge that all techniques should be evaluated for their suitability before becoming the default in high-income countries. This requires that techniques popular in high-income countries undergo a phase of 'innovation through simplification' before they are implemented broadly. We predict that techniques which acquire raw data at low cost and are suitable for analysis with AI (e.g. images, lidar-signals) will be particularly important for biomonitoring, while techniques that rely heavily on patented technologies may be less promising (e.g. DNA sequences). We conclude this opinion piece by pointing out that widespread adoption will require a strategy for providing the necessary computational resources and training. This article is part of the theme issue 'Towards a toolkit for global insect biodiversity monitoring'.

Language: English

Cited

8

Object Detection of Small Insects in Time-Lapse Camera Recordings
Kim Bjerge, Carsten Eie Frigaard, Henrik Karstoft et al.

Sensors, Journal year: 2023, Issue 23(16), pp. 7242 - 7242

Published: Aug. 18, 2023

As pollinators, insects play a crucial role in ecosystem management and world food production. However, insect populations are declining, necessitating efficient monitoring methods. Existing methods analyze video or time-lapse images of insects in nature, but the analysis is challenging because insects are small objects in complex and dynamic natural vegetation scenes. In this work, we provide a dataset of primarily honeybees visiting three different plant species during two months of the summer. The dataset consists of 107,387 annotated time-lapse images from multiple cameras, including 9423 annotated insects. We present a method for detecting insects in time-lapse RGB images as a two-step process. Firstly, the images are preprocessed to enhance the insects they contain; this motion-informed enhancement technique uses the motion and colors of insects to make them more salient. Secondly, the enhanced images are fed into a convolutional neural network (CNN) object detector. Motion-informed enhancement improves the deep learning object detectors You Only Look Once (YOLO) and faster region-based CNN (Faster R-CNN). With the enhancement, the YOLO detector improves its average micro F1-score from 0.49 to 0.71, and Faster R-CNN improves its average micro F1-score from 0.32 to 0.56. Our proposed method is a step forward in automating the camera monitoring of flying insects.
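The general idea behind motion-informed enhancement can be sketched with frame differencing. This is a hedged illustration of the concept only (the paper's actual pipeline may combine motion and color cues differently): pixels that changed between consecutive frames are amplified before the image reaches the detector.

```python
import numpy as np

def enhance(prev_frame, frame, thresh=25, gain=1.5):
    """Boost moving pixels in an HxWx3 uint8 frame using frame differencing."""
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    motion = diff.max(axis=2) > thresh          # per-pixel motion mask
    out = frame.astype(np.float32)
    out[motion] *= gain                          # amplify moving regions only
    return np.clip(out, 0, 255).astype(np.uint8)

rng = np.random.default_rng(1)
prev_frame = rng.integers(0, 40, size=(64, 64, 3), dtype=np.uint8)  # dark background
frame = prev_frame.copy()
frame[30:34, 30:34] = 160                        # a small bright "insect" moved in
enhanced = enhance(prev_frame, frame)
print(enhanced[31, 31, 0], int(frame[31, 31, 0]))  # moving pixel brightened, source unchanged
```

Static background pixels pass through untouched, so the enhancement increases the contrast of small moving insects against complex vegetation without altering the rest of the scene.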

Language: English

Cited

16

A deep learning pipeline for time-lapse camera monitoring of insects and their floral environments
Kim Bjerge, Henrik Karstoft, Hjalte M. R. Mann et al.

Ecological Informatics, Journal year: 2024, Issue unknown, pp. 102861 - 102861

Published: Oct. 1, 2024

Language: English

Cited

6