Rough Grouping Enhances YOLO’s Pollinator Classification and Detection from Small Datasets
Suyeon Kim

bioRxiv (Cold Spring Harbor Laboratory), Journal Year: 2024, Volume and Issue: unknown

Published: Aug. 26, 2024

ABSTRACT This study addresses the challenges in pollinator monitoring by proposing an effective data structure for automated systems, with a focus on the use of machine learning to handle underrepresented groups and small datasets. By experimenting with grouping the top three pollinators (bee, butterfly, hoverfly) and non-pollinators in datasets of fewer than 300 samples, the research aims to enhance classification and detection accuracy. During 4-hour filming sessions, 181 images of insects larger than 1 cm were captured and classified according to three grouping methods: “Pollinator/Non-pollinator”, “Bee/Butterfly/Hoverfly/Ant”, and “Bumblebee/Honeybee/Butterfly/Hoverfly/Ant”. YOLOv8 models were trained, validated, and tested on these data using the different class groupings. The study found that the “Pollinator/Non-pollinator” YOLOv8 model performed best across all metrics, suggesting it is more reliable at categorizing and detecting target objects, especially for smaller, imbalanced datasets. This finding aligns with the trend that providing more training samples per class improves performance. Therefore, using broader categorization methods can improve the reliability and accuracy of automated monitoring systems when data are insufficient.
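
Since the abstract describes a concrete training protocol (the same detector retrained under three label groupings), a brief sketch may help. The following uses the Ultralytics YOLOv8 API; the dataset YAML file names and class lists are hypothetical stand-ins for the three grouping schemes, not the author's actual files.

```python
# Hypothetical sketch: train one YOLOv8 model per class-grouping scheme
# and compare validation metrics. Dataset YAML paths are placeholders.
from ultralytics import YOLO

GROUPINGS = {
    "binary": "pollinator_vs_non.yaml",          # Pollinator / Non-pollinator
    "coarse": "bee_butterfly_hoverfly_ant.yaml", # 4 classes
    "fine": "bumblebee_honeybee_etc.yaml",       # 5 classes
}

results = {}
for name, data_yaml in GROUPINGS.items():
    model = YOLO("yolov8n.pt")                   # pretrained nano backbone
    model.train(data=data_yaml, epochs=100, imgsz=640)
    metrics = model.val()                        # evaluate on the val split
    results[name] = metrics.box.map50            # mAP@0.5 per scheme

print(results)  # per the study, the binary scheme scored highest on ~300 images
```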

Language: English

Hierarchical classification of insects with multitask learning and anomaly detection
Kim Bjerge, Quentin Geissmann, Jamie Alison et al.

Ecological Informatics, Journal Year: 2023, Volume and Issue: 77, P. 102278 - 102278

Published: Aug. 28, 2023

Cameras and computer vision are revolutionising the study of insects, creating new research opportunities within agriculture, epidemiology, evolution, ecology and biodiversity monitoring. However, the diversity of insects and the close resemblance of many species pose a major challenge for image-based species-level classification. Here, we present an algorithm to hierarchically classify insects from images, leveraging a simple taxonomy to (1) classify specimens across multiple taxonomic ranks simultaneously and (2) identify the lowest rank at which a reliable classification can be reached. Specifically, we propose multitask learning with a loss function incorporating class dependency at each rank, and anomaly detection based on outlier analysis to quantify uncertainty. First, we compile a dataset of 41,731 insect images, combining time-lapse images of floral scenes with images from the Global Biodiversity Information Facility (GBIF). Second, we adapt state-of-the-art convolutional neural networks, ResNet and EfficientNet, for hierarchical classification of insects belonging to three orders, five families and nine species. Third, we assess model generalization on 11 species unseen by the trained models. The model is used to predict the higher ranks of species that were not in the training set. We found that incorporating taxonomy into our models increased classification accuracy across ranks. As expected, the models correctly classified insects at higher ranks while remaining uncertain at lower ranks. Anomaly detection effectively flagged novel taxa that were visually distinct from the training data, although novel taxa were consistently mistaken for visually similar classes. Above all, we have demonstrated a practical approach to quantifying uncertainty during automated in situ monitoring of live insects. Our method is versatile, forming a valuable step towards high-level automated insect monitoring.
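
The multitask idea (one classification head per taxonomic rank, trained jointly over a shared backbone) can be sketched compactly. The snippet below is a simplified reading of the approach, not the authors' code: the class-dependency term of their loss is reduced to plain per-rank weights, and the class counts follow the three-orders/five-families/nine-species setup described above.

```python
# Minimal multitask sketch: one head per taxonomic rank over a shared
# backbone, trained with a summed cross-entropy loss. The class-dependency
# term from the paper is simplified to per-rank weights here.
import torch.nn as nn
from torchvision.models import resnet50

class HierarchicalClassifier(nn.Module):
    def __init__(self, n_orders=3, n_families=5, n_species=9):
        super().__init__()
        backbone = resnet50(weights="IMAGENET1K_V2")
        backbone.fc = nn.Identity()          # expose 2048-d features
        self.backbone = backbone
        self.order_head = nn.Linear(2048, n_orders)
        self.family_head = nn.Linear(2048, n_families)
        self.species_head = nn.Linear(2048, n_species)

    def forward(self, x):
        f = self.backbone(x)
        return self.order_head(f), self.family_head(f), self.species_head(f)

def multitask_loss(logits, targets, weights=(1.0, 1.0, 1.0)):
    # logits/targets are (order, family, species) tuples
    ce = nn.CrossEntropyLoss()
    return sum(w * ce(l, t) for w, l, t in zip(weights, logits, targets))
```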

Language: English

Citations: 27

Insect detect: An open-source DIY camera trap for automated insect monitoring
Maximilian Sittinger, Johannes Uhler, M. A. Pink et al.

PLoS ONE, Journal Year: 2024, Volume and Issue: 19(4), P. e0295474 - e0295474

Published: April 3, 2024

Insect monitoring is essential to design effective conservation strategies, which are indispensable to mitigate worldwide insect declines and biodiversity loss. For this purpose, traditional monitoring methods are widely established and can provide data with a high taxonomic resolution. However, the processing of captured insect samples is often time-consuming and expensive, which limits the number of potential replicates. Automated monitoring methods facilitate data collection at a higher spatiotemporal resolution with comparatively lower effort and cost. Here, we present the Insect Detect DIY (do-it-yourself) camera trap for non-invasive automated monitoring of flower-visiting insects, based on low-cost off-the-shelf hardware components combined with open-source software. Custom trained deep learning models detect and track insects landing on an artificial flower platform in real time on-device, and the cropped detections are subsequently classified on a local computer. Field deployment of the solar-powered camera trap confirmed its resistance to high temperatures and humidity, which enables autonomous deployment during a whole season. On-device detection and tracking can estimate insect activity/abundance after metadata post-processing. Our insect classification model achieved a high top-1 accuracy on the test dataset and generalized well to real-world images. The software is highly customizable and can be adapted to different use cases. With custom trained detection and classification models, as well as accessible programming, many possible applications surpassing our proposed method can be realized.
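
The trap's two-stage design (detect and track on-device, classify cropped detections afterwards on a local computer) can be illustrated generically. The sketch below covers only the second stage and is an assumption-laden stand-in: the folder layout, model choice and class count are placeholders, not the project's actual code.

```python
# Hypothetical second-stage sketch: classify cropped insect detections
# saved by a camera trap. Folder layout and model are placeholders.
from pathlib import Path
import torch
from PIL import Image
from torchvision import transforms
from torchvision.models import efficientnet_b0

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

model = efficientnet_b0(num_classes=5)  # e.g. 5 insect classes; untrained here
model.eval()

with torch.no_grad():
    for crop_path in Path("crops/").glob("*.jpg"):
        x = preprocess(Image.open(crop_path).convert("RGB")).unsqueeze(0)
        pred = model(x).softmax(dim=1).argmax(dim=1).item()
        print(crop_path.name, pred)  # predicted class index per crop
```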

Language: English

Citations: 12

Towards a standardized framework for AI-assisted, image-based monitoring of nocturnal insects
David B. Roy, Jamie Alison, Tom August et al.

Philosophical Transactions of the Royal Society B: Biological Sciences, Journal Year: 2024, Volume and Issue: 379(1904)

Published: May 5, 2024

Automated sensors have the potential to standardize and expand the monitoring of insects across the globe. Focusing on one of the most scalable and fastest developing sensor technologies, we describe a framework for automated, image-based monitoring of nocturnal insects, from sensor development and field deployment to workflows for data processing and publishing. Sensors comprise a light to attract insects, a camera for collecting images, and a computer for scheduling, data storage and processing. Metadata is important for sampling schedules that balance the capture of relevant ecological information against power limitations. Large volumes of images from automated systems necessitate effective data processing. We describe computer vision approaches for detection, tracking and classification of insects, including models built from existing aggregations of labelled insect images. Data processing must also account for inherent biases; we advocate models that explicitly correct for bias in species occurrence or abundance estimates resulting from imperfect detection of individuals present during sampling occasions. We propose ten priorities towards a step-change in automated insect monitoring, a vital task in the face of rapid biodiversity loss and global threats. This article is part of the theme issue ‘Towards a toolkit for global insect biodiversity monitoring’.
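
The advocated correction for imperfect detection can be stated compactly. The following is the standard textbook form, not an equation taken from the paper: with observed count C and estimated per-individual detection probability p̂, a corrected abundance estimate and its approximate variance are

```latex
% Simple detection correction (standard form, not from the paper):
% C = observed count, \hat{p} = estimated detection probability.
\hat{N} = \frac{C}{\hat{p}},
\qquad
\operatorname{Var}\!\bigl(\hat{N}\bigr) \approx \frac{C\,(1-\hat{p})}{\hat{p}^{2}}
```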

Language: English

Citations: 12

Utilising affordable smartphones and open-source time-lapse photography for pollinator image collection and annotation
Valentin Ştefan, Aspen Workman, Jared C. Cobain et al.

Journal of Pollination Ecology, Journal Year: 2025, Volume and Issue: 37, P. 1 - 21

Published: Jan. 10, 2025

Monitoring plant-pollinator interactions is crucial for understanding the factors influencing these relationships across space and time. Traditional methods in pollination ecology are resource-intensive, while time-lapse photography offers potential as a non-destructive, automated complementary technique. However, accurate identification of pollinators at finer taxonomic levels (i.e., genus or species) requires sufficiently high image quality. This study assessed the feasibility of using a smartphone setup to capture images of arthropods visiting flowers and evaluated whether the images offered sufficient resolution for arthropod identification by taxonomists. Smartphones were positioned above target flowers from various plant species in urban green areas around Leipzig and Halle, Germany. We present the proportions of identifications (instances) at different taxonomic levels (order, family, genus, species) based on visible features as interpreted by taxonomists, and we document limitations that stem from the setup (e.g., fixed camera positioning preventing the capture of distinguishing features despite good resolution) or from low image quality. Recommendations are provided to address these challenges. Our results indicate that 89.81% of all Hymenoptera instances were identified to family level (84.56% for pollinators), but only 25.35% to genus level. Taxonomists were less able to identify Dipterans at finer taxonomic levels, with nearly 50% of instances not identifiable to family, and only 26.18% and 15.19% identified at family and genus levels, respectively. This was due to their smaller size and the more challenging features needed for identification (e.g., wing veins). Advancing smartphone technology, along with its accessibility, affordability and user-friendliness, makes this a promising option for coarse-level pollinator monitoring.

Language: English

Citations: 2

A deep learning pipeline for time-lapse camera monitoring of insects and their floral environments
Kim Bjerge, Henrik Karstoft, Hjalte M. R. Mann et al.

Ecological Informatics, Journal Year: 2024, Volume and Issue: unknown, P. 102861 - 102861

Published: Oct. 1, 2024

Language: English

Citations: 6

Detecting common coccinellids found in sorghum using deep learning models
Chaoxin Wang, Ivan Grijalva, Doina Caragea et al.

Scientific Reports, Journal Year: 2023, Volume and Issue: 13(1)

Published: June 16, 2023

Increased global production of sorghum has the potential to meet many of the demands of a growing human population. Developing automation technologies for field scouting is crucial for long-term and low-cost production. Since 2013, the sugarcane aphid (SCA) Melanaphis sacchari (Zehntner) has become an important economic pest causing significant yield loss across the sorghum-producing region in the United States. Adequate management of SCA depends on costly field scouting to determine the presence of the pest and whether threshold levels have been reached to spray insecticides. However, given the impact of insecticides on natural enemies, there is an urgent need to develop automated detection of natural enemies to support their conservation. Natural enemies play a crucial role in regulating SCA populations. These insects, primarily coccinellids, prey on SCA and help reduce unnecessary insecticide applications. Although these insects help regulate SCA populations, their detection and classification are time-consuming and inefficient in lower-value crops like sorghum during field scouting. Advanced deep learning software provides a means to perform laborious automatic agricultural tasks, including the detection and classification of insects. However, deep learning models for coccinellids have not been developed. Therefore, our objective was to train machine learning models to detect coccinellids commonly found in sorghum and classify them at the genus, species, and subfamily levels. We trained a two-stage object detection model, specifically Faster Region-based Convolutional Neural Network (Faster R-CNN) with a Feature Pyramid Network (FPN), and also one-stage models of the YOLO (You Only Look Once) family (YOLOv5 and YOLOv7), to detect seven coccinellids commonly found in sorghum (i.e., Coccinella septempunctata, Coleomegilla maculata, Cycloneda sanguinea, Harmonia axyridis, Hippodamia convergens, Olla v-nigrum, Scymninae). We used images extracted from the iNaturalist project for training and evaluation of the Faster R-CNN-FPN, YOLOv5 and YOLOv7 models. iNaturalist is an imagery web server used to publish citizens' observations pertaining to living organisms. Experimental evaluation using standard object detection metrics, such as average precision (AP), AP@0.50, etc., has shown that the best model performs well for coccinellid detection, with AP@0.50 as high as 97.3 and AP as high as 74.6. Our research contributes to automated detection for integrated pest management, making it easier to detect natural enemies in sorghum.
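
The AP@0.50 metric cited above counts a detection as a true positive when its intersection-over-union (IoU) with a ground-truth box reaches 0.5. A self-contained sketch of that matching criterion (an illustrative helper, not the authors' evaluation code):

```python
# Sketch of the IoU >= 0.5 criterion behind AP@0.50.
# Boxes are (x1, y1, x2, y2) in pixel coordinates.
def iou(a, b):
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def is_true_positive(pred_box, gt_box, thresh=0.5):
    return iou(pred_box, gt_box) >= thresh

# Example: a detection overlapping a ladybird ground-truth box (IoU ~0.83)
print(is_true_positive((10, 10, 50, 50), (12, 8, 48, 52)))  # True
```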

Language: English

Citations: 13

Precision Corn Pest Detection: Two-Step Transfer Learning for Beetles (Coleoptera) with MobileNet-SSD
E. Maican, Adrian Iosif, Sanda Maican et al.

Agriculture, Journal Year: 2023, Volume and Issue: 13(12), P. 2287 - 2287

Published: Dec. 18, 2023

Using neural networks on low-power mobile systems can aid in controlling pests while preserving beneficial species for crops. However, such devices require simplified networks, which may lead to reduced performance. This study focused on developing an optimized deep-learning model for detecting corn pests. We propose a two-step transfer learning approach to enhance the accuracy of two versions of the MobileNet SSD network. Five beetle species (Coleoptera), including four harmful to corn crops (belonging to the genera Anoxia, Diabrotica, Opatrum and Zabrus) and one beneficial (Coccinella sp.), were selected for preliminary testing. We employed two datasets. The one used in the first transfer learning procedure comprises 2605 images with the general dataset classes ‘Beetle’ and ‘Ladybug’; it was used to recalibrate the networks’ trainable parameters for these two broader classes. Furthermore, the models were retrained on a second dataset of 2648 images of the five selected species. Performance was compared with the baseline in terms of average precision per class and mean average precision (mAP). MobileNet-SSD-v2-Lite achieved an mAP of 0.8923, ranking second but close to the highest value (0.908), obtained by MobileNet-SSD-v1, and outperforming its baseline by 6.06%. It demonstrated the highest average precision for Opatrum (0.9514) and Diabrotica (0.8066). For Anoxia, it reached a third-place average precision (0.9851), close to the top value of 0.9912, and for Zabrus it took second position (0.9053). Coccinella was reliably distinguished from all other species, with an average precision of 0.8939 and zero false positives; moreover, no pest was mistakenly identified as Coccinella. Analyzing the errors revealed good overall accuracy despite the size of the training set: 1 misclassification, 33 non-identifications, 7 double identifications and 1 false positive across the 266 test images, yielding a relative error rate of 0.1579. The findings validated the two-step transfer learning procedure, showing high potential for using such models in real-time pest control while protecting beneficial species.
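
The two-step procedure amounts to fine-tuning a pretrained detector on the two broad classes, then swapping in a new head for the five species while keeping the recalibrated backbone. The sketch below is a loose analogue built on torchvision's SSDlite/MobileNetV3 detector rather than the authors' MobileNet-SSD v1/v2-Lite models; the datasets and the training helper are placeholders.

```python
# Hypothetical sketch of two-step transfer learning with an SSD-style
# MobileNet detector. Datasets and the train() helper are placeholders.
from torchvision.models.detection import ssdlite320_mobilenet_v3_large

def train(model, dataset, epochs):
    """Placeholder for a standard torchvision detection training loop."""
    # elided: DataLoader, optimizer, loss backprop over `epochs`

broad_dataset = None    # images labelled 'Beetle' / 'Ladybug' (placeholder)
species_dataset = None  # images labelled with the 5 species (placeholder)

# Step 1: recalibrate on the two broad classes (+ background); by default
# torchvision initializes the backbone from ImageNet weights here.
broad_model = ssdlite320_mobilenet_v3_large(num_classes=3)
train(broad_model, broad_dataset, epochs=50)

# Step 2: new head for the 5 species (+ background); reuse the step-1 backbone.
species_model = ssdlite320_mobilenet_v3_large(num_classes=6)
species_model.backbone.load_state_dict(broad_model.backbone.state_dict())
train(species_model, species_dataset, epochs=50)
```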

Language: English

Citations: 11

A deep learning pipeline for time-lapse camera monitoring of floral environments and insect populations
Kim Bjerge, Henrik Karstoft, Hjalte M. R. Mann et al.

bioRxiv (Cold Spring Harbor Laboratory), Journal Year: 2024, Volume and Issue: unknown

Published: April 15, 2024

Abstract Arthropods, including insects, represent the most diverse animal group and contribute significantly to animal biomass. Automatic monitoring of insects and other arthropods enables quick and efficient observation and management of ecologically and economically important targets, such as pollinators, natural enemies, disease vectors and agricultural pests. The integration of cameras and computer vision facilitates innovative monitoring approaches for agriculture, ecology, entomology, evolution and biodiversity. However, studying insects and their interactions with flowers and vegetation in natural environments remains challenging, even with automated camera monitoring. This paper presents a comprehensive methodology to monitor the abundance and diversity of wild insects and to quantify floral cover as a key resource. We apply the methods across more than 10 million images recorded over two years using 48 insect camera traps placed in three main habitat types. We quantify the arthropods, including flower visits, on a specific mix of Sedum plant species with white, yellow and red/pink colored flowers. The proposed deep-learning pipeline estimates flower cover and detects and classifies arthropod taxa from the time-lapse recordings, serving as an estimate to correlate insect activity with flowering plants. Color and semantic segmentation with DeepLabv3 are combined to estimate the percent cover of flowers of different colors. Arthropod detection incorporates motion-informed enhanced object detection with You-Only-Look-Once (YOLO), followed by filtering of stationary objects to minimize double counting of non-moving animals and erroneous background detections. This approach has been demonstrated to decrease the incidence of false positives, which occur in less than 3% of the captured images. The final step involves grouping the detected arthropods into 19 taxonomic classes. Seven state-of-the-art models were trained and validated, achieving F1-scores ranging from 0.81 to 0.89 for the classification of arthropods. Among these, the selected model, EfficientNetB4, achieved an 80% average precision on randomly selected samples when applied with the complete pipeline, which includes detection, filtering and classification of images collected in 2021. As expected, during the beginning and end of the season, reduced flower cover correlates with a noticeable drop in insect activity. The method offers a cost-effective approach to automated monitoring of insects and their floral environments.
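
The stationary-object filter can be approximated with a simple rule: a box that re-appears at nearly the same position over many consecutive time-lapse frames is likely a non-moving object or background artefact. A minimal, assumed implementation (not the authors' pipeline code; thresholds are illustrative):

```python
# Minimal sketch of a stationary-object filter for time-lapse detections.
# frames: list of per-frame detection lists, each box as (x1, y1, x2, y2).
def iou(a, b):
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    union = (a[2]-a[0])*(a[3]-a[1]) + (b[2]-b[0])*(b[3]-b[1]) - inter
    return inter / union if union else 0.0

def filter_stationary(frames, iou_thresh=0.7, max_repeats=5):
    """Drop boxes that stay put for more than `max_repeats` frames."""
    kept, history = [], []  # history: (box, consecutive-frame count) pairs
    for boxes in frames:
        new_history, frame_kept = [], []
        for box in boxes:
            count = 1
            for prev_box, prev_count in history:
                if iou(box, prev_box) >= iou_thresh:
                    count = prev_count + 1
                    break
            new_history.append((box, count))
            if count <= max_repeats:       # still plausibly a live insect
                frame_kept.append(box)
        history = new_history
        kept.append(frame_kept)
    return kept
```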

Language: English

Citations: 3

Insect Identification in the Wild: The AMI Dataset
A. Jain, Fagner Cunha, Michael James Bunsen et al.

Lecture notes in computer science, Journal Year: 2024, Volume and Issue: unknown, P. 55 - 73

Published: Dec. 1, 2024

Language: English

Citations: 3

In-field monitoring of ground-nesting insect aggregations using a scaleable multi-camera system
Daniela Calvus, Karoline Wueppenhorst, R.E. Schlosser et al.

Ecological Informatics, Journal Year: 2025, Volume and Issue: unknown, P. 103004 - 103004

Published: Jan. 1, 2025

Language: English

Citations: 0