Deep learning-based image classification of sea turtles using object detection and instance segmentation models

Jong-Won Baek, Jung-Il Kim, Chang-Bae Kim

et al.

PLoS ONE, Journal Year: 2024, Volume and Issue: 19(11), P. e0313323 - e0313323

Published: Nov. 25, 2024

Sea turtles exhibit high migratory rates and occupy a broad range of habitats, which in turn makes monitoring these taxa challenging. Applying deep learning (DL) models to vast image datasets collected from citizen science programs can offer promising solutions to overcome the challenge of monitoring wildlife across wide habitats, particularly sea turtles. Among DL models, object detection models, such as the You Only Look Once (YOLO) series, have been extensively employed for wildlife classification. Despite their successful application in this domain, detecting objects in images with complex backgrounds, including underwater environments, remains a significant challenge. Recently, instance segmentation models have been developed to address this issue by providing more accurate classification than traditional object detection models. This study compared the performance of two state-of-the-art methods, namely an object detection model (YOLOv5) and an instance segmentation model (YOLOv5-seg), to detect and classify sea turtles. The images were collected from iNaturalist and Google and then divided into 64% training, 16% validation, and 20% test sets. Model performance during and after training was evaluated by loss functions and various evaluation indexes, respectively. Based on the loss functions, YOLOv5-seg demonstrated a lower error rate in classifying sea turtles than YOLOv5. According to the mean Average Precision (mAP) values, which reflect precision and recall, YOLOv5-seg showed superior performance: mAP0.5 and mAP0.5:0.95 for YOLOv5 were 0.885 and 0.795, respectively, whereas for YOLOv5-seg the values were 0.918 and 0.831. Based on these results, the improved classification achieved by the instance segmentation model may help improve sea turtle monitoring in the future.
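The 64%/16%/20% train/validation/test split described above can be sketched in a few lines of Python; the filenames and seed below are illustrative, not from the study:

```python
import random

def split_dataset(items, train=0.64, val=0.16, seed=42):
    # Shuffle and partition into 64% train, 16% validation, 20% test,
    # mirroring the proportions reported in the abstract.
    rng = random.Random(seed)
    items = items[:]
    rng.shuffle(items)
    n_train = int(len(items) * train)
    n_val = int(len(items) * val)
    return (items[:n_train],
            items[n_train:n_train + n_val],
            items[n_train + n_val:])

# Hypothetical filenames standing in for the iNaturalist/Google images.
images = [f"turtle_{i:04d}.jpg" for i in range(1000)]
train_set, val_set, test_set = split_dataset(images)
print(len(train_set), len(val_set), len(test_set))  # 640 160 200
```

With a fixed seed the split is reproducible, which matters when comparing two models (here YOLOv5 vs. YOLOv5-seg) on identical test images.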

Language: English

Utilising affordable smartphones and open-source time-lapse photography for pollinator image collection and annotation
Valentin Ştefan, Aspen Workman, Jared C. Cobain

et al.

Journal of Pollination Ecology, Journal Year: 2025, Volume and Issue: 37, P. 1 - 21

Published: Jan. 10, 2025

Monitoring plant-pollinator interactions is crucial for understanding the factors influencing these relationships across space and time. Traditional methods in pollination ecology are resource-intensive, while time-lapse photography offers potential as a non-destructive and automated complementary technique. However, accurate identification of pollinators at finer taxonomic levels (i.e., genus or species) requires sufficiently high image quality. This study assessed the feasibility of using a smartphone setup to capture images of arthropods visiting flowers and evaluated whether the images offered sufficient resolution for arthropod identification by taxonomists. Smartphones were positioned above target flowers from various plant species in urban green areas around Leipzig and Halle, Germany. We present the proportions of identifications (instances) at different taxonomic levels (order, family, genus, species) based on visible features as interpreted by taxonomists, and document limitations stemming from the setup (e.g., fixed camera positioning preventing distinguishing features from being captured despite sufficient resolution) and from low image quality. Recommendations are provided to address these challenges. Our results indicate that 89.81% of all Hymenoptera instances were identified at the family level, 84.56% of pollinator instances at the genus level, but only 25.35% at the species level. Taxonomists were less able to identify Dipterans at finer taxonomic levels, with nearly 50% not identifiable at the family level and only 26.18% and 15.19% identifiable at the genus and species levels, respectively. This was due to their small size and the more challenging features needed for identification (e.g., wing veins). Advancing smartphone technology, along with its accessibility, affordability, and user-friendliness, makes it a promising option for coarse-level pollinator monitoring.

Language: English

Citations

2

Insect detect: An open-source DIY camera trap for automated insect monitoring
Maximilian Sittinger, Johannes Uhler, M. A. Pink

et al.

PLoS ONE, Journal Year: 2024, Volume and Issue: 19(4), P. e0295474 - e0295474

Published: April 3, 2024

Insect monitoring is essential to design effective conservation strategies, which are indispensable to mitigate worldwide insect declines and biodiversity loss. For this purpose, traditional monitoring methods are widely established and can provide data with a high taxonomic resolution. However, processing of the captured insect samples is often time-consuming and expensive, which limits the number of potential replicates. Automated monitoring methods facilitate data collection at a higher spatiotemporal resolution with comparatively lower effort and cost. Here, we present the Insect Detect DIY (do-it-yourself) camera trap for non-invasive automated monitoring of flower-visiting insects, based on low-cost off-the-shelf hardware components combined with open-source software. Custom trained deep learning models detect and track insects landing on an artificial flower platform in real time on-device, and the cropped detections are subsequently classified on a local computer. Field deployment of the solar-powered camera trap confirmed its resistance to high temperatures and humidity, which enables autonomous deployment during a whole season. The on-device detection and tracking can estimate insect activity/abundance after metadata post-processing. Our insect classification model achieved a high top-1 accuracy on the test dataset and generalized well to real-world images. The camera trap design and software are highly customizable and can be adapted to different use cases. With custom trained detection and classification models, as well as accessible programming, many possible applications surpassing our proposed method can be realized.
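The abstract's "activity/abundance after metadata post-processing" step can be illustrated with a minimal sketch: counting unique track IDs per time window from the tracking metadata. The data layout and function name are assumptions, not the tool's actual API:

```python
from collections import defaultdict

def activity_per_hour(detections):
    # detections: (hour, track_id) pairs emitted by on-device tracking.
    # Counting unique track IDs per hour gives a simple activity/abundance
    # estimate without double counting repeated detections of one insect.
    seen = defaultdict(set)
    for hour, track_id in detections:
        seen[hour].add(track_id)
    return {hour: len(ids) for hour, ids in seen.items()}

# Hypothetical metadata: insect with track ID 1 detected twice in hour 10.
dets = [(10, 1), (10, 1), (10, 2), (11, 3)]
print(activity_per_hour(dets))  # {10: 2, 11: 1}
```

Deduplicating by track ID rather than raw detection count is what makes tracking useful for abundance estimates: one insect lingering on the platform contributes once per window.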

Language: English

Citations

12

Emerging technologies for pollinator monitoring
Toke T. Høye, Matteo Montagna, Bas Oteman

et al.

Current Opinion in Insect Science, Journal Year: 2025, Volume and Issue: unknown, P. 101367 - 101367

Published: March 1, 2025

Language: English

Citations

1

A deep learning pipeline for time-lapse camera monitoring of insects and their floral environments
Kim Bjerge, Henrik Karstoft, Hjalte M. R. Mann

et al.

Ecological Informatics, Journal Year: 2024, Volume and Issue: unknown, P. 102861 - 102861

Published: Oct. 1, 2024

Language: English

Citations

5

Efficient wildlife monitoring: Deep learning-based detection and counting of green turtles in coastal areas
Naoya Noguchi, Hideaki Nishizawa, Taro Shimizu

et al.

Ecological Informatics, Journal Year: 2025, Volume and Issue: unknown, P. 103009 - 103009

Published: Jan. 1, 2025

Language: English

Citations

0

In-field monitoring of ground-nesting insect aggregations using a scaleable multi-camera system
Daniela Calvus, Karoline Wueppenhorst, R.E. Schlosser

et al.

Ecological Informatics, Journal Year: 2025, Volume and Issue: unknown, P. 103004 - 103004

Published: Jan. 1, 2025

Language: English

Citations

0

Hierarchical image classification using transfer learning to improve deep learning model performance for Amazon parrots
Jung-Il Kim, Jong-Won Baek, Chang-Bae Kim

et al.

Scientific Reports, Journal Year: 2025, Volume and Issue: 15(1)

Published: Jan. 30, 2025

Numerous studies have proven the potential of deep learning models for classifying wildlife. Such models can reduce the workload of experts by automating species classification to monitor wild populations and global trade. Although deep learning models typically perform better with more input data, the available wildlife data are ordinarily limited, specifically for rare or endangered species. Recently, citizen science programs have helped accumulate valuable wildlife data, but such data are still not enough to achieve the best model performance compared to benchmark datasets. Recent studies have applied hierarchical classification to a given dataset to improve model accuracy. This study applied hierarchical image classification using transfer learning to classify Amazon parrot species. Specifically, the hierarchy was built based on diagnostic morphological features. Upon evaluating model performance, the hierarchical model outperformed the non-hierarchical model in detecting and classifying Amazon parrots. Notably, the hierarchical model achieved a mean Average Precision (mAP) of 0.944, surpassing the mAP of 0.908 achieved by the non-hierarchical model. Moreover, the hierarchical model improved classification accuracy between morphologically similar species. The outcomes of this study may facilitate monitoring the wild populations and global trade of Amazon parrots for conservation purposes.
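A top-down hierarchical classifier of the kind described above can be sketched as: pick the most probable coarse group first, then the most probable species within that group. The group names, species, and probabilities below are hypothetical placeholders, not the paper's actual hierarchy:

```python
def hierarchical_predict(coarse_probs, fine_probs_by_group):
    # Step 1: choose the most probable coarse (morphological) group.
    group = max(coarse_probs, key=coarse_probs.get)
    # Step 2: choose the most probable species within that group only,
    # so morphologically dissimilar species never compete directly.
    fine = fine_probs_by_group[group]
    species = max(fine, key=fine.get)
    return group, species

# Hypothetical two-level hierarchy for illustration.
coarse = {"group-A": 0.7, "group-B": 0.3}
fine = {
    "group-A": {"Amazona aestiva": 0.6, "Amazona amazonica": 0.4},
    "group-B": {"Amazona ochrocephala": 1.0},
}
print(hierarchical_predict(coarse, fine))  # ('group-A', 'Amazona aestiva')
```

Restricting the fine-grained decision to one branch is what helps with morphologically similar species: the model only needs to discriminate within a small, visually coherent group.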

Language: English

Citations

0

Applying Fourier Neural Operator to insect wingbeat sound classification: Introducing CF-ResNet-1D
Béla J. Szekeres, Natabara Máté Gyöngyössy, János Botzheim

et al.

Ecological Informatics, Journal Year: 2025, Volume and Issue: 86, P. 103055 - 103055

Published: Feb. 3, 2025

Language: English

Citations

0

Semi-Supervised Hierarchical Multi-Label Classifier Based on Local Information
Jonathan Serrano-Pérez, L. Enrique Sucar

International Journal of Approximate Reasoning, Journal Year: 2025, Volume and Issue: unknown, P. 109411 - 109411

Published: March 1, 2025

Language: English

Citations

0

A deep learning pipeline for time-lapse camera monitoring of floral environments and insect populations
Kim Bjerge, Henrik Karstoft, Hjalte M. R. Mann

et al.

bioRxiv (Cold Spring Harbor Laboratory), Journal Year: 2024, Volume and Issue: unknown

Published: April 15, 2024

Arthropods, including insects, represent the most diverse animal group and contribute significantly to animal biomass. Automatic monitoring of insects and other arthropods enables quick and efficient observation and management of ecologically and economically important targets such as pollinators, natural enemies, disease vectors, and agricultural pests. The integration of cameras and computer vision facilitates innovative monitoring approaches for agriculture, ecology, entomology, evolution, and biodiversity. However, studying insects and their interactions with flowers and vegetation in natural environments remains challenging, even with automated camera monitoring. This paper presents a comprehensive methodology to monitor the abundance and diversity of wild insects and to quantify floral cover as a key resource. We apply the methods across more than 10 million images recorded over two years using 48 insect camera traps placed in three main habitat types. We quantify arthropods, and their visits, on a specific mix of Sedum plant species with white, yellow, and red/pink colored flowers. The proposed deep-learning pipeline estimates flower cover and detects and classifies arthropod taxa from the time-lapse recordings. The flower cover serves as an estimate to correlate insect activity with flowering plants. Color and semantic segmentation with DeepLabv3 are combined to estimate the percent cover of flowers of different colors. Arthropod detection incorporates motion-informed enhanced object detection with You-Only-Look-Once (YOLO), followed by filtering of stationary objects to minimize double counting of non-moving animals and erroneous background detections. This approach has been demonstrated to decrease the incidence of false positives, since they occur in less than 3% of the captured images. The final step involves grouping the detected arthropods into 19 taxonomic classes. Seven state-of-the-art models were trained and validated, achieving F1-scores ranging from 0.81 to 0.89 in the classification of arthropods. Among these, the selected model, EfficientNetB4, achieved an 80% average precision on randomly selected samples when applied to the complete pipeline, which includes detection, filtering, and classification of images collected in 2021. As expected, during the beginning and end of the season, reduced flower cover correlates with a noticeable drop in insect detections. The method offers a cost-effective approach to monitoring insects and their floral environments.
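The stationary-object filtering step described above can be sketched with plain intersection-over-union (IoU): a current detection that overlaps almost perfectly with a detection from the previous frame is likely a non-moving animal or a background error and is dropped. This is a minimal sketch of the idea, not the paper's actual implementation, and the 0.9 threshold is an assumption:

```python
def iou(a, b):
    # Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2).
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def filter_stationary(prev_boxes, cur_boxes, thr=0.9):
    # Drop current detections that sit (almost) exactly where a detection
    # was in the previous frame: likely stationary objects, which would
    # otherwise be double counted across time-lapse frames.
    return [b for b in cur_boxes
            if all(iou(b, p) < thr for p in prev_boxes)]

prev = [(0, 0, 10, 10)]                      # detection in previous frame
cur = [(0, 0, 10, 10), (50, 50, 60, 60)]     # one repeat, one new arrival
print(filter_stationary(prev, cur))  # [(50, 50, 60, 60)]
```

Keeping only moving detections is what drives the reported reduction in false positives: static background clutter produces identical boxes frame after frame, while real arthropods shift position.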

Language: English

Citations

3