Discrimination of Hover Fly Species and Sexes by Wing Interference Signals DOI Creative Commons
Meng Li, Anna Runemark, Julio Hernandez

et al.

Advanced Science, Journal Year: 2023, Volume and Issue: 10(34)

Published: Oct. 17, 2023

Abstract Remote automated surveillance of insect abundance and diversity is poised to revolutionize decline studies. The study reveals that spectral analysis of thin‐film wing interference signals (WISs) can discriminate free‐flying insects beyond what can be accomplished by machine vision. Detectable by photonic sensors, WISs are robust indicators enabling species and sex identification. The first quantitative survey of wing thickness modulation, obtained through shortwave‐infrared hyperspectral imaging of 600 wings from 30 hover fly species, is presented. The fringy reflectance of a WIS can be explained by four optical parameters, including membrane thickness. Using a Naïve Bayes Classifier with five parameters that can be retrieved remotely, 91% accuracy was achieved in the identification of species and sexes. WIS‐based classification is therefore a potent tool for remote insect surveillance.
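The classification step described in the abstract can be illustrated with a minimal Gaussian Naive Bayes sketch. The two features and their values below (membrane thickness in nm, a reflectance parameter) are hypothetical stand-ins for the five remotely retrieved WIS parameters, and the species labels are invented for illustration:

```python
import math
import random

class GaussianNB:
    """Minimal Gaussian Naive Bayes: fit per-class mean/variance, predict by max log-posterior."""
    def fit(self, X, y):
        self.classes = sorted(set(y))
        self.stats, self.priors = {}, {}
        for c in self.classes:
            rows = [x for x, label in zip(X, y) if label == c]
            self.priors[c] = len(rows) / len(X)
            stats = []
            for col in zip(*rows):
                mu = sum(col) / len(col)
                var = sum((v - mu) ** 2 for v in col) / len(col) + 1e-9
                stats.append((mu, var))
            self.stats[c] = stats
        return self

    def predict(self, x):
        def log_post(c):
            lp = math.log(self.priors[c])
            for v, (mu, var) in zip(x, self.stats[c]):
                lp += -0.5 * math.log(2 * math.pi * var) - (v - mu) ** 2 / (2 * var)
            return lp
        return max(self.classes, key=log_post)

# Synthetic "wing parameter" data for two hypothetical species.
random.seed(0)
X = [[random.gauss(200, 10), random.gauss(0.5, 0.05)] for _ in range(50)] + \
    [[random.gauss(260, 10), random.gauss(0.7, 0.05)] for _ in range(50)]
y = ["sp_A"] * 50 + ["sp_B"] * 50
clf = GaussianNB().fit(X, y)
print(clf.predict([255, 0.68]))  # a thick, high-reflectance wing falls in sp_B
```

Naive Bayes suits this setting because each optical parameter contributes an independent likelihood term, so the model stays interpretable and trains on few samples per class.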

Language: English

Can artificial intelligence be integrated into pest monitoring schemes to help achieve sustainable agriculture? An entomological, management and computational perspective DOI Creative Commons
Daniel J. Leybourne, Nasamu Musa, Po Yang

et al.

Agricultural and Forest Entomology, Journal Year: 2024, Volume and Issue: unknown

Published: May 16, 2024

Abstract Recent years have seen significant advances in artificial intelligence (AI) technology. This advancement has enabled the development of decision support systems that assist farmers with herbivorous pest identification and monitoring. In these systems, AI supports decision making through the detection, classification, and quantification of pests. However, many systems under development fall short of meeting the demands of the end user, with these shortfalls acting as obstacles that impede integration into integrated pest management (IPM) practices. There are four common challenges that restrict the uptake of AI‐driven decision support systems, namely: technology effectiveness, functionality under field conditions, the level of computational expertise and power required to use and run the system, and system mobility. We propose criteria that systems need to meet in order to overcome these challenges: (i) The system should be based on effective and efficient AI; (ii) The system should be adaptable and capable of handling ‘real‐world’ image data collected from the field; (iii) Systems should be user‐friendly, device‐driven, and low‐cost; (iv) Systems should be mobile and deployable under multiple weather and climate conditions. Systems meeting these criteria are likely to represent innovative and transformative systems that successfully integrate IPM principles into tools that can be used by farmers.

Language: English

Citations

5

A deep learning pipeline for time-lapse camera monitoring of insects and their floral environments DOI Creative Commons
Kim Bjerge, Henrik Karstoft, Hjalte M. R. Mann

et al.

Ecological Informatics, Journal Year: 2024, Volume and Issue: unknown, P. 102861 - 102861

Published: Oct. 1, 2024

Language: English

Citations

5

Using individual‐based trait frequency distributions to forecast plant‐pollinator network responses to environmental change DOI Creative Commons
Aoife Cantwell‐Jones, Jason M. Tylianakis, Keith Larson

et al.

Ecology Letters, Journal Year: 2024, Volume and Issue: 27(1)

Published: Jan. 1, 2024

Abstract Determining how and why organisms interact is fundamental to understanding ecosystem responses to future environmental change. To assess the impact of environmental change on plant‐pollinator interactions, recent studies have examined how the effects of change on individual interactions accumulate to generate species‐level responses. Here, we review developments in the use of networks of interacting individuals along with their functional traits, where individuals are nested within species nodes. We highlight how these individual‐level, trait‐based networks connect intraspecific trait variation (as frequency distributions of multiple traits) with dynamic interactions and communities. This approach can better explain interaction plasticity, and changes in interaction probabilities and network structure over spatiotemporal or other environmental gradients. We argue that only through appreciating such plasticity can we accurately forecast the potential vulnerability of interactions to environmental change. We follow this general guidance on how to collect and analyse high‐resolution interaction data, in the hope of improving predictions for targeted and effective conservation.
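One way to picture the trait-frequency-distribution idea in the abstract is to score a potential interaction by the overlap between two trait distributions. The traits, bin edges, and distribution parameters below are illustrative assumptions, not values from the paper:

```python
import random

def hist(values, edges):
    """Normalized frequency distribution of a trait over fixed bins."""
    counts = [0] * (len(edges) - 1)
    for v in values:
        for i in range(len(counts)):
            if edges[i] <= v < edges[i + 1]:
                counts[i] += 1
                break
    total = sum(counts) or 1
    return [c / total for c in counts]

def overlap(p, q):
    """Distribution overlap (sum of bin-wise minima), used here as a crude interaction probability."""
    return sum(min(a, b) for a, b in zip(p, q))

random.seed(1)
edges = [i * 2.0 for i in range(11)]                  # trait bins covering 0-20 mm
tongues = [random.gauss(9, 2) for _ in range(300)]    # pollinator proboscis lengths (hypothetical)
corollas = [random.gauss(11, 2) for _ in range(300)]  # flower tube depths (hypothetical)
p_interact = overlap(hist(tongues, edges), hist(corollas, edges))
print(round(p_interact, 2))
```

Because the score is computed from the full frequency distributions rather than species means, a shift in either distribution (e.g., under warming) changes the predicted interaction probability directly.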

Language: English

Citations

4

A smartphone application for site-specific pest management based on deep learning and spatial interpolation DOI

Congliang Zhou, Won Suk Lee, Shuhao Zhang

et al.

Computers and Electronics in Agriculture, Journal Year: 2024, Volume and Issue: 218, P. 108726 - 108726

Published: Feb. 16, 2024

Language: English

Citations

4

Deep learning for identifying bee species from images of wings and pinned specimens DOI Creative Commons
Brian J. Spiesman, Claudio Gratton, Elena M. Gratton

et al.

PLoS ONE, Journal Year: 2024, Volume and Issue: 19(5), P. e0303383 - e0303383

Published: May 28, 2024

One of the most challenging aspects of bee ecology and conservation is species-level identification, which is costly, time consuming, and requires taxonomic expertise. Recent advances in the application of deep learning and computer vision have shown promise for identifying large bumble bee ( Bombus ) species. However, many bees, such as sweat bees in the genus Lasioglossum , are much smaller and can be difficult, even for trained taxonomists, to identify. For this reason, the great majority of bee species are poorly represented in the crowdsourced image datasets often used to train such models. Even larger bees, such as bumble bees from the B . vagans complex, can be difficult to separate morphologically. Using images of specimens from our research collections, we assessed how classification models perform on these more difficult taxa, qualitatively comparing models trained on images of whole pinned specimens or of forewings. The specimen and wing image datasets represent 20 and 18 species from 6 and 4 genera, respectively, and were used to train an EfficientNetV2L convolutional neural network. Mean test precision was 94.9% and 98.1%, respectively. Results show that deep learning holds promise for classifying smaller, difficult-to-identify bees that are underrepresented in crowdsourced datasets. Images of museum collections will be valuable for expanding datasets to include additional species, which is essential to scale up monitoring efforts.
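The "mean test precision" figures reported above are per-class precisions averaged over classes, which can be computed from a confusion matrix. The 3-class matrix below is hypothetical, purely to show the arithmetic:

```python
def per_class_precision(conf):
    """conf[i][j] = count of true class i predicted as class j.
    precision for class j = correct predictions / all predictions of class j (column sum)."""
    n = len(conf)
    precisions = []
    for j in range(n):
        col = sum(conf[i][j] for i in range(n))
        precisions.append(conf[j][j] / col if col else 0.0)
    return precisions

# Hypothetical 3-species confusion matrix (rows = true class, cols = predicted class).
conf = [[48, 1, 1],
        [2, 45, 3],
        [0, 2, 48]]
prec = per_class_precision(conf)
mean_precision = sum(prec) / len(prec)
print([round(p, 3) for p in prec], round(mean_precision, 3))
```

Averaging per-class precision (rather than pooling all predictions) weights rare and common species equally, which matters when some taxa have few test images.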

Language: English

Citations

4

Emerging technologies for fast determination of nutritional quality and safety of insects for food and feed: A review DOI

Frank Ssemakula, Sarah Nawoya, Catherine Nkirote Kunyanga

et al.

Published: Jan. 1, 2025

Language: English

Citations

0

In-field monitoring of ground-nesting insect aggregations using a scaleable multi-camera system DOI Creative Commons
Daniela Calvus, Karoline Wueppenhorst, R.E. Schlosser

et al.

Ecological Informatics, Journal Year: 2025, Volume and Issue: unknown, P. 103004 - 103004

Published: Jan. 1, 2025

Language: English

Citations

0

Advanced Insect Detection Network for UAV-Based Biodiversity Monitoring DOI Creative Commons
Halimjon Khujamatov, Shakhnoza Muksimova, Mirjamol Abdullaev

et al.

Remote Sensing, Journal Year: 2025, Volume and Issue: 17(6), P. 962 - 962

Published: March 9, 2025

The Advanced Insect Detection Network (AIDN), which represents a significant advancement in the application of deep learning for ecological monitoring, is specifically designed to enhance the accuracy and efficiency of insect detection from unmanned aerial vehicle (UAV) imagery. Utilizing a novel architecture that incorporates advanced activation and normalization techniques, multi-scale feature fusion, and a custom-tailored loss function, AIDN addresses the unique challenges posed by the small size, high mobility, and diverse backgrounds of insects in aerial images. In comprehensive testing against established models, AIDN demonstrated superior performance, achieving 92% precision, 88% recall, an F1-score of 90%, and a mean Average Precision (mAP) score of 89%. These results signify a substantial improvement over traditional models such as YOLO v4, SSD, and Faster R-CNN, which typically show performance metrics approximately 10–15% lower across similar tests. The practical implications of AIDN are profound, offering significant benefits for agricultural management and biodiversity conservation. By automating insect detection and classification processes, AIDN reduces the labor-intensive tasks of manual monitoring, enabling more frequent and accurate data collection. This improvement in data collection quality and frequency enhances decision making in pest management and conservation, leading to more effective interventions and strategies. AIDN's design and capabilities set a new standard in the field, promising scalable and effective solutions for UAV-based insect monitoring. Its ongoing development is expected to integrate additional sensory data and real-time adaptive learning to further broaden its applicability, ensuring its role as a transformative tool in insect monitoring and environmental science.
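The reported metrics are internally consistent: the F1-score is the harmonic mean of precision and recall, so the stated 92% precision and 88% recall should indeed yield roughly 90%. A quick check:

```python
def f1(precision, recall):
    """F1-score: harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Reported AIDN numbers: P = 0.92, R = 0.88 -> F1 should be about 0.90.
score = f1(0.92, 0.88)
print(round(score, 3))
```

The harmonic mean penalizes imbalance between the two rates, which is why F1 (here about 0.8996) sits slightly below the arithmetic mean of 0.90.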

Language: English

Citations

0

Attention mechanism‐based ultralightweight deep learning method for automated multi‐fruit disease recognition system DOI
Moshiur Rahman Tonmoy, Md. Akhtaruzzaman Adnan, Shah Murtaza Rashid Al Masud

et al.

Agronomy Journal, Journal Year: 2025, Volume and Issue: 117(2)

Published: March 1, 2025

Abstract Automated disease recognition plays a pivotal role in advancing smart artificial intelligence (AI)‐based agriculture and is crucial for achieving higher crop yields. Although substantial research has been conducted on deep learning‐based automated plant disease recognition systems, these efforts have predominantly focused on leaf diseases while neglecting diseases affecting fruits. We propose an efficient architecture for effective fruit disease recognition with state‐of‐the‐art performance to address this gap. Our method integrates advanced techniques, such as multi‐head attention mechanisms and lightweight convolutions, to enhance both efficiency and performance. Its ultralightweight design emphasizes minimizing computational costs, ensuring compatibility with memory‐constrained edge devices, and enhancing accessibility and practical usability. Experimental evaluations were conducted on three diverse datasets containing multi‐class images of disease‐affected and healthy samples of sugar apple ( Annona squamosa ), pomegranate ( Punica granatum ), and guava ( Psidium guajava ). The proposed model attained exceptional results, with test set accuracies and weighted precision, recall, and f1‐scores exceeding 99%, and also outperformed pretrained large‐scale models. Combining high accuracy with an ultralightweight design represents a significant step forward in developing accessible AI solutions for agriculture, contributing to the advancement of sustainable agriculture.

Language: English

Citations

0

A deep learning pipeline for time-lapse camera monitoring of floral environments and insect populations DOI Creative Commons
Kim Bjerge, Henrik Karstoft, Hjalte M. R. Mann

et al.

bioRxiv (Cold Spring Harbor Laboratory), Journal Year: 2024, Volume and Issue: unknown

Published: April 15, 2024

Abstract Arthropods, including insects, represent the most diverse animal group and contribute significantly to animal biomass. Automatic monitoring of insects and other arthropods enables quick and efficient observation and management of ecologically and economically important targets such as pollinators, natural enemies, disease vectors, and agricultural pests. The integration of cameras and computer vision facilitates innovative monitoring approaches for agriculture, ecology, entomology, evolution, and biodiversity. However, studying insects and their interactions with flowers and vegetation in natural environments remains challenging, even with automated camera monitoring. This paper presents a comprehensive methodology to monitor the abundance and diversity of wild insects and to quantify floral cover as a key resource. We apply the methods across more than 10 million images recorded over two years using 48 insect camera traps placed in three main habitat types. The study focuses on arthropods, their flower visits, and a specific mix of Sedum plant species with white, yellow, and red/pink colored flowers. The proposed deep-learning pipeline estimates flower cover and detects and classifies arthropod taxa from time-lapse recordings. Flower cover serves not only as an estimate of the available resource but also allows insect activity to be correlated with flowering plants. Color and semantic segmentation with DeepLabv3 are combined to estimate the percent cover of flowers of different colors. Arthropod detection incorporates motion-informed enhanced object detection with You-Only-Look-Once (YOLO), followed by filtering of stationary objects to minimize double counting of non-moving animals and erroneous background detections. This approach has been demonstrated to decrease the incidence of false positives, since they occur in less than 3% of the captured images. The final step involves grouping detected arthropods into 19 taxonomic classes. Seven state-of-the-art models were trained and validated, achieving F1-scores ranging from 0.81 to 0.89 in the classification of arthropods. Among these, the selected model, EfficientNetB4, achieved an 80% average precision on randomly selected samples when applied to the complete pipeline, which includes detection, filtering, and classification of images collected in 2021. As expected, during the beginning and end of the season, reduced flower cover correlates with a noticeable drop in detected arthropods. The method offers a cost-effective
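The stationary-object filtering step described above can be sketched with a simple IoU-based rule: a detection whose box stays in nearly the same place across several consecutive time-lapse frames is treated as a non-moving object or background artifact and dropped. The thresholds and box format here are illustrative assumptions, not the paper's actual parameters:

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def filter_stationary(frames, thresh=0.7, min_frames=3):
    """Drop a detection once a matching box (IoU > thresh) has persisted
    for at least min_frames consecutive frames ending at the current one."""
    keep = []
    for t, boxes in enumerate(frames):
        for box in boxes:
            run = 1
            for prev in range(t - 1, -1, -1):
                if any(iou(box, b) > thresh for b in frames[prev]):
                    run += 1
                else:
                    break
            if run < min_frames:
                keep.append((t, box))
    return keep

# Four frames: one stationary box and one box moving 20 px per frame.
still = (10, 10, 30, 30)
frames = [[still, (50 + 20 * t, 50, 70 + 20 * t, 70)] for t in range(4)]
kept = filter_stationary(frames)
print(len(kept))  # the stationary box is suppressed from frame 3 onward
```

Suppressing repeats only after a short run keeps briefly resting insects countable while removing debris or background structure that never moves.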

Language: English

Citations

3