Deep dive into KABR: a dataset for understanding ungulate behavior from in-situ drone video
Maksym Kholiavchenko, Jenna Kline, Maksim V. Kukushkin et al.

Multimedia Tools and Applications, Journal Year: 2024, Volume and Issue: unknown

Published: Dec. 21, 2024

Language: English

YOLO-Behaviour: A simple, flexible framework to automatically quantify animal behaviours from videos
Alex Hoi Hang Chan, Prasetia Utama Putra, Harald T. Schupp et al.

Methods in Ecology and Evolution, Journal Year: 2025, Volume and Issue: unknown

Published: Feb. 12, 2025

Abstract Manually coding behaviours from videos is essential to the study of animal behaviour, but it is labour-intensive and susceptible to inter-rater bias and reliability issues. Recent developments in computer vision tools enable the automatic quantification of behaviours, supplementing or even replacing manual annotation. However, widespread adoption of these methods is still limited, due to the lack of annotated training datasets and the domain-specific knowledge required to optimize these models for research. Here, we present YOLO-Behaviour, a flexible framework for identifying visually distinct behaviours from video recordings. The framework is robust, easy to implement, and requires minimal annotations as training data. We demonstrate its flexibility with case studies of event-wise detection in house sparrow nestling provisioning, Siberian jay feeding and human eating, and of frame-wise detection of various behaviours in pigeons, zebras and giraffes. Our results show that the framework reliably and accurately detects behaviours and retrieves accuracy metrics comparable to manual annotation. However, some extracted metrics were less correlated with manual annotation, and potential reasons for the discrepancy between the framework and manual annotation are discussed. To mitigate this problem, the framework can be used in a hybrid approach of first detecting events with the pipeline and then manually confirming the detections, saving annotation time. We provide detailed documentation and guidelines on how to implement the YOLO-Behaviour framework, so that researchers can readily train and deploy new models on their own study systems. We anticipate the framework to be another step towards lowering the barrier of entry for applying computer vision to automatically quantify animal behaviour.
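The framework treats each visually distinct behaviour as a YOLO detection class. As a rough illustration of that idea (a minimal sketch, not the authors' released pipeline), running a trained detector over a video with the ultralytics package might look like the following; the weights file behaviour_model.pt, the video name and the class names are hypothetical placeholders:

```python
# A minimal sketch (not the authors' released pipeline): run a trained YOLO
# detector over a video and log per-frame detections of behaviour classes.
# Assumes the `ultralytics` package; "behaviour_model.pt", the video file
# and the class names are hypothetical placeholders for your own data.
from ultralytics import YOLO

model = YOLO("behaviour_model.pt")  # hypothetical weights trained on annotated clips

# stream=True yields one Results object per frame instead of buffering the video
for frame_idx, result in enumerate(model.predict("colony_video.mp4", stream=True)):
    for box in result.boxes:
        cls_name = model.names[int(box.cls)]   # e.g. "feeding" (hypothetical class)
        conf = float(box.conf)
        x1, y1, x2, y2 = box.xyxy[0].tolist()  # bounding box in pixel coordinates
        print(f"frame {frame_idx}: {cls_name} ({conf:.2f}) at "
              f"({x1:.0f},{y1:.0f})-({x2:.0f},{y2:.0f})")
```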

Language: English

Citations: 1

Swin-Panda: Behavior Recognition for Giant Pandas Based on Local Fine-Grained and Spatiotemporal Displacement Features
Xinyu Yi, Han Su, Min Peng et al.

Diversity, Journal Year: 2025, Volume and Issue: 17(2), P. 139 - 139

Published: Feb. 19, 2025

The giant panda, a rare and iconic species endemic to China, has attracted significant attention from both domestic and international researchers due to its crucial ecological role, unique cultural value, and distinct evolutionary history. While substantial progress has been made in the field of individual identification, behavior recognition remains underdeveloped, facing challenges such as the lack of dynamic temporal features and the insufficient extraction of behavioral characteristics. To address these challenges, we propose the Swin-Panda model, which leverages transfer learning based on the Video Swin Transformer architecture within the mmaction2 framework. In addition, we introduce two novel modules: the Comprehensive Perception Auxiliary Module and the Spatiotemporal Shift Attention Module. These modules facilitate the extraction of local and spatiotemporal information, allowing the model to capture the movement patterns of pandas more effectively. Experimental results on the PACV-8 dataset demonstrate that our model achieves an accuracy of 88.02%, outperforming several benchmark models. This approach significantly enhances recognition accuracy, thereby contributing to the advancement of panda welfare and conservation efforts.
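For readers unfamiliar with the underlying approach: transfer learning from a Kinetics-pretrained Video Swin Transformer amounts to swapping the classification head and fine-tuning on the target behaviour clips. The sketch below illustrates this generically with torchvision's swin3d_t rather than the authors' mmaction2 configuration; the 8-class head is an assumption inferred from the PACV-8 dataset name.

```python
# Generic transfer-learning sketch of the idea behind Swin-Panda (not the
# authors' code): fine-tune a Video Swin Transformer on a small behaviour
# dataset. Uses torchvision's swin3d_t; the 8-class head is assumed from
# the "PACV-8" dataset name.
import torch
import torch.nn as nn
from torchvision.models.video import swin3d_t, Swin3D_T_Weights

NUM_BEHAVIOURS = 8  # assumption based on the PACV-8 dataset name

model = swin3d_t(weights=Swin3D_T_Weights.KINETICS400_V1)  # Kinetics-400 pretraining
model.head = nn.Linear(model.head.in_features, NUM_BEHAVIOURS)  # new classifier head

# Freeze the backbone and train only the new head (a common starting point).
for name, param in model.named_parameters():
    param.requires_grad = name.startswith("head")

optimizer = torch.optim.AdamW(
    (p for p in model.parameters() if p.requires_grad), lr=1e-4
)
criterion = nn.CrossEntropyLoss()

# One dummy training step on a random clip batch: (batch, channels, frames, H, W)
clips = torch.randn(2, 3, 16, 224, 224)
labels = torch.randint(0, NUM_BEHAVIOURS, (2,))
loss = criterion(model(clips), labels)
loss.backward()
optimizer.step()
print(f"dummy step loss: {loss.item():.3f}")
```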

Language: English

Citations: 0

Computer vision for primate behavior analysis in the wild
Richard Vogg, Timo Lüddecke, Jonathan Henrich et al.

Nature Methods, Journal Year: 2025, Volume and Issue: unknown

Published: April 10, 2025

Language: English

Citations: 0

Peering into the world of wild passerines with 3D-SOCS: synchronized video capture and posture estimation
Michael Chimento, Alex Hoi Hang Chan, Lucy M. Aplin et al.

bioRxiv (Cold Spring Harbor Laboratory), Journal Year: 2024, Volume and Issue: unknown

Published: July 2, 2024

Abstract Collection of large behavioral datasets on wild animals in natural habitats is vital in ecology and evolution studies. Recent progress in machine learning and computer vision, combined with inexpensive microcomputers, has unlocked a new frontier of fine-scale markerless measurements. Here, we leverage these advancements to develop the 3D Synchronized Outdoor Camera System (3D-SOCS): an inexpensive, mobile and automated method for collecting behavioral data on wild animals using synchronized video frames from Raspberry Pi controlled cameras. Accuracy tests demonstrate that 3D-SOCS' tracking can estimate postures with a 3 mm tolerance. To illustrate its research potential, we place 3D-SOCS in the field and conduct a stimulus presentation experiment. We estimate 3D trajectories for multiple individuals of different bird species, and use these data to characterize the visual field configuration of great tits (Parus major), a model species in ecology. We find their optic axes at approximately ±60° azimuth and −5° elevation. Furthermore, birds exhibit functional lateralization in their use of the right eye with a conspecific stimulus, and show individual differences in lateralization. We also show that the birds' convex hulls predict body weight, highlighting the potential for non-invasive population monitoring. 3D-SOCS is a first-of-its-kind camera system for wild animal research, presenting an exciting way to measure the fine-scaled behavior and morphology of wild birds.
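The core geometric step behind multi-view posture estimation of this kind is triangulation: a keypoint detected in two or more calibrated, synchronized views is lifted to a 3D point. Below is a minimal sketch with OpenCV (our own illustration, not the 3D-SOCS codebase; the projection matrices and pixel coordinates are made-up stand-ins for real calibration output):

```python
# Minimal sketch of the triangulation step behind 3D-SOCS-style posture
# estimation (not the authors' released code): lift a keypoint seen in two
# synchronized, calibrated cameras into 3D. All numbers are hypothetical.
import numpy as np
import cv2

# 3x4 projection matrices P = K [R | t] from camera calibration (hypothetical)
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])                  # reference camera
P2 = np.hstack([np.eye(3), np.array([[-0.1], [0.0], [0.0]])])  # 10 cm baseline

# The same keypoint (e.g. a bird's beak tip) detected in each synchronized frame
pt1 = np.array([[320.0], [240.0]])  # pixel (x, y) in camera 1
pt2 = np.array([[300.0], [240.0]])  # pixel (x, y) in camera 2

point_h = cv2.triangulatePoints(P1, P2, pt1, pt2)  # 4x1 homogeneous coordinates
point_3d = (point_h[:3] / point_h[3]).ravel()      # back to Euclidean XYZ
print("triangulated point (camera-1 frame):", point_3d)
```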

Language: English

Citations: 3

YOLO-Behaviour: A simple, flexible framework to automatically quantify animal behaviours from videos
Alex Hoi Hang Chan, Prasetia Utama Putra, Harald T. Schupp et al.

bioRxiv (Cold Spring Harbor Laboratory), Journal Year: 2024, Volume and Issue: unknown

Published: Aug. 27, 2024

Abstract Manually coding behaviours from videos is essential to the study of animal behaviour, but it is labour-intensive and susceptible to inter-rater bias and reliability issues. Recent developments in computer vision tools enable the automatic quantification of behaviours, supplementing or even replacing manual annotations. However, widespread adoption of these methods is still limited, due to the lack of annotated training datasets and the domain-specific knowledge required to optimize these models for research. Here, we present YOLO-Behaviour, a flexible framework for identifying visually distinct behaviours from video recordings. The framework is robust, easy to implement, and requires minimal annotations as training data. We demonstrate its flexibility with case studies of event-wise detection in house sparrow nestling provisioning, Siberian jay feeding and human eating, and of frame-wise detection of various behaviours in pigeons, zebras, and giraffes. Our results show that the framework reliably detects behaviours accurately and retrieves accuracy metrics comparable to manual annotation. However, some extracted metrics were less correlated with manual annotation, and potential reasons for the discrepancy between the framework and manual annotation are discussed. To mitigate this problem, the framework can be used in a hybrid approach of first detecting events with the pipeline and then manually confirming the detections, saving annotation time. We provide detailed documentation and guidelines on how to implement the YOLO-Behaviour framework, so that researchers can readily train and deploy new models on their own study systems. We anticipate the framework to be another step towards lowering the barrier of entry for applying computer vision to automatically quantify animal behaviour.
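The hybrid event-confirmation workflow described above reduces, in essence, to merging runs of consecutive detection frames into candidate events that a human then verifies. A minimal sketch of such post-processing (our own illustration, not the paper's code; the fps and gap threshold are arbitrary assumptions):

```python
# Sketch of the event-wise post-processing idea described above (our own
# illustration, not the paper's code): merge consecutive frames in which a
# behaviour was detected into discrete events, so a human can then confirm
# each event instead of scoring every frame.
def frames_to_events(detected_frames, fps=25.0, max_gap=5):
    """Group frame indices into (start_s, end_s) events, bridging gaps of
    up to `max_gap` frames (both fps and threshold are assumptions)."""
    events = []
    start = prev = None
    for f in sorted(detected_frames):
        if start is None:
            start = prev = f
        elif f - prev <= max_gap:
            prev = f
        else:
            events.append((start / fps, prev / fps))
            start = prev = f
    if start is not None:
        events.append((start / fps, prev / fps))
    return events

# e.g. feeding detected on these frames -> two candidate events to verify
print(frames_to_events([10, 11, 12, 13, 40, 41, 42], fps=25.0))
# [(0.4, 0.52), (1.6, 1.68)]
```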

Language: English

Citations: 1
