Behavioral Coding of Captive African Elephants (Loxodonta africana): Utilizing DeepLabCut and Create ML for nocturnal activity tracking. DOI Open Access

Silje Marquardsen Lund,

Jonas B. Nielsen,

Frej Gammelgård

et al.

Published: Aug. 28, 2024

This study investigates the possibility of using machine learning models created in DeepLabCut and Create ML to automate aspects of behavioral coding and aid behavioral analysis. Two models with different capabilities and complexities were constructed and compared against a manually observed control period. Their accuracy was assessed before being applied to 7 nights of footage of the nocturnal behavior of two African elephants (Loxodonta africana). The resulting data were used to draw conclusions regarding behavioral differences between individuals and nights, thus proving that such models can aid researchers in behavioral coding. The models were capable of tracking simple behaviors with high accuracy, but had certain limitations in the detection of complex behaviors, such as stereotyped sway, and displayed confusion when deciding between visually similar behaviors. Further expansion of the models may be desired to create models capable of more fully automating behavioral coding.
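The comparison against a manually observed control period described above amounts to measuring agreement between two coders. A minimal sketch of that idea, using percent agreement and Cohen's kappa on per-interval behavior labels (the data and labels below are illustrative, not from the study):

```python
# Illustrative sketch: comparing automated behavioral codes against a
# manually coded control period. Labels and sequences are invented examples.
from collections import Counter

def percent_agreement(manual, auto):
    # Fraction of coding intervals on which the two coders agree.
    matches = sum(m == a for m, a in zip(manual, auto))
    return matches / len(manual)

def cohens_kappa(manual, auto):
    # Agreement corrected for the agreement expected by chance.
    n = len(manual)
    po = percent_agreement(manual, auto)
    manual_counts = Counter(manual)
    auto_counts = Counter(auto)
    labels = set(manual) | set(auto)
    pe = sum((manual_counts[l] / n) * (auto_counts[l] / n) for l in labels)
    return (po - pe) / (1 - pe)

manual = ["walk", "walk", "sway", "feed", "sway", "walk"]
auto   = ["walk", "walk", "feed", "feed", "sway", "walk"]
print(round(percent_agreement(manual, auto), 3))  # 0.833
print(round(cohens_kappa(manual, auto), 3))       # 0.739
```

Kappa is the more informative of the two here, since a model that always predicts the most common behavior can score high raw agreement by chance alone.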

Language: English

Computational bioacoustics with deep learning: a review and roadmap DOI Creative Commons
Dan Stowell

PeerJ, Journal Year: 2022, Volume and Issue: 10, P. e13152 - e13152

Published: March 21, 2022

Animal vocalisations and natural soundscapes are fascinating objects of study, and contain valuable evidence about animal behaviours, populations and ecosystems. They are studied in bioacoustics and ecoacoustics, with signal processing and analysis an important component. Computational bioacoustics has accelerated in recent decades due to the growth of affordable digital sound recording devices, and to huge progress in informatics such as big data, signal processing and machine learning. Methods are inherited from the wider field of deep learning, including speech and image processing. However, the tasks, demands and data characteristics are often different from those addressed in speech or music analysis. There remain unsolved problems, and tasks for which evidence is surely present in many acoustic signals, but not yet realised. In this paper I perform a review of the state of the art in deep learning for computational bioacoustics, aiming to clarify key concepts and identify and analyse knowledge gaps. Based on this, I offer a subjective but principled roadmap for computational bioacoustics with deep learning: topics that the community should aim to address, in order to make the most of future developments in AI and informatics, and to use audio data in answering zoological and ecological questions.

Language: English

Citations

197

DeepWild: Application of the pose estimation tool DeepLabCut for behaviour tracking in wild chimpanzees and bonobos DOI Creative Commons

Charlotte Wiltshire,

James Lewis‐Cheetham,

Viola Komedová

et al.

Journal of Animal Ecology, Journal Year: 2023, Volume and Issue: 92(8), P. 1560 - 1574

Published: May 10, 2023

Abstract Studying animal behaviour allows us to understand how different species and individuals navigate their physical and social worlds. Video coding of behaviour is considered a gold standard: allowing researchers to extract rich and nuanced behavioural datasets, validate their reliability, and for research to be replicated. However, in practice, videos are only useful if data can be efficiently extracted. Manually locating relevant footage in 10,000 s of hours is extremely time‐consuming, as is the manual coding of behaviour, which requires extensive training to achieve reliability. Machine learning approaches can be used to automate the recognition of patterns within data, considerably reducing the time taken and improving reliability. However, tracking visual information to recognise behaviour is a challenging problem and, to date, the pose‐estimation tools used to detect behaviour are typically applied where the environment is highly controlled. Animal behaviour researchers are interested in applying these tools to the study of wild animals, but it is not clear to what extent doing so is currently possible, or which tools are most suited to particular problems. To address this gap in knowledge, we describe the new tools available in this rapidly evolving landscape, suggest guidance for tool selection, provide a worked demonstration of the use of machine learning to track movement in video data of wild apes, and make our base models available for use. We use a pose‐estimation tool, DeepLabCut, to demonstrate successful training of two pilot models on a challenging pose estimation problem: multi‐animal tracking of forest‐living chimpanzees and bonobos across behavioural contexts from hand‐held video footage. With DeepWild we show that, without requiring specific expertise in machine learning, pose estimation of free‐living primates in visually complex environments is an attainable goal for behavioural researchers.

Language: English

Citations

29

The quest to develop automated systems for monitoring animal behavior DOI Creative Commons
Janice M. Siegford, Juan P. Steibel, Junjie Han

et al.

Applied Animal Behaviour Science, Journal Year: 2023, Volume and Issue: 265, P. 106000 - 106000

Published: July 17, 2023

Automated behavior analysis (ABA) strategies are being researched at a rapid rate to detect an array of behaviors across a range of species. There is growing optimism that soon ethologists will not have to manually decode hours (and hours) of animal videos, but instead computers will process them for us. However, before we assume ABA is ready for practical use, it is important to take a realistic look at exactly what has been developed, the expertise used to develop it, and the context in which these studies occur. Once we understand the common pitfalls occurring during development and identify limitations, we can construct robust tools to achieve automated (ultimately even continuous, real time) collection of behavioral data, allowing more detailed or longer-term observation on larger numbers of animals than ever before. ABA is only as good as it is trained to be. A key starting point is having well annotated data for model training and assessment. However, most ABA developers are not trained in ethology. Often no formal ethogram is developed, and descriptions of target behaviors in publications are limited or inaccurate. In addition, models are also frequently developed using small datasets, which lack sufficient variability in animal morphometrics, activities, camera viewpoints, and environmental features to be generalizable. Thus, ABA often needs to be further validated to perform satisfactorily on different populations of animals, under other conditions, or for other research purposes. Multidisciplinary teams of researchers including ethologists and ethicists as well as computer scientists and engineers are needed to help address these problems when applying computer vision to measure animal behavior. Reference datasets for behavior detection should be generated and shared that include image data and annotations, baseline analyses, and benchmarking. Also critical are standards for creating such reference datasets and best practices for methods of validating results from ABA to ensure they are accurate. At present, only a handful of publicly available reference datasets and tools exist. As we work to realize the promise of ABA (and subsequent precision livestock farming technologies) for monitoring animal behavior, a clear understanding of best practices, access to accurately annotated datasets, and networking among researchers will increase our chances of success.

Language: English

Citations

25

Hierarchical action encoding in prefrontal cortex of freely moving macaques DOI Creative Commons
Benjamin Voloh, David J.-N. Maisson, Roberto Lopez Cervera

et al.

Cell Reports, Journal Year: 2023, Volume and Issue: 42(9), P. 113091 - 113091

Published: Aug. 31, 2023

Our natural behavioral repertoires include coordinated actions of characteristic types. To better understand how neural activity relates to the expression of actions and to action switches, we studied macaques performing a freely moving foraging task in an open environment. We developed a novel analysis pipeline that can identify meaningful units of behavior, corresponding to recognizable actions such as sitting, walking, jumping, and climbing. On the basis of transition probabilities between these actions, we found that behavior is organized in a modular and hierarchical fashion. We found that, after regressing out many potential confounders, actions are associated with specific patterns of firing in each of six prefrontal brain regions and that, overall, encoding of action category is progressively stronger in more dorsal and more caudal regions. Together, these results establish a link between action selection in freely moving primates on the one hand and prefrontal neuronal activity on the other.
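The transition probabilities the authors use to expose modular structure can be estimated directly from a labeled action sequence. A minimal sketch, with an invented sequence (the action names are illustrative, not the study's data):

```python
# Illustrative sketch: estimating transition probabilities between labeled
# actions from a behavioral sequence. The sequence below is invented.
from collections import defaultdict

def transition_probabilities(sequence):
    # Count observed transitions a -> b over consecutive pairs.
    counts = defaultdict(lambda: defaultdict(int))
    for a, b in zip(sequence, sequence[1:]):
        counts[a][b] += 1
    # Normalize each row so outgoing probabilities sum to 1.
    probs = {}
    for a, nxt in counts.items():
        total = sum(nxt.values())
        probs[a] = {b: c / total for b, c in nxt.items()}
    return probs

seq = ["sit", "walk", "climb", "walk", "sit", "walk"]
p = transition_probabilities(seq)
print(p["walk"])  # {'climb': 0.5, 'sit': 0.5}
print(p["sit"])   # {'walk': 1.0}
```

Clustering the resulting matrix (e.g. by community detection on the transition graph) is one common way to recover the modular, hierarchical organization the abstract refers to.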

Language: English

Citations

23

KABR: In-Situ Dataset for Kenyan Animal Behavior Recognition from Drone Videos DOI

Maksim Kholiavchenko,

Jenna Kline,

Michelle Ramírez

et al.

Published: Jan. 1, 2024

We present a novel dataset for animal behavior recognition collected in-situ using video from drones flown over the Mpala Research Centre in Kenya. Videos from DJI Mavic 2S drones flown in January 2023 were acquired at 5.4K resolution in accordance with IACUC protocols, and processed to detect and track each animal in the frames. An image subregion centered on each animal was extracted and combined in sequence to form a "mini-scene". Behaviors were then manually labeled for each frame of each mini-scene by a team of annotators overseen by an expert behavioral ecologist. The resulting labeled mini-scenes form our dataset, consisting of more than 10 hours of annotated videos of reticulated giraffes, plains zebras, and Grevy's zebras, encompassing seven types of behavior and an additional category for occlusions. Benchmark results for state-of-the-art behavior recognition architectures show labeling accuracy of 61.9% macro-average (per class) and 86.7% micro-average (per instance). Our dataset complements recent larger, more diverse datasets and smaller, more specialized ones by being collected in-situ by drones, both important considerations for future animal behavior research. The dataset can be accessed at https://dirtmaxim.github.io/kabr.
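The "mini-scene" construction described above reduces to cropping a fixed-size window centered on each tracked animal, clamped so the crop stays inside the frame. A minimal geometric sketch (window size and coordinates are illustrative, not the dataset's actual parameters):

```python
# Illustrative sketch of the "mini-scene" idea: a fixed-size subregion
# centered on a tracked animal, clamped to the frame bounds. The crop size
# here is an arbitrary example, not the KABR pipeline's actual value.
def mini_scene_window(cx, cy, frame_w, frame_h, size=128):
    half = size // 2
    # Shift the window so it never extends past the frame edges.
    x0 = min(max(cx - half, 0), frame_w - size)
    y0 = min(max(cy - half, 0), frame_h - size)
    return x0, y0, x0 + size, y0 + size

# Center well inside a 1920x1080 frame:
print(mini_scene_window(400, 300, 1920, 1080))  # (336, 236, 464, 364)
# Center near the top-left corner: the window is clamped to stay in frame.
print(mini_scene_window(10, 10, 1920, 1080))    # (0, 0, 128, 128)
```

Applying the same window to each frame of a track, in order, yields the per-animal clip that annotators then label frame by frame.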

Language: English

Citations

13

Elephants and algorithms: a review of the current and future role of AI in elephant monitoring DOI Creative Commons
Leandra Brickson,

Libby Zhang,

Fritz Vollrath

et al.

Journal of The Royal Society Interface, Journal Year: 2023, Volume and Issue: 20(208)

Published: Nov. 1, 2023

Artificial intelligence (AI) and machine learning (ML) present revolutionary opportunities to enhance our understanding of animal behaviour and conservation strategies. Using elephants, a crucial species in Africa and Asia's protected areas, as a focal point, we delve into the role of AI and ML in their conservation. Given the increasing amounts of data gathered from a variety of sensors like cameras, microphones, geophones, drones and satellites, the challenge lies in managing and interpreting this vast data. New AI and ML techniques offer solutions to streamline this process, helping us extract vital information that might otherwise be overlooked. This paper focuses on the different AI-driven monitoring methods and their potential for improving elephant conservation. Collaborative efforts between AI experts and ecological researchers are essential in leveraging these innovative technologies for enhanced wildlife conservation, setting a precedent for numerous other species.

Language: English

Citations

19

PanAf20K: A Large Video Dataset for Wild Ape Detection and Behaviour Recognition DOI Creative Commons
Otto Brookes, Majid Mirmehdi,

Colleen Stephens

et al.

International Journal of Computer Vision, Journal Year: 2024, Volume and Issue: 132(8), P. 3086 - 3102

Published: March 4, 2024

Abstract We present the PanAf20K dataset, the largest and most diverse open-access annotated video dataset of great apes in their natural environment. It comprises more than 7 million frames across ~20,000 camera trap videos of chimpanzees and gorillas collected at 18 field sites in tropical Africa as part of the Pan African Programme: The Cultured Chimpanzee. The footage is accompanied by a rich set of annotations and benchmarks making it suitable for training and testing a variety of challenging and ecologically important computer vision tasks including ape detection and behaviour recognition. Furthering AI analysis of camera trap information is critical given the International Union for Conservation of Nature now lists all species in the great ape family as either Endangered or Critically Endangered. We hope the dataset can form a solid basis for engagement of the AI community to improve performance, efficiency, and result interpretation in order to support assessments of great ape presence, abundance, distribution, and behaviour and thereby aid conservation efforts. The dataset and code are available from the project website:

Language: English

Citations

6

Automated face recognition using deep neural networks produces robust primate social networks and sociality measures DOI Creative Commons
Daniel Schofield, Gregory F. Albery, Josh A. Firth

et al.

Methods in Ecology and Evolution, Journal Year: 2023, Volume and Issue: 14(8), P. 1937 - 1951

Published: July 24, 2023

Abstract Longitudinal video archives of behaviour are crucial for examining how sociality shifts over the lifespan in wild animals. New approaches adopting computer vision technology hold serious potential to capture interactions and associations between individuals at large scale; however, such methods need a priori validation, as methods of sampling and defining the edges of social networks can substantially impact results. Here, we apply a deep learning face recognition model to generate association networks of wild chimpanzees using a 17‐year video archive from Bossou, Guinea. Using 7 million detections from 100 h of video footage, we examined how varying the size of fixed temporal windows (i.e. aggregation rates) impacts individual‐level gregariousness scores. The highest and lowest aggregation rates produced divergent values, indicating that each rate captures different association patterns. To avoid bias from false positives and false negatives in automated detection, an intermediate aggregation rate should be used to reduce error across multiple variables. Individual‐level network‐derived traits were highly repeatable, indicating strong inter‐individual variation in association patterns and highlighting the reliability of the method to capture consistent patterns over time. We found no reliable effects of age and sex on gregariousness, and despite a significant drop in population size over the study period, individual gregariousness estimates remained stable. We believe our framework will have broad utility in ethology and conservation, enabling the investigation of animal sociality from video footage at large scale, low cost and high reproducibility. We explore the implications of our findings for understanding sociality in wild ape populations. Furthermore, we examine the trade‐offs involved in using automated association measures. Finally, we outline the steps for broader deployment of this technology for the analysis of large‐scale datasets in ecology and evolution.
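The aggregation-rate idea tested above can be sketched in a few lines: detections of (timestamp, individual) are binned into fixed temporal windows, individuals co-occurring in a window share an edge, and gregariousness is an individual's summed edge weight. The data and window size below are invented for illustration:

```python
# Illustrative sketch of windowed association networks: co-occurrence within
# a fixed temporal window defines an edge. Detections here are invented.
from collections import defaultdict
from itertools import combinations

def association_edges(detections, window_s):
    # Bin detections (timestamp_seconds, individual_id) into windows.
    windows = defaultdict(set)
    for t, ind in detections:
        windows[int(t // window_s)].add(ind)
    # Each co-occurrence within a window increments the pair's edge weight.
    edges = defaultdict(int)
    for members in windows.values():
        for a, b in combinations(sorted(members), 2):
            edges[(a, b)] += 1
    return dict(edges)

def gregariousness(edges):
    # An individual's gregariousness: sum of its edge weights.
    score = defaultdict(int)
    for (a, b), w in edges.items():
        score[a] += w
        score[b] += w
    return dict(score)

dets = [(0, "A"), (5, "B"), (12, "A"), (14, "C"), (21, "B")]
edges = association_edges(dets, window_s=10)
print(edges)                  # {('A', 'B'): 1, ('A', 'C'): 1}
print(gregariousness(edges))  # {'A': 2, 'B': 1, 'C': 1}
```

Varying `window_s` reproduces the paper's central methodological point: a wide window merges distinct encounters into one edge, while a narrow one fragments a single association into many, so the two extremes yield divergent gregariousness scores.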

Language: English

Citations

15

Dynamic Curriculum Learning for Great Ape Detection in the Wild DOI Creative Commons
Xinyu Yang, Tilo Burghardt, Majid Mirmehdi

et al.

International Journal of Computer Vision, Journal Year: 2023, Volume and Issue: 131(5), P. 1163 - 1181

Published: Jan. 16, 2023

Abstract We propose a novel end-to-end curriculum learning approach for sparsely labelled animal datasets, leveraging large volumes of unlabelled data to improve supervised species detectors. We exemplify the method in detail on the task of finding great apes in camera trap footage taken in challenging real-world jungle environments. In contrast to previous semi-supervised methods, our approach adjusts learning parameters dynamically over time and gradually improves detection quality by steering training towards virtuous self-reinforcement. To achieve this, we integrate pseudo-labelling with curriculum learning policies and show how learning collapse can be avoided. We discuss theoretical arguments, ablations, and significant performance improvements against various state-of-the-art systems when evaluating on the Extended PanAfrican Dataset holding approx. 1.8M frames. We also demonstrate that our method can outperform supervised baselines with significant margins on sparse label versions of other animal datasets such as Bees and Snapshot Serengeti. We note that the advantages are strongest for smaller label ratios common in ecological applications. Finally, our approach achieves competitive benchmarks for generic object detection on MS-COCO and PASCAL-VOC, indicating wider applicability of the dynamic curriculum concepts introduced. We publish all relevant source code, network weights, and data access details for full reproducibility.
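Pseudo-labelling with a curriculum, as described above, can be illustrated by a confidence cut-off that follows a schedule over training: strict early on, when the detector is unreliable, and relaxed later. This is a generic sketch of the technique, not the paper's actual policy; all values and names are invented:

```python
# Generic sketch of scheduled pseudo-label selection (NOT the paper's
# actual policy): a confidence threshold that decays linearly over epochs,
# admitting more pseudo-labels as the detector improves.
def threshold(epoch, start=0.95, end=0.70, total_epochs=100):
    # Linear decay from a strict to a permissive confidence cut-off.
    frac = min(epoch / total_epochs, 1.0)
    return start + (end - start) * frac

def select_pseudo_labels(detections, epoch):
    # detections: list of (box, confidence) pairs from the current model.
    t = threshold(epoch)
    return [(box, conf) for box, conf in detections if conf >= t]

dets = [((10, 20, 50, 60), 0.97), ((5, 5, 30, 30), 0.80), ((0, 0, 8, 8), 0.40)]
print(len(select_pseudo_labels(dets, epoch=0)))    # 1
print(len(select_pseudo_labels(dets, epoch=100)))  # 2
```

A fixed low threshold from the start risks the learning collapse the authors mention: early, noisy pseudo-labels get reinforced on each iteration, which is why the schedule begins strict.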

Language: English

Citations

14

GesturalOrigins: A bottom-up framework for establishing systematic gesture data across ape species DOI Creative Commons
Charlotte Grund, Gal Badihi, Kirsty E. Graham

et al.

Behavior Research Methods, Journal Year: 2023, Volume and Issue: 56(2), P. 986 - 1001

Published: March 15, 2023

Abstract Current methodologies present significant hurdles to understanding patterns in the gestural communication of individuals, populations, and species. To address this issue, we present a bottom-up data collection framework for the study of gesture: GesturalOrigins. By "bottom-up", we mean that we minimise a priori structural choices, allowing researchers to define larger concepts (such as 'gesture types', 'response latencies', or 'gesture sequences') flexibly once coding is complete. Data can easily be re-organised to provide replication of, and comparison with, a wide range of published and planned analyses. We present packages, templates, and instructions for the complete coding process. We illustrate the flexibility that our methodological tool offers with worked examples of (great ape) gestural communication, demonstrating differences in the duration of action phases across distinct gesture types and showing how species variation in the latency to respond to requests may be revealed or masked by methodological choices. While GesturalOrigins is built from an ape-centred perspective, the basic framework can potentially be adapted to other communication systems. By making our methods transparent and open access, we hope to enable more direct comparison of findings across research groups, improve collaborations, and advance the field to tackle some of the long-standing questions in comparative gesture research.

Language: English

Citations

14