An inexpensive low-cost video monitoring system for automated recording of behavior and ecological interactions
Thomas Tscheulin, Charles I. Abramson

International Journal of Comparative Psychology, Journal Year: 2023, Volume and Issue: 36(1)

Published: Sept. 25, 2023

Active, real-time observation of behavior is a time-consuming task that is heavily resource-limited. At the same time, simultaneous observation of several individuals is often paramount to increase statistical rigor and eliminate potential temporal or environmental bias, especially in natural settings. This paper describes a low-cost video recording system created using "off-the-shelf" components. The easy-to-use system can automatically record a wide variety of behaviors, related ecological interactions, and evolutionary processes. It is sensitive enough to capture a broad range of animals, from planarians and small insects to humans. It can also be used to measure plants. The system works during daylight hours and at night and can run continuously and autonomously for 48 hours, or longer if motion-triggered capture, bigger-capacity batteries, and larger data storage facilities are used.
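As an illustration of the motion-triggered capture mode mentioned in the abstract, the following is a minimal sketch (not the authors' implementation) of frame-differencing motion detection with OpenCV in Python; the camera index, sensitivity threshold, frame rate, and clip length are assumed values.

```python
# Minimal motion-triggered recording sketch (assumptions: camera index 0,
# 20 fps, arbitrary motion threshold). Not the authors' code.
import cv2
import time

cap = cv2.VideoCapture(0)                      # any off-the-shelf USB camera
fourcc = cv2.VideoWriter_fourcc(*"mp4v")
writer = None
prev_gray = None
record_until = 0.0                             # keep recording briefly after motion stops

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (21, 21), 0)
    if prev_gray is not None:
        diff = cv2.absdiff(prev_gray, gray)
        mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)[1]
        if cv2.countNonZero(mask) > 500:       # hypothetical sensitivity threshold
            record_until = time.time() + 5.0   # extend the recording window
    prev_gray = gray

    if time.time() < record_until:
        if writer is None:                     # open a new clip on first motion
            writer = cv2.VideoWriter("clip_%d.mp4" % int(time.time()),
                                     fourcc, 20.0,
                                     (frame.shape[1], frame.shape[0]))
        writer.write(frame)
    elif writer is not None:
        writer.release()                       # close the clip when motion stops
        writer = None
```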

Language: English

Acoustic-based Models to Assess Herd-level Calves' Emotional State: A Machine Learning Approach
Maíra Martins da Silva, Robson Mateus Freitas Silveira, Gastão Cruz, et al.

Smart Agricultural Technology, Journal Year: 2024, Volume and Issue: unknown, P. 100682 - 100682

Published: Nov. 1, 2024

Language: English

Citations: 1

The Convergence of AI and animal-inspired robots for ecological conservation
Naqash Afzal, Mobeen Ur Rehman, Lakmal Seneviratne, et al.

Ecological Informatics, Journal Year: 2024, Volume and Issue: unknown, P. 102950 - 102950

Published: Dec. 1, 2024

Language: English

Citations: 1

Unsupervised Domain Adaptation for Mitigating Sensor Variability and Interspecies Heterogeneity in Animal Activity Recognition
Seong‐Ho Ahn, Seeun Kim, Dong‐Hwa Jeong, et al.

Animals, Journal Year: 2023, Volume and Issue: 13(20), P. 3276 - 3276

Published: Oct. 20, 2023

Animal activity recognition (AAR) using wearable sensor data has gained significant attention due to its applications in monitoring and understanding animal behavior. However, two major challenges hinder the development of robust AAR models: domain variability and the difficulty of obtaining labeled datasets. To address this issue, this study intensively investigates the impact of unsupervised domain adaptation (UDA) for AAR. We compared three distinct types of UDA techniques: divergence-minimizing, adversarial-based, and reconstruction-based approaches. By leveraging UDA, classifiers learn domain-invariant features, allowing a model trained on a source domain to perform well on a target domain without labels. We evaluated the effectiveness of these techniques on dog movement data and additional data from horses. Applying UDA across sensor positions (neck and back), dog sizes (middle-sized and large-sized), and gender (female and male) within the dog data, as well as across species (dogs and horses), exhibits improvements in classification performance and reduced domain discrepancy. The results highlight the potential of UDA to mitigate domain shift and enhance AAR in various settings and across different species, providing valuable insights for practical real-world scenarios where labeled data is scarce.
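For readers unfamiliar with the divergence-minimizing family of UDA methods compared in the study, the sketch below is a generic illustration, not the study's code: it adds a maximum mean discrepancy (MMD) penalty between source and target feature batches to an ordinary classification loss in PyTorch. The network shape, feature dimension, class count, and loss weight are assumptions.

```python
# Generic divergence-minimizing UDA sketch (MMD penalty); illustrative only,
# not the architecture or hyperparameters used in the cited study.
import torch
import torch.nn as nn

def mmd_rbf(x, y, sigma=1.0):
    """Maximum mean discrepancy with an RBF kernel between two feature batches."""
    def kernel(a, b):
        d = torch.cdist(a, b) ** 2
        return torch.exp(-d / (2 * sigma ** 2))
    return kernel(x, x).mean() + kernel(y, y).mean() - 2 * kernel(x, y).mean()

feature_net = nn.Sequential(nn.Linear(60, 128), nn.ReLU(),
                            nn.Linear(128, 64), nn.ReLU())
classifier = nn.Linear(64, 5)                       # e.g. 5 activity classes (assumed)
opt = torch.optim.Adam(list(feature_net.parameters()) +
                       list(classifier.parameters()), lr=1e-3)
ce = nn.CrossEntropyLoss()
lambda_mmd = 0.5                                    # assumed trade-off weight

def train_step(xs, ys, xt):
    """xs, ys: labeled source batch; xt: unlabeled target batch."""
    fs, ft = feature_net(xs), feature_net(xt)
    loss = ce(classifier(fs), ys) + lambda_mmd * mmd_rbf(fs, ft)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

# Example usage with random stand-in data (windowed accelerometer features):
xs, ys = torch.randn(32, 60), torch.randint(0, 5, (32,))
xt = torch.randn(32, 60)
train_step(xs, ys, xt)
```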

Language: English

Citations: 2

Enhancing Veterinary Behavior Research: Evidence-Based Strategies for Overcoming the Limitations of Underpowered Studies
Matthew O. Parker, James M. Clay

Journal of Veterinary Behavior, Journal Year: 2024, Volume and Issue: 71, P. A3 - A5

Published: Jan. 1, 2024

Language: English

Citations: 0

Automated Prediction of Spawning Nights Using Machine Learning Analysis of Flatfish Behaviour
Abdul Qadir, Neil Duncan, Wendy Ángela González-López, et al.

Published: Jan. 1, 2024

Language: English

Citations: 0

Maximum vertical height during wing flapping of laying hens captured with a depth camera
Tessa Grebey, Valentina Bongiorno, Junjie Han, et al.

bioRxiv (Cold Spring Harbor Laboratory), Journal Year: 2024, Volume and Issue: unknown

Published: Oct. 14, 2024

Cage-free housing systems for laying hens, and their accompanying guidelines, legislation, and audits, are becoming more common around the world. Regulations often specify requirements for floor space and cage height, but the availability of three-dimensional space can vary depending on system configurations. Little research has looked at how much vertical space a hen occupies while flapping her wings, which is arguably her most space-intensive behavior. Therefore, the objective of this study was to use a depth-sensing camera to measure the maximum height hens reach when wing flapping without physical obstructions. Twenty-eight individually caged Hy-Line W36 hens at 45 weeks of age were evaluated. A ceiling-mounted depth camera centered above the test pen was calibrated prior to collecting data. During testing, one hen at a time was placed in the pen and recorded while flapping her wings. From the footage, the minimum distance between the camera and the hen's pixels was obtained for each frame, and from this we computed the maximum height reached by the hen. Results showed that the maximum vertical height used during a wing-flapping event was 51.0 ± 4.7 cm. No measures correlated with the maximum height obtained from the footage (P > 0.05). The hens were of a single strain, old enough to have keel damage, and cage-reared and cage-housed, preventing us from generalizing the results too far. However, depth cameras provide a useful approach for assessing how much vertical space hens of varying strains, ages, and rearing/housing methods need to perform dynamic behaviors.
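The height computation described above (calibrated ceiling-mounted depth camera, per-frame minimum camera-to-bird distance) can be written compactly. The sketch below is a hypothetical reconstruction in NumPy, not the authors' code; the camera-to-floor calibration distance and the foreground margin are placeholder values.

```python
# Hypothetical reconstruction of the per-frame height computation:
# max height = calibrated camera-to-floor distance minus the minimum
# camera-to-bird distance observed in any frame. Not the authors' code.
import numpy as np

CAMERA_TO_FLOOR_CM = 250.0     # assumed calibration value for the mounted depth camera

def max_flap_height(depth_frames_cm, floor_margin_cm=5.0):
    """depth_frames_cm: iterable of 2-D arrays of distances (cm) from the camera.

    Pixels closer to the camera than the floor (minus a small margin) are treated
    as the hen; the closest such pixel across all frames gives the peak height.
    """
    min_dist = CAMERA_TO_FLOOR_CM
    for frame in depth_frames_cm:
        foreground = frame[frame < CAMERA_TO_FLOOR_CM - floor_margin_cm]
        if foreground.size:
            min_dist = min(min_dist, float(foreground.min()))
    return CAMERA_TO_FLOOR_CM - min_dist   # height above the floor in cm

# Example with synthetic frames: a 'hen' pixel 199 cm from the camera -> 51 cm height
frames = [np.full((240, 320), 250.0), np.full((240, 320), 250.0)]
frames[1][120, 160] = 199.0
print(max_flap_height(frames))            # 51.0
```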

Language: English

Citations: 0

Tracking puppy development: automated analysis and qualitative behavioral assessment in repeated open field tests
Mustafa KOÇKAYA, Sevim Isparta, Patrick R. Reinhardt, et al.

Veterinary Research Communications, Journal Year: 2024, Volume and Issue: 49(1)

Published: Nov. 25, 2024

Language: English

Citations: 0

Voice Analysis in Dogs with Deep Learning: Development of a Fully Automatic Voice Analysis System for Bioacoustics Studies
Mahmut Karaaslan, Bahaeddin Türkoğlu, Ersin Kaya, et al.

Sensors, Journal Year: 2024, Volume and Issue: 24(24), P. 7978 - 7978

Published: Dec. 13, 2024

Extracting behavioral information from animal sounds has long been a focus of research in bioacoustics, as sound-derived data are crucial for understanding behavior and environmental interactions. Traditional methods, which involve manual review of extensive recordings, pose significant challenges. This study proposes an automated system for detecting and classifying vocalizations, enhancing the efficiency of analysis. The system uses a preprocessing step to segment relevant sound regions in the audio, followed by feature extraction using the Short-Time Fourier Transform (STFT), Mel-frequency cepstral coefficients (MFCCs), and linear-frequency cepstral coefficients (LFCCs). These features are input into convolutional neural network (CNN) classifiers to evaluate performance. Experimental results demonstrate the effectiveness of different CNN models, with AlexNet, DenseNet, EfficientNet, ResNet50, and ResNet152 being evaluated. The system achieves high accuracy in classifying vocal behaviors, such as the barking and howling of dogs, providing a robust tool for bioacoustics. The work highlights the importance of automated systems in bioacoustics and suggests future improvements to deep learning-based methods for enhanced classification performance.
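As a rough illustration of the front end described in the abstract, the sketch below extracts MFCC features with librosa and passes them to a small CNN in PyTorch. The file name, class labels, and network size are placeholders, and the toy network merely stands in for the AlexNet/ResNet-style models evaluated in the paper.

```python
# Sketch of an MFCC front end feeding a CNN classifier; file name,
# class count and network size are assumptions, not the paper's configuration.
import librosa
import torch
import torch.nn as nn

y, sr = librosa.load("dog_clip.wav", sr=16000)          # hypothetical audio file
mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=40)      # 40 x T coefficient matrix
x = torch.tensor(mfcc, dtype=torch.float32)[None, None] # shape: (batch, channel, 40, T)

cnn = nn.Sequential(                                    # toy stand-in for AlexNet/ResNet etc.
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(32, 3),                                   # e.g. bark / howl / other (assumed)
)
logits = cnn(x)
print(logits.softmax(dim=1))                            # class probabilities for the clip
```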

Language: English

Citations: 0

Tag 'n' Track: Tackling the validation challenge in animal behaviour studies through automated referencing with ArUco markers
Serge Alindekon, J. A. Deutsch, T.B. Rodenburg, et al.

Computers and Electronics in Agriculture, Journal Year: 2024, Volume and Issue: 229, P. 109812 - 109812

Published: Dec. 31, 2024

Language: English

Citations: 0

Experiment, observation, and modeling in the lab and field
Ken Yasukawa

Elsevier eBooks, Journal Year: 2024, Volume and Issue: unknown

Published: Jan. 1, 2024

Language: English

Citations: 0