An inexpensive video monitoring system for automated recording of behavior and ecological interactions
Thomas Tscheulin, Charles I. Abramson

International Journal of Comparative Psychology, Journal Year: 2023, Volume and Issue: 36(1)

Published: Sept. 25, 2023

Active, real-time observation of behavior is a time-consuming task that is heavily resource-limited. At the same time, simultaneous observation of several individuals is often paramount to increase statistical rigor and eliminate potential temporal or environmental bias, especially in natural settings. This paper describes a low-cost video recording system created using "off-the-shelf" components. The system is easy to use and can automatically record a wide variety of behaviors, related ecological interactions, and evolutionary processes. It is sensitive enough to capture a broad range of animals, from planarians and small insects to humans. It can also be used to take measurements of plants. It will work during daylight hours and at night and can run continuously and autonomously for 48 hours, or longer if motion-triggered capture, bigger-capacity batteries, and larger data storage facilities are used.
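The motion-triggered capture mentioned in the abstract can be approximated with simple frame differencing. A minimal sketch, assuming grayscale frames as NumPy arrays; the `pixel_thresh` and `area_frac` values are illustrative assumptions, not parameters from the paper:

```python
import numpy as np

def motion_detected(prev_frame, frame, pixel_thresh=25, area_frac=0.01):
    """Return True if enough pixels changed between consecutive grayscale frames.

    pixel_thresh: per-pixel intensity change (0-255) counted as motion (assumed value).
    area_frac: fraction of the image that must change to trigger recording (assumed value).
    """
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    changed_fraction = (diff > pixel_thresh).mean()
    return changed_fraction > area_frac

# Static scene: no trigger.
prev = np.zeros((120, 160), dtype=np.uint8)
print(motion_detected(prev, prev))   # False

# An animal-sized bright patch enters the frame: trigger.
frame = prev.copy()
frame[40:80, 60:120] = 200
print(motion_detected(prev, frame))  # True
```

In a real deployment this check would gate the recorder: frames are buffered continuously, but only written to storage while the trigger fires, which is what extends battery and storage life.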

Language: English

Automated Prediction of Spawning Nights Using Machine Learning Analysis of Flatfish Behaviour
Abdul Qadir, Neil Duncan, Wendy Ángela González-López

et al.

Published: Jan. 1, 2025

Senegalese sole (Solea senegalensis) broodstock exhibit distinct behaviours (Rest the Head, Guardian, Follow, and Locomotor activities) that are important for breeding success. Understanding and monitoring these behaviours is essential to understand the successful reproduction of sole. However, manually analysing them represents a significant challenge for human observers and is a labour-intensive process. To address these limitations, this study introduces a custom-designed framework based on computer vision and machine learning techniques. The model integrates object detection and tracking mechanisms to recognize and monitor reproductive behaviours within aquaculture environments. By combining convolutional neural networks (CNNs) with advanced tracking algorithms, the framework effectively extracts and analyses behavioural patterns from video datasets. The results were also compared with manual observations to identify key behaviours. The automated approach demonstrated strong performance, with accuracy, precision, and specificity exceeding 87% and a Pearson correlation of R = 0.99 between manual observation data and automated data. The framework analysed videos accurately with minimal human intervention, thereby saving a substantial number of hours and opening up the possibility of analysing behaviour over longer periods, generating more data. For the first time, we describe over the entire night how specific behaviours increased or decreased in relation to spawning. These behavioural changes around spawning made it possible to correlate spawning and non-spawning nights and to provide a predictive tool that identifies spawning nights with accuracies ranging from 70% to 100%. In conclusion, this work advances the capability of machine learning techniques to predict spawning events and facilitates informed decisions, enhancing breeding success and promoting sustainable aquaculture practices.
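The performance figures reported above follow standard definitions. A minimal sketch of how such metrics could be computed, assuming binary per-frame behaviour labels; the per-night counts below are hypothetical illustrations, not the study's data:

```python
import numpy as np

def classification_metrics(y_true, y_pred):
    """Accuracy, precision, and specificity for binary behaviour labels
    (1 = behaviour present in a frame, 0 = absent)."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))  # true positives
    tn = np.sum((y_true == 0) & (y_pred == 0))  # true negatives
    fp = np.sum((y_true == 0) & (y_pred == 1))  # false positives
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp)
    specificity = tn / (tn + fp)
    return accuracy, precision, specificity

# Hypothetical per-night behaviour counts: human observer vs. the model.
manual    = np.array([12, 30, 7, 45, 22], dtype=float)
automated = np.array([13, 29, 8, 44, 23], dtype=float)
r = np.corrcoef(manual, automated)[0, 1]  # Pearson correlation
print(round(r, 3))
```

Agreement between manual and automated counts is what the study's R = 0.99 summarizes; the confusion-matrix metrics capture frame-level detection quality.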

Language: English

Citations: 0

Weight prediction method for individual live chickens based on single-view point cloud information
Haikun Zheng, Chuang Ma, Dong Liu

et al.

Computers and Electronics in Agriculture, Journal Year: 2025, Volume and Issue: 234, P. 110232 - 110232

Published: March 8, 2025

Language: English

Citations: 0

Review: Genomic selection in the era of phenotyping based on digital images
A. H. M. Muntasir Billah, Matias Bermann, Mary Kate Hollifield

et al.

animal, Journal Year: 2025, Volume and Issue: unknown, P. 101486 - 101486

Published: March 1, 2025

Language: English

Citations: 0

Maximum vertical height during wing flapping of laying hens captured with a depth camera
Tessa Grebey, Valentina Bongiorno, Junjie Han

et al.

PLoS ONE, Journal Year: 2025, Volume and Issue: 20(3), P. e0312656 - e0312656

Published: March 27, 2025

Cage-free housing systems for laying hens, and their accompanying guidelines, legislation, and audits, are becoming more common around the world. Regulations often specify requirements for floor space and cage height, but the availability of three-dimensional space can vary depending on system configurations. Little research has looked at how much vertical space a hen occupies while flapping her wings, which is arguably her most space-intensive behavior. Therefore, the objective of this study was to use a depth-sensing camera to measure the maximum height hens reach when wing flapping without physical obstructions. Twenty-eight individually caged Hy-Line W36 hens at 45 weeks of age were evaluated. A ceiling-mounted camera centered above the test pen was calibrated prior to collecting data. During testing, one hen at a time was placed in the pen and recorded while flapping her wings. From the footage, the minimum distance between the camera and the pixels of the hen was obtained for each frame, from which we computed the maximum height reached by the hen. Results showed that the maximum height used during a wing-flapping event was 51.0 ± 4.7 cm. No measures taken from the hens correlated with maximum height (P > 0.05). Hens were of a single strain, old enough to have keel damage, and cage-reared and housed, which prevents us from generalizing the results too far. However, depth cameras provide a useful approach for measuring hens of varying strains, ages, and rearing/housing methods as they perform dynamic behaviors.
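The per-frame computation described above, the minimum camera-to-hen distance subtracted from the calibrated camera height, can be sketched as follows. The camera height, frame sizes, and depth values here are illustrative assumptions, not the study's calibration:

```python
import numpy as np

CAMERA_HEIGHT_CM = 200.0  # calibrated camera-to-floor distance (assumed value)

def max_flap_height(depth_frames, hen_masks):
    """Maximum height (cm) reached by the hen across a wing-flapping event.

    depth_frames: 2-D arrays of distances (cm) from the ceiling camera.
    hen_masks: boolean arrays marking which pixels belong to the hen.
    """
    # The hen's highest point is its closest point to the ceiling camera.
    min_dist = min(frame[mask].min() for frame, mask in zip(depth_frames, hen_masks))
    return CAMERA_HEIGHT_CM - min_dist

# Toy event: the hen's closest point moves from 165 cm to 149 cm from the camera.
floor = np.full((4, 4), 200.0)
f1, f2 = floor.copy(), floor.copy()
f1[1, 1], f2[1, 1] = 165.0, 149.0
mask = np.zeros((4, 4), dtype=bool)
mask[1, 1] = True
print(max_flap_height([f1, f2], [mask, mask]))  # 51.0
```

Segmenting the hen from the background (the mask above) is the step a real pipeline would spend most effort on; the height arithmetic itself is this simple.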

Language: English

Citations: 0

IMTFF-Networks: A deep learning model for cattle behavior classification integrating multimodal time-frequency features
Xi Ni, Hou Zhen-jie, En Lin

et al.

Applied Animal Behaviour Science, Journal Year: 2025, Volume and Issue: unknown, P. 106627 - 106627

Published: April 1, 2025

Language: English

Citations: 0

Promote computer vision applications in pig farming scenarios: high-quality dataset, fundamental models, and comparable performance
Jiangong Li, Xiaodan Hu, Ana Lučić

et al.

Journal of Integrative Agriculture, Journal Year: 2024, Volume and Issue: unknown

Published: Aug. 1, 2024

Language: English

Citations: 3

Robotics for poultry farming: Challenges and opportunities
Uğur Özentürk, Zhengqi Chen, Lorenzo Jamone

et al.

Computers and Electronics in Agriculture, Journal Year: 2024, Volume and Issue: 226, P. 109411 - 109411

Published: Sept. 7, 2024

Language: English

Citations: 3

Mind the Step: An Artificial Intelligence-Based Monitoring Platform for Animal Welfare
Andrea Michielon, Paolo Litta, Francesca Bonelli

et al.

Sensors, Journal Year: 2024, Volume and Issue: 24(24), P. 8042 - 8042

Published: Dec. 17, 2024

We present an artificial intelligence (AI)-enhanced monitoring framework designed to assist personnel in evaluating and maintaining animal welfare using a modular architecture. The framework integrates multiple deep learning models to automatically compute metrics relevant to assessing animal well-being. Using methods for AI-based vision adapted from industrial applications and human behavioral analysis, the framework includes modules for markerless animal identification and health status assessment (e.g., locomotion score and body condition score). Methods for behavioral analysis are also included to evaluate how nutritional and rearing conditions impact behaviors. These models were initially trained on public datasets and then fine-tuned on original data. We demonstrate the approach through two use cases: a dairy cattle monitoring system and a piglet behavior monitoring system. The results indicate that scalable edge computing solutions can support precision livestock farming by automating welfare assessments and enabling timely, data-driven interventions.
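The modular architecture described above can be sketched as a registry of metric modules, each mapping raw observations to one welfare metric. The module names, input fields, and scoring rules below are illustrative placeholders, not the platform's actual API:

```python
from typing import Callable, Dict

# Hypothetical module registry: each entry computes one welfare metric.
MODULES: Dict[str, Callable[[dict], float]] = {}

def register(name: str):
    """Decorator that adds a metric function to the registry under `name`."""
    def wrap(fn):
        MODULES[name] = fn
        return fn
    return wrap

@register("locomotion_score")
def locomotion_score(obs: dict) -> float:
    # Placeholder rule: fewer steps per hour -> worse (higher) score.
    return 1.0 if obs["steps_per_hour"] > 50 else 3.0

@register("body_condition_score")
def body_condition_score(obs: dict) -> float:
    # Placeholder rule on an assumed 1-5 scale.
    return 3.0 + (obs["back_fat_mm"] - 15) / 10

def assess(obs: dict) -> Dict[str, float]:
    """Run every registered module on one animal's observations."""
    return {name: fn(obs) for name, fn in MODULES.items()}

print(assess({"steps_per_hour": 80, "back_fat_mm": 17}))
```

The point of the registry is that new assessment modules (e.g., a behavioral-analysis model) plug in without touching the `assess` loop, which is what makes the design scalable to new use cases and edge deployments.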

Language: English

Citations: 3

Estimating genetic parameters of digital behavior traits and their relationship with production traits in purebred pigs
Mary Kate Hollifield, Ching-Yi Chen, Eric Psota

et al.

Genetics Selection Evolution, Journal Year: 2024, Volume and Issue: 56(1)

Published: April 16, 2024

Abstract

Background: With the introduction of digital phenotyping and high-throughput data, traits that were previously difficult or impossible to measure directly have become easily accessible, offering the opportunity to enhance the efficiency and rate of genetic gain in animal production. It is of interest to assess how behavioral traits are indirectly related to production traits during the performance testing period. The aim of this study was to assess the quality of behavior data extracted from day-wise video recordings, estimate genetic parameters for behavior traits, and estimate their phenotypic correlations with production traits in pigs. Behavior was recorded for 70 days, starting on-test at about 10 weeks of age and ending at off-test, for 2008 female purebred pigs, totaling 119,812 records. Behavior traits included time spent eating, drinking, laterally lying, sternally lying, sitting, and standing, and meters of distance traveled. A quality control procedure was created that included algorithm training adjustment, standardizing recording hours, removing culled animals, and filtering unrealistic records.

Results: Production traits included average daily gain (ADG), back fat thickness (BF), and loin depth (LD). Single-trait linear models were used to estimate heritabilities, and two-trait models were used to estimate phenotypic correlations between behavior and production traits. The results indicated that all behavior traits were heritable, with heritability estimates ranging from 0.19 to 0.57, and showed low-to-moderate phenotypic correlations with production traits. Two-trait models were also used to compare behavior traits at different time intervals. To analyze redundancies in the behavior data throughout the recording period, averages over various intervals were compared. Overall, the 55- to 68-day interval had the strongest correlation with the full period.

Conclusions: Digital phenotyping is a new low-cost method to record behavioral phenotypes, but thorough data cleaning procedures are needed. Evaluating behavior traits offers deeper insight into behavioral changes throughout growth periods and their relationship with production traits, which may be recorded on a less frequent basis.
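The quantities estimated above rest on standard formulas: narrow-sense heritability is the additive genetic variance over the total phenotypic variance, and phenotypic correlation is the Pearson correlation between paired trait records. A minimal sketch with hypothetical variance components and records (not the paper's estimates):

```python
import numpy as np

def heritability(var_additive: float, var_residual: float) -> float:
    """Narrow-sense heritability: h^2 = sigma_a^2 / (sigma_a^2 + sigma_e^2)."""
    return var_additive / (var_additive + var_residual)

# Hypothetical variance components for "time spent eating" (illustrative only).
print(round(heritability(var_additive=2.0, var_residual=8.0), 2))  # 0.2

# Phenotypic correlation between a behavior trait and ADG from paired records
# (values are made up for illustration).
eating_min = np.array([210.0, 180.0, 240.0, 200.0, 190.0])  # minutes eating/day
adg_g      = np.array([950.0, 900.0, 980.0, 960.0, 905.0])  # average daily gain, g
print(round(np.corrcoef(eating_min, adg_g)[0, 1], 2))  # 0.89
```

In the study these components come from single- and two-trait linear mixed models fitted with pedigree information, which separate additive genetic from residual variance rather than using raw sample correlations as here.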

Language: English

Citations: 2

Assessment of pig welfare at slaughterhouse level: A systematic review of animal-based indicators suitable for inclusion in monitoring protocols
Nancy F Huanca-Marca, Laura X. Estévez-Moreno, Natyieli Losada Espinosa

et al.

Meat Science, Journal Year: 2024, Volume and Issue: 220, P. 109689 - 109689

Published: Oct. 19, 2024

Language: English

Citations: 1