An inexpensive low-cost video monitoring system for automated recording of behavior and ecological interactions
Thomas Tscheulin, Charles I. Abramson

International Journal of Comparative Psychology, Journal year: 2023, Issue: 36(1)

Published: Sep. 25, 2023

Active, real-time observation of behavior is a time-consuming task that is heavily resource-limited. At the same time, simultaneous observation of several individuals is often paramount to increase statistical rigor and eliminate potential temporal or environmental bias, especially in natural settings. This paper describes a low-cost video recording system built from "off-the-shelf" components. The system is easy to use and can automatically record a wide variety of behaviors, related ecological interactions, and evolutionary processes. It is sensitive enough for a broad range of animals, from planarians and small insects to humans, and it can also be used to measure plants. It works during daylight hours and at night and runs continuously and autonomously for 48 hours, or longer if motion-triggered capture, bigger-capacity batteries, and larger data storage facilities are used.
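
The paper's capture software is not reproduced here, but the motion-triggered recording it mentions can be sketched in a few lines; the following Python/OpenCV example is a minimal illustration only, and the camera index, thresholds, and output filename are assumptions rather than values from the paper.

```python
# Minimal motion-triggered recording sketch (illustrative only; not the
# authors' published code). Frames are compared against the previous frame
# and written to disk only while motion exceeds a pixel-difference threshold.
import cv2

MOTION_THRESHOLD = 5000   # assumed number of changed pixels that counts as "motion"
CAMERA_INDEX = 0          # first attached USB camera

cap = cv2.VideoCapture(CAMERA_INDEX)
fourcc = cv2.VideoWriter_fourcc(*"mp4v")
writer = None
prev_gray = None

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (21, 21), 0)
    if prev_gray is not None:
        diff = cv2.absdiff(prev_gray, gray)
        changed = cv2.countNonZero(cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)[1])
        if changed > MOTION_THRESHOLD:
            if writer is None:  # motion started: open a new clip
                h, w = frame.shape[:2]
                writer = cv2.VideoWriter("motion_clip.mp4", fourcc, 20.0, (w, h))
            writer.write(frame)
        elif writer is not None:  # motion stopped: close the clip
            writer.release()
            writer = None
    prev_gray = gray

cap.release()
if writer is not None:
    writer.release()
```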

Language: English

Automated Prediction of Spawning Nights Using Machine Learning Analysis of Flatfish Behaviour
Abdul Qadir, Neil Duncan, Wendy Ángela González-López

et al.

Published: Jan. 1, 2025

Senegalese sole (Solea senegalensis) broodstock exhibit distinct behaviours (Rest the Head, Guardian, Follow, and Locomotor activities) that are important for breeding success. Understanding and monitoring these behaviours is essential to understanding the successful reproduction of sole. However, manually analysing them represents a significant challenge for human observers and is a labour-intensive process. Therefore, to address these limitations, this study introduces a custom-designed framework based on computer vision and machine learning techniques. The model integrates object detection and tracking mechanisms to recognize and monitor reproductive behaviours within aquaculture environments. By combining convolutional neural networks (CNNs) with advanced tracking algorithms, our framework effectively extracts and analyses behavioural patterns from video datasets. The results were also compared with manual observations to identify key behaviours. The automated framework demonstrated strong performance, with accuracy, precision, and specificity exceeding 87% and a Pearson correlation of R = 0.99 between manual observation data and automated data. The framework analysed videos accurately with minimal human intervention, thereby saving a substantial number of observation hours and opening up the possibility of analysing behaviour over longer periods and generating more data. For the first time, we observed during the entire night how specific behaviours increased or decreased in relation to spawning. These changes in behaviour around spawning enabled us to correlate spawning and non-spawning nights and to provide a predictive tool that identifies spawning nights with accuracies ranging from 70% to 100%. In conclusion, this study advances the capability of machine learning techniques to predict spawning events and facilitates informed decisions for enhancing breeding success and promoting sustainable aquaculture practices.
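
As a rough illustration of how automated detections can be benchmarked against manual observation as described above, the following sketch computes accuracy, precision, specificity, and the Pearson correlation; the per-frame labels and per-night counts are hypothetical and are not the study's data or code.

```python
# Illustrative sketch (not the study's code) of comparing automated behaviour
# detections against manual annotations: confusion-matrix metrics plus the
# Pearson correlation between per-night behaviour counts.
import numpy as np
from scipy.stats import pearsonr

def agreement_metrics(auto_labels, manual_labels):
    """Accuracy, precision, and specificity for binary per-frame labels."""
    auto = np.asarray(auto_labels, dtype=bool)
    manual = np.asarray(manual_labels, dtype=bool)
    tp = np.sum(auto & manual)
    tn = np.sum(~auto & ~manual)
    fp = np.sum(auto & ~manual)
    fn = np.sum(~auto & manual)
    return {
        "accuracy": (tp + tn) / (tp + tn + fp + fn),
        "precision": tp / (tp + fp),
        "specificity": tn / (tn + fp),
    }

# Hypothetical per-night counts of one behaviour from both observers.
auto_counts = [12, 30, 7, 44, 19]
manual_counts = [11, 31, 8, 45, 18]
r, p = pearsonr(auto_counts, manual_counts)
print(agreement_metrics([1, 0, 1, 1, 0], [1, 0, 0, 1, 0]), f"Pearson r = {r:.2f}")
```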

Language: English

Cited by

0

Weight prediction method for individual live chickens based on single-view point cloud information
Haikun Zheng, Chuang Ma, Dong Liu

et al.

Computers and Electronics in Agriculture, Journal year: 2025, Issue: 234, P. 110232 - 110232

Published: March 8, 2025

Language: English

Cited by

0

Review: Genomic selection in the era of phenotyping based on digital images
A. H. M. Muntasir Billah, Matias Bermann, Mary Kate Hollifield

et al.

animal, Journal year: 2025, Issue: unknown, P. 101486 - 101486

Published: March 1, 2025

Language: English

Cited by

0

Maximum vertical height during wing flapping of laying hens captured with a depth camera

Tessa Grebey, Valentina Bongiorno, Junjie Han

et al.

PLoS ONE, Journal year: 2025, Issue: 20(3), P. e0312656 - e0312656

Published: March 27, 2025

Cage-free housing systems for laying hens, and their accompanying guidelines, legislation, and audits, are becoming more common around the world. These regulations often specify requirements for floor space and cage height, but the availability of three-dimensional space can vary depending on system configurations. Little research has looked at how much vertical space a hen occupies while flapping her wings, which is arguably her most space-intensive behavior. Therefore, the objective of this study was to use a depth-sensing camera to measure the maximum height hens reach when flapping their wings without physical obstructions. Twenty-eight individually caged Hy-Line W36 hens of 45 weeks of age were evaluated. A ceiling-mounted depth camera centered above the test pen was calibrated prior to collecting data. During testing, one hen at a time was placed in the pen and recorded while flapping her wings. From the footage, the minimum distance between the camera and the hen's pixels was obtained for each frame, and from this we computed the maximum height reached by each hen. Results showed that the maximum vertical height used during a wing-flapping event was 51.0 ± 4.7 cm. None of the measures taken from the hens correlated with maximum flapping height (P > 0.05). The hens were of a single strain, old enough to have keel damage, and cage-reared and cage-housed, preventing us from generalizing these results too far. However, depth cameras provide a useful approach to measuring how much space hens of varying strains, ages, and rearing/housing methods need to perform dynamic behaviors.
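
The height computation described above (calibrated camera-to-floor distance minus the smallest per-frame camera-to-bird distance) can be illustrated with a short sketch; the calibration value, array shapes, and function name below are assumptions, not values taken from the paper.

```python
# Minimal sketch of the depth-camera height computation: for each frame, find
# the smallest camera-to-bird distance; the event's maximum height is the
# calibrated camera-to-floor distance minus the smallest of those distances.
import numpy as np

CAMERA_TO_FLOOR_CM = 200.0  # assumed calibration value, not from the paper

def max_flap_height(depth_frames_cm, floor_margin_cm=2.0):
    """depth_frames_cm: iterable of 2-D arrays of distances (cm) from the camera."""
    min_distance = np.inf
    for frame in depth_frames_cm:
        # Ignore pixels at or near the floor so only the bird is considered.
        bird_pixels = frame[frame < CAMERA_TO_FLOOR_CM - floor_margin_cm]
        if bird_pixels.size:
            min_distance = min(min_distance, bird_pixels.min())
    return CAMERA_TO_FLOOR_CM - min_distance

# Hypothetical event: two frames in which the closest bird pixel is 149 cm away.
frames = [np.full((4, 4), 195.0), np.array([[195.0, 149.0], [170.0, 195.0]])]
print(f"max height reached: {max_flap_height(frames):.1f} cm")  # -> 51.0 cm
```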

Language: English

Cited by

0

IMTFF-Networks: A deep learning model for cattle behavior classification integrating multimodal time-frequency features
Xi Ni, Hou Zhen-jie, En Lin

et al.

Applied Animal Behaviour Science, Journal year: 2025, Issue: unknown, P. 106627 - 106627

Published: April 1, 2025

Language: English

Cited by

0

Promote computer vision applications in pig farming scenarios: high-quality dataset, fundamental models, and comparable performance
Jiangong Li, Xiaodan Hu, Ana Lučić

et al.

Journal of Integrative Agriculture, Journal year: 2024, Issue: unknown

Published: Aug. 1, 2024

Language: English

Cited by

3

Robotics for poultry farming: Challenges and opportunities
Uğur Özentürk, Zhengqi Chen, Lorenzo Jamone

et al.

Computers and Electronics in Agriculture, Journal year: 2024, Issue: 226, P. 109411 - 109411

Published: Sep. 7, 2024

Language: English

Cited by

3

Mind the Step: An Artificial Intelligence-Based Monitoring Platform for Animal Welfare

Andrea Michielon, Paolo Litta, Francesca Bonelli

et al.

Sensors, Journal year: 2024, Issue: 24(24), P. 8042 - 8042

Published: Dec. 17, 2024

We present an artificial intelligence (AI)-enhanced monitoring framework designed to assist personnel in evaluating and maintaining animal welfare using a modular architecture. The framework integrates multiple deep learning models to automatically compute metrics relevant to assessing animal well-being. Using methods for AI-based vision adapted from industrial applications and human behavioral analysis, the framework includes modules for markerless animal identification and health status assessment (e.g., locomotion score and body condition score). Methods for behavioral analysis are also included to evaluate how nutritional and rearing conditions impact behaviors. These models were initially trained on public datasets and then fine-tuned on original data. We demonstrate the approach through two use cases: a dairy cattle monitoring system and a piglet behavior monitoring system. The results indicate that scalable edge computing solutions can support precision livestock farming by automating welfare assessments and enabling timely, data-driven interventions.
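
As a conceptual illustration only, the modular design described above might be composed along the following lines; the class, module names, and placeholder scores are hypothetical and are not the authors' implementation.

```python
# Conceptual sketch of a modular welfare-monitoring pipeline: each module
# consumes a video frame and returns a named metric, and the platform runs
# every registered module per frame. Real modules would wrap neural networks.
from typing import Any, Callable, Dict

class WelfarePlatform:
    def __init__(self) -> None:
        self.modules: Dict[str, Callable[[Any], float]] = {}

    def register(self, name: str, module: Callable[[Any], float]) -> None:
        """Attach a scoring module (mocked here as a plain function)."""
        self.modules[name] = module

    def assess(self, frame: Any) -> Dict[str, float]:
        """Compute every registered welfare metric for one video frame."""
        return {name: module(frame) for name, module in self.modules.items()}

# Placeholder "models" standing in for fine-tuned deep learning modules.
platform = WelfarePlatform()
platform.register("locomotion_score", lambda frame: 1.0)
platform.register("body_condition_score", lambda frame: 3.5)
print(platform.assess(frame="dummy-frame"))
```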

Language: English

Cited by

3

Estimating genetic parameters of digital behavior traits and their relationship with production traits in purebred pigs
Mary Kate Hollifield, Ching-Yi Chen, Eric Psota

et al.

Genetics Selection Evolution, Journal year: 2024, Issue: 56(1)

Published: April 16, 2024

Background: With the introduction of digital phenotyping and high-throughput data, traits that were previously difficult or impossible to measure directly have become easily accessible, offering the opportunity to enhance the efficiency and rate of genetic gain in animal production. It is of interest to assess how behavioral traits are indirectly related to production traits during the performance testing period. The aim of this study was to assess the quality of behavior data extracted from day-wise video recordings and to estimate the genetic parameters of behavior traits and their phenotypic correlations with production traits in pigs. Behavior was recorded for 70 days, starting after on-test at about 10 weeks of age and ending at off-test, for 2008 female purebred pigs, totaling 119,812 records. Behavior traits included time spent eating, drinking, laterally lying, sternally lying, sitting, standing, and meters of distance traveled. A quality control procedure was created with an algorithm for training adjustment, standardizing recording hours, removing culled animals, and filtering unrealistic records. Results: Production traits included average daily gain (ADG), back fat thickness (BF), and loin depth (LD). Single-trait linear models were used to estimate heritabilities and two-trait models to estimate phenotypic correlations between behavior and production traits. The results indicated that all behavior traits are heritable, with heritability estimates ranging from 0.19 to 0.57, and showed low-to-moderate phenotypic correlations with production traits. Two-trait models were also used to compare behavior traits at different time intervals. To analyze redundancies in the recording period, averages of various time intervals were compared. Overall, the 55- to 68-day interval had the strongest correlation with production traits. Conclusions: Digital phenotyping is a new low-cost method to record behavior phenotypes, but thorough data cleaning procedures are needed. Evaluating behavior traits offers a deeper insight into changes in behavior throughout the growth periods and their relationship with production traits, which may be recorded on a less frequent basis.
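
The kind of quality-control filtering the abstract describes (standardizing recording hours, removing culled animals, filtering unrealistic records) could look roughly like the following pandas sketch; the column names and thresholds are assumptions, not the study's actual pipeline.

```python
# Illustrative data-cleaning sketch for day-wise digital behavior records.
import pandas as pd

def clean_behavior_records(df: pd.DataFrame,
                           culled_ids: set,
                           required_hours: float = 24.0,
                           max_eating_min: float = 600.0) -> pd.DataFrame:
    """df columns assumed: animal_id, recording_hours, eating_min, distance_m."""
    out = df[~df["animal_id"].isin(culled_ids)]               # remove culled animals
    out = out[out["recording_hours"] >= required_hours]       # standardize recording hours
    out = out[out["eating_min"].between(0, max_eating_min)]   # drop unrealistic eating times
    out = out[out["distance_m"] >= 0]                         # drop impossible distances
    return out.reset_index(drop=True)

# Hypothetical day-wise records for three animals.
records = pd.DataFrame({
    "animal_id": [1, 2, 3, 3],
    "recording_hours": [24, 20, 24, 24],
    "eating_min": [210, 180, 9999, 150],
    "distance_m": [820, 640, 700, -5],
})
print(clean_behavior_records(records, culled_ids={2}))
```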

Language: English

Cited by

2

Assessment of pig welfare at slaughterhouse level: A systematic review of animal-based indicators suitable for inclusion in monitoring protocols

Nancy F Huanca-Marca, Laura X. Estévez-Moreno, Natyieli Losada Espinosa

et al.

Meat Science, Journal year: 2024, Issue: 220, P. 109689 - 109689

Published: Oct. 19, 2024

Language: English

Cited by

1