Monitoring poultry social dynamics using colored tags: avian visual perception, behavioral effects, and artificial intelligence precision

Florencia Rossi, Nicola De Rossi, Gabriel Orso et al.

Poultry Science, Journal Year: 2024, Volume and Issue: 104(1), P. 104464 - 104464

Published: Nov. 5, 2024

Artificial intelligence (AI) in animal behavior and welfare research is on the rise. AI can detect behaviors and localize animals in video recordings, making it a valuable tool for studying social dynamics. However, maintaining the identity of individuals over time, especially in homogeneous poultry flocks, remains challenging for algorithms. We propose using differentially colored "backpack" tags (black, gray, white, orange, red, purple, green) detectable with computer vision (e.g., YOLO) from top-view recordings of pens. These tags can also accommodate sensors, such as accelerometers. In separate experiments, we aimed to: (i) evaluate avian visual perception of the different tags; (ii) assess the potential impact of tag colors on behavior; and (iii) test the ability of a YOLO model to accurately distinguish between Japanese quail in group settings. First, reflectance spectra of the tags and feathers were measured, and a visual model was applied to calculate quantum catches for each spectrum. Green and purple tags showed significant chromatic contrast against feathers, and most tags elicited greater luminance receptor stimulation than feathers. Birds wearing green tags pecked significantly more at their own tags than those wearing black (control) tags. Additionally, fewer aggressive interactions were observed in groups with orange tags compared to the other colors, except red. Next, heterogeneous groups of 5 birds, each wearing a different tag color, were video-recorded for 1 h. The precision and accuracy of the YOLO model were assessed, yielding values of 95.9% and 97.3%, respectively, with most errors stemming from misclassifications of gray tags. Lastly, from the model output, we estimated each bird's average distance, locomotion speed, and percentage of time spent moving; no behavioral differences associated with tag color were detected. In conclusion, carefully selected backpack tags can be identified by AI models, making them powerful tools for social dynamics studies.
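A minimal sketch of the final analysis step described above: turning per-frame detections of the colored tags into distance, speed, and time-spent-moving estimates. This is not the authors' code; it assumes the YOLO output has already been reduced to per-frame tag centroids (frame index, tag color, x, y in pixels), and the frame rate, pixel-to-centimetre calibration, and movement threshold below are placeholder values.

    # Illustrative sketch only: per-bird movement metrics from per-frame
    # detections of colored backpack tags, given as (frame, tag_color, x_px, y_px).
    import math
    from collections import defaultdict

    FPS = 25                 # assumed video frame rate
    PX_PER_CM = 10.0         # assumed pixel-to-centimetre calibration of the pen
    MOVE_THRESH_CM_S = 2.0   # assumed speed above which a bird counts as "moving"

    def movement_metrics(detections):
        """Return distance (cm), mean speed (cm/s), and % time moving per tag color."""
        tracks = defaultdict(list)
        for frame, color, x, y in sorted(detections):
            tracks[color].append((frame, x / PX_PER_CM, y / PX_PER_CM))

        metrics = {}
        for color, pts in tracks.items():
            dist, moving_steps, steps = 0.0, 0, 0
            for (f0, x0, y0), (f1, x1, y1) in zip(pts, pts[1:]):
                dt = (f1 - f0) / FPS
                if dt <= 0:
                    continue
                step = math.hypot(x1 - x0, y1 - y0)
                dist += step
                steps += 1
                if step / dt >= MOVE_THRESH_CM_S:
                    moving_steps += 1
            duration = (pts[-1][0] - pts[0][0]) / FPS if len(pts) > 1 else 0.0
            metrics[color] = {
                "distance_cm": round(dist, 1),
                "mean_speed_cm_s": round(dist / duration, 2) if duration else 0.0,
                "pct_time_moving": round(100.0 * moving_steps / steps, 1) if steps else 0.0,
            }
        return metrics

In practice the frame rate, pixel calibration, and movement threshold would need to be set for the actual camera and pen used in the recordings.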

Language: English

Wearables in Chronomedicine and Interpretation of Circadian Health
Denis Gubin, Dietmar Weinert, Oliver Stefani et al.

Diagnostics, Journal Year: 2025, Volume and Issue: 15(3), P. 327 - 327

Published: Jan. 30, 2025

Wearable devices have gained increasing attention for use in multifunctional applications related to health monitoring, particularly in research on the circadian rhythms of cognitive functions and metabolic processes. In this comprehensive review, we cover how wearables can be used to study circadian rhythms in health and disease. We highlight the importance of these rhythms as markers of well-being and as potential predictors of health outcomes. We focus on wearable technologies in sleep research, sleep medicine, and chronomedicine beyond the sleep domain, and emphasize actigraphy as a validated tool for monitoring sleep, activity, and light exposure. We discuss the various mathematical methods currently used to analyze actigraphic data, such as parametric and non-parametric approaches, as well as linear, non-linear, and neural network-based methods applied to quantify circadian and non-circadian variability. We also introduce novel actigraphy-derived markers that can serve as personalized proxies of health status, assisting in discriminating between health and disease and offering insights into neurobehavioral status. Lifestyle factors such as physical activity and light exposure modulate brain health, and establishing reference standards for these measures will further refine data interpretation and improve clinical outcomes. The review calls for refining existing tools and methods, deepening our understanding of circadian health, and developing new healthcare strategies.
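To make the "non-parametric approaches" above concrete, here is a brief sketch (not taken from the review) of two classical actigraphy indices, interdaily stability (IS) and intradaily variability (IV), computed from equally spaced activity counts; the hourly sampling and whole-day input length are assumptions.

    # Illustrative sketch only: classical non-parametric rest-activity indices.
    import numpy as np

    def nonparametric_rhythm_metrics(activity, samples_per_day=24):
        """IS and IV from equally spaced activity counts spanning whole days."""
        x = np.asarray(activity, dtype=float)
        n = x.size
        mean = x.mean()
        total_var = np.sum((x - mean) ** 2)

        # Interdaily stability: variance of the average 24-h profile relative
        # to the total variance (higher = more regular day-to-day rhythm).
        profile = x.reshape(-1, samples_per_day).mean(axis=0)
        IS = (n * np.sum((profile - mean) ** 2)) / (samples_per_day * total_var)

        # Intradaily variability: hour-to-hour fragmentation relative to the
        # total variance (higher = more fragmented rest-activity pattern).
        IV = (n * np.sum(np.diff(x) ** 2)) / ((n - 1) * total_var)
        return IS, IV

IS ranges from 0 (no day-to-day regularity) to 1 (a perfectly repeated daily profile), while IV approaches 2 for highly fragmented, noise-like activity.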

Language: English

Citations: 2
