Unsupervised Canine Emotion Recognition Using Momentum Contrast

Aarya Bhave,

Alina Hafner,

Anushka Bhave

et al.

Sensors, Journal Year: 2024, Issue 24(22), pp. 7324 - 7324

Published: Nov. 16, 2024

We describe a system for identifying dog emotions based on dogs' facial expressions and body posture. Towards that goal, we built a dataset with 2184 images of ten popular breeds, grouped into seven similarly sized primal mammalian emotion categories defined by neuroscientist and psychobiologist Jaak Panksepp as 'Exploring', 'Sadness', 'Playing', 'Rage', 'Fear', 'Affectionate' and 'Lust'. We modified the contrastive learning framework MoCo (Momentum Contrast for Unsupervised Visual Representation Learning) and trained it on our original dataset, achieving an accuracy of 43.2% against a baseline of 14%. We also trained this model on a second, publicly available dataset, which resulted in an accuracy of 48.46% against a baseline of 25%. Finally, we compared our unsupervised approach to a supervised ResNet50 architecture. This model, when tested with labels, achieved an accuracy of 74.32%.
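The contrastive objective underlying MoCo can be illustrated with a minimal InfoNCE sketch: a query embedding is pulled toward its positive key and pushed away from negatives held in a momentum-updated queue. This is a toy NumPy version under stated assumptions (embedding size, queue size, and temperature here are illustrative), not the authors' code:

```python
import numpy as np

def info_nce_loss(q, k_pos, queue, temperature=0.07):
    """InfoNCE loss as used in MoCo-style contrastive learning.

    q:      (d,)   query embedding (L2-normalized)
    k_pos:  (d,)   positive key embedding (L2-normalized)
    queue:  (K, d) queue of negative keys (L2-normalized)
    """
    l_pos = q @ k_pos                         # similarity with the positive key
    l_neg = queue @ q                         # (K,) similarities with negatives
    logits = np.concatenate(([l_pos], l_neg)) / temperature
    logits -= logits.max()                    # numerical stability
    # cross-entropy with the positive at index 0
    return -np.log(np.exp(logits[0]) / np.exp(logits).sum())

# Toy check: a query perfectly aligned with its positive key yields a low loss.
rng = np.random.default_rng(0)
d, K = 8, 16                                  # illustrative sizes
q = rng.normal(size=d); q /= np.linalg.norm(q)
k_pos = q.copy()                              # perfectly aligned positive
queue = rng.normal(size=(K, d))
queue /= np.linalg.norm(queue, axis=1, keepdims=True)
loss = info_nce_loss(q, k_pos, queue)
```

In full MoCo the key encoder is an exponential-moving-average copy of the query encoder and the queue is refreshed each batch; the loss itself is exactly this softmax cross-entropy over one positive and many queued negatives.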

Language: English

The quest to develop automated systems for monitoring animal behavior
Janice M. Siegford, Juan P. Steibel, Junjie Han

et al.

Applied Animal Behaviour Science, Journal Year: 2023, Issue 265, pp. 106000 - 106000

Published: July 17, 2023

Automated behavior analysis (ABA) strategies are being researched at a rapid rate to detect an array of behaviors across a range of species. There is growing optimism that soon ethologists will not have to manually decode hours (and hours) of animal videos, but can instead have computers process them for us. However, before we assume ABA is ready for practical use, it is important to take a realistic look at exactly what has been developed, the expertise used to develop it, and the context in which these studies occur. Once we understand the common pitfalls occurring during development and identify limitations, we can construct robust tools to achieve automated (ultimately even continuous and real-time) collection of behavioral data, allowing more detailed or longer-term studies on larger numbers of animals than ever before. A model is only as good as the data it was trained on. A key starting point is having annotated data for model training and assessment, yet most model developers are not trained in ethology. Often no formal ethogram is developed, and descriptions of target behaviors in publications are limited or inaccurate. In addition, models are also frequently developed using small datasets that lack sufficient variability in animal morphometrics, activities, camera viewpoints, and environmental features to be generalizable. Thus, a model often needs to be further validated before it performs satisfactorily on different populations, under other conditions, or for other research purposes. Multidisciplinary teams of researchers, including ethologists and ethicists as well as computer scientists and engineers, are needed to help address these problems when applying computer vision to measure behavior. Reference datasets for behavior detection should be generated and shared that include image annotations, baseline analyses, and benchmarking. Also critical are standards for creating such reference datasets and best practices for methods of validating results to ensure they are accurate. At present, only a handful of publicly available datasets and tools exist. As we work to realize the promise of ABA (and subsequent precision livestock farming technologies) for monitoring behavior, a clear understanding of best practices, access to accurately annotated data, and networking among disciplines will increase our chances of success.

Language: English

Cited by

26

Automated Detection of Cat Facial Landmarks
George Martvel, Ilan Shimshoni, Anna Zamansky

et al.

International Journal of Computer Vision, Journal Year: 2024, Issue 132(8), pp. 3103 - 3118

Published: March 5, 2024

Language: English

Cited by

14

Deep learning in multiple animal tracking: A survey
Yeqiang Liu, Weiran Li,

Xue Liu

et al.

Computers and Electronics in Agriculture, Journal Year: 2024, Issue 224, pp. 109161 - 109161

Published: June 25, 2024

Language: English

Cited by

11

Explainable automated pain recognition in cats
Marcelo Feighelstein, Lea Henze, Sebastian Meller

et al.

Scientific Reports, Journal Year: 2023, Issue 13(1)

Published: June 2, 2023

Abstract Manual tools for pain assessment from facial expressions have been suggested and validated for several animal species. However, facial expression analysis performed by humans is prone to subjectivity and bias, and in many cases also requires special expertise and training. This has led to an increasing body of work on automated pain recognition, which has been addressed for several species, including cats. Even for experts, cats are a notoriously challenging species for pain assessment. A previous study compared two approaches to automated 'pain'/'no pain' classification from cat facial images: a deep learning approach and an approach based on manually annotated geometric landmarks, reaching comparable accuracy results. However, that study included a very homogeneous dataset, and thus further research on the generalizability of pain recognition in more realistic settings is required. This study addresses the question of whether AI models can classify 'pain'/'no pain' in cats in a realistic (multi-breed, multi-sex) setting using a heterogeneous and potentially 'noisy' dataset of 84 client-owned cats. Cats were a convenience sample presented to the Department of Small Animal Medicine and Surgery of the University of Veterinary Medicine Hannover and included individuals of different breeds, ages and sex, with varying medical conditions/medical histories. Cats were scored by veterinary experts using the Glasgow composite measure pain scale in combination with the well-documented and comprehensive clinical history of those patients; the scoring was then used for training both approaches. We show that in this context the landmark-based approach performs better, reaching accuracy above 77% in pain detection, as opposed to only 65% reached by the deep learning approach. Furthermore, we investigated the explainability of such machine recognition in terms of identifying facial features that are important to the machine, revealing that the region of the nose and mouth seems more important for machine pain classification, while the region of the ears is less important, with these findings being consistent across the techniques studied here.
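A landmark-based approach of the kind compared here typically converts annotated facial landmark coordinates into scale-invariant geometric features (for instance, normalized pairwise distances) before classification. The following is a minimal sketch of that feature-extraction idea; the landmark coordinates and the specific feature recipe are illustrative, not the paper's exact method:

```python
import numpy as np
from itertools import combinations

def landmark_distance_features(landmarks):
    """Turn (N, 2) facial landmark coordinates into a feature vector of
    pairwise distances, normalized by the largest distance so the result
    is invariant to image scale. A classifier (e.g. a tree ensemble or
    SVM) would then be trained on these vectors."""
    pts = np.asarray(landmarks, dtype=float)
    dists = np.array([np.linalg.norm(pts[i] - pts[j])
                      for i, j in combinations(range(len(pts)), 2)])
    return dists / dists.max()

# Four hypothetical landmarks -> C(4, 2) = 6 pairwise-distance features.
feats = landmark_distance_features([[0, 0], [2, 0], [1, 2], [1, 1]])
```

Distance ratios are a common choice because they survive translation, rotation, and uniform scaling of the face in the image.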

Language: English

Cited by

22

Comparison between AI and human expert performance in acute pain assessment in sheep
Marcelo Feighelstein, Stélio Pacca Loureiro Luna,

Nuno Silva

et al.

Scientific Reports, Journal Year: 2025, Issue 15(1)

Published: Jan. 3, 2025

Language: English

Cited by

1

Explainable automated recognition of emotional states from canine facial expressions: the case of positive anticipation and frustration

Tali Boneh-Shitrit,

Marcelo Feighelstein, Annika Bremhorst

et al.

Scientific Reports, Journal Year: 2022, Issue 12(1)

Published: Dec. 30, 2022

Abstract In animal research, automation of affective states recognition has so far mainly addressed pain in a few species. Emotional states remain uncharted territories, especially in dogs, due to the complexity of their facial morphology and expressions. This study contributes to filling this gap in two aspects. First, it is the first to address dog emotional states using a dataset obtained in a controlled experimental setting, including videos from (n = 29) Labrador Retrievers assumed to be in two experimentally induced emotional states: negative (frustration) and positive (anticipation). The dogs' facial expressions were measured using the Dogs Facial Action Coding System (DogFACS). Two different approaches are compared in relation to our aim: (1) a DogFACS-based approach with a two-step pipeline consisting of (i) a DogFACS variable detector and (ii) a positive/negative state Decision Tree classifier; (2) an approach using deep learning techniques with no intermediate representation. The approaches reach accuracy above 71% and 89%, respectively, with the deep learning approach performing better. Secondly, the study also addresses explainability of AI models in the context of emotion in animals. The DogFACS-based approach provides decision trees, that is, a mathematical representation which reflects previous findings by human experts on certain facial expressions (DogFACS variables) being correlates of specific emotional states. The deep learning approach offers a different, visual form of explainability in the form of heatmaps reflecting the regions of focus of the network's attention, which in some cases show clearly related patterns to the nature of particular DogFACS variables. These heatmaps may hold the key to novel insights on the sensitivity of the network to nuanced pixel patterns reflecting information invisible to the human eye.
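The two-step pipeline described above, a detector producing DogFACS variable activations followed by a Decision Tree over those activations, can be sketched as follows. The variable names follow DogFACS naming conventions, but the activation values and the split rules are invented for illustration and are not the paper's learned tree:

```python
# Step (i): hypothetical detector output for one video clip --
# binary activations of a few DogFACS variables (values made up).
codes = {
    "AU101_inner_brow_raiser": 1,
    "EAD103_ears_flattener": 0,
    "AD19_tongue_show": 1,
}

# Step (ii): a hand-rolled stand-in for the trained Decision Tree
# classifier mapping DogFACS activations to an emotional state.
# The split rules below are illustrative only.
def classify_state(c):
    if c["EAD103_ears_flattener"]:
        return "negative (frustration)"
    if c["AU101_inner_brow_raiser"] and c["AD19_tongue_show"]:
        return "positive (anticipation)"
    return "negative (frustration)"

state = classify_state(codes)
```

The appeal of this intermediate representation is exactly what the abstract notes: each split in the tree reads as a human-interpretable statement about a named facial action, unlike an end-to-end network's pixel-level decision.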

Language: English

Cited by

28

Deep learning for video-based automated pain recognition in rabbits
Marcelo Feighelstein,

Yamit Ehrlich,

Li Naftaly

et al.

Scientific Reports, Journal Year: 2023, Issue 13(1)

Published: Sep. 6, 2023

Abstract Despite the wide range of uses of rabbits ( Oryctolagus cuniculus ) as experimental models for pain, as well as their increasing popularity as pets, pain assessment in rabbits is understudied. This study is the first to address automated detection of acute postoperative pain in rabbits. Using a dataset of video footage of n = 28 rabbits before surgery (no pain) and after surgery (pain), we present an AI model for pain recognition using both the facial area and the body posture, reaching accuracy above 87%. We apply a combination of 1 sec interval sampling with Grayscale Short-Term stacking (GrayST) to incorporate temporal information for video classification at the frame level, and a frame selection technique to better exploit the availability of video data.
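The GrayST idea is to pack three consecutive grayscale frames into the three channels of a single image, so a standard image CNN expecting RGB input sees short-term motion without any architectural change. A minimal NumPy sketch of that stacking step (frame counts and sizes are illustrative):

```python
import numpy as np

def grayst_stack(frames):
    """Grayscale Short-Term stacking: group every three consecutive
    grayscale frames into the channels of one image.

    frames: (T, H, W) grayscale video, T a multiple of 3
    returns: (T // 3, H, W, 3) channel-stacked images
    """
    t, h, w = frames.shape
    assert t % 3 == 0, "need a multiple of 3 frames"
    # (T, H, W) -> (T//3, 3, H, W) -> (T//3, H, W, 3)
    return frames.reshape(t // 3, 3, h, w).transpose(0, 2, 3, 1)

# Six 4x4 grayscale frames -> two stacked "RGB-like" images.
video = np.arange(6 * 4 * 4, dtype=float).reshape(6, 4, 4)
stacked = grayst_stack(video)
```

Each stacked image then goes through an ordinary 2D CNN; temporal change between the three frames shows up as differences between channels.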

Language: English

Cited by

10

Automated recognition of emotional states of horses from facial expressions
Marcelo Feighelstein,

Claire Riccie-Bonot,

Hana Hasan

et al.

PLoS ONE, Journal Year: 2024, Issue 19(7), pp. e0302893 - e0302893

Published: July 15, 2024

Animal affective computing is an emerging new field, which has so far mainly focused on pain, while other emotional states remain uncharted territories, especially in horses. This study is the first to develop AI models that automatically recognize horse emotional states from facial expressions using data collected in a controlled experiment. We explore two types of pipelines: a deep learning one, which takes as input video footage, and a machine learning one, which takes as input EquiFACS annotations. The former outperforms the latter, with 76% accuracy in separating between the four states: baseline, positive anticipation, disappointment and frustration. Anticipation and frustration were difficult to separate, with only 61% accuracy.

Language: English

Cited by

4

From facial expressions to algorithms: a narrative review of animal pain recognition technologies
Ludovica Chiavaccini, Anjali Gupta,

Guido Chiavaccini

et al.

Frontiers in Veterinary Science, Journal Year: 2024, Issue 11

Published: July 17, 2024

Facial expressions are essential for communication and emotional expression across species. Despite the improvements brought by tools like the Horse Grimace Scale (HGS) in pain recognition in horses, their reliance on human identification of characteristic traits presents drawbacks such as subjectivity, training requirements, costs, and potential bias. Despite these challenges, the development of facial expression pain scales for animals has been making strides. To address these limitations, Automated Pain Recognition (APR) powered by Artificial Intelligence (AI) offers a promising advancement. Notably, computer vision and machine learning have revolutionized our approach to identifying and addressing pain in non-verbal patients, including animals, with profound implications for both veterinary medicine and animal welfare. By leveraging the capabilities of AI algorithms, we can construct sophisticated models capable of analyzing diverse data inputs, encompassing not only facial expressions but also body language, vocalizations, and physiological signals, to provide precise and objective evaluations of an animal's pain levels. While the advancement of APR holds great promise for improving animal welfare by enabling better pain management, it brings forth the need to overcome challenges, ensure ethical practices, and develop robust ground truth measures. This narrative review aimed to provide a comprehensive overview, tracing the journey from the initial application of facial expression recognition in animals to the recent application, evolution, and limitations of APR, thereby contributing to the understanding of this rapidly evolving field.

Language: English

Cited by

4

AI-Enabled Animal Behavior Analysis with High Usability: A Case Study on Open-Field Experiments
Yuming Chen,

Tianzhe Jiao,

Jie Song

et al.

Applied Sciences, Journal Year: 2024, Issue 14(11), pp. 4583 - 4583

Published: May 27, 2024

In recent years, with the rapid development of medicine, pathology, toxicology, and neuroscience technology, animal behavior research has become essential in modern life science research. However, current mainstream commercial animal behavior recognition tools only provide a single recognition method, limiting the expansion of algorithms and how researchers interact with experimental data. To address this issue, we propose an AI-enabled, highly usable platform for analyzing animal behavior, which aims to offer better flexibility, scalability, and interactivity to make behavior analysis more usable. Researchers can flexibly select or extend different automated behavior recognition algorithms and experience convenient human-computer interaction through natural language descriptions only. A case study at a medical laboratory, where the platform was used to evaluate behavioral differences between sick and healthy animals, demonstrated the high usability of the platform.

Language: English

Cited by

3