PortraitEmotion3D: A Novel Dataset and 3D Emotion Estimation Method for Artistic Portraiture Analysis
Liu Shao, Sos S. Agaian, Artyom M. Grigoryan

et al.

Applied Sciences, Journal Year: 2024, Volume and Issue: 14(23), P. 11235 - 11235

Published: Dec. 2, 2024

Facial Expression Recognition (FER) has been widely explored in realistic settings; however, its application to artistic portraiture presents unique challenges due to the stylistic interpretations of artists and the complex interplay of emotions conveyed by both artist and subject. This study addresses these challenges through three key contributions. First, we introduce the PortraitEmotion3D (PE3D) dataset, designed explicitly for FER tasks in portraits. This dataset provides a robust foundation for advancing emotion recognition in visual art. Second, we propose an innovative 3D emotion estimation method that leverages three-dimensional labeling to capture the nuanced emotional spectrum depicted in artistic works. This approach surpasses traditional two-dimensional methods by enabling a more comprehensive understanding of the subtle and layered emotions often present in such representations. Third, we enhance the feature learning phase by integrating a self-attention module, significantly improving facial representation accuracy, a notable advancement given this domain’s stylistic variations and complexity, and setting a new benchmark. Evaluation on PE3D demonstrates our method’s high robustness compared to existing state-of-the-art techniques. The integration of the self-attention module yields an average improvement of over 1% compared with recent systems. Additionally, combining it with ESR-9 achieves a comparable 88.3% accuracy on FER+, demonstrating generalizability to other benchmarks. This research deepens the understanding of emotional expression in art and facilitates potential applications in diverse fields, including human–computer interaction, security, healthcare diagnostics, and the entertainment industry.
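To make the self-attention integration described in the abstract concrete, the following is a minimal PyTorch sketch of attaching a self-attention block to a convolutional feature extractor with a three-dimensional emotion output. The backbone, layer sizes, class and function names (SelfAttentionFER), and the meaning of the 3D head are illustrative assumptions, not the architecture published in the paper.

# Minimal sketch (assumed architecture): self-attention over CNN feature-map
# locations for facial expression features, with a hypothetical 3D emotion head.
import torch
import torch.nn as nn


class SelfAttentionFER(nn.Module):
    def __init__(self, feat_dim: int = 128, num_heads: int = 4):
        super().__init__()
        # Small convolutional backbone producing a grid of local features.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, feat_dim, 3, stride=2, padding=1), nn.ReLU(),
        )
        # Self-attention lets every spatial location attend to every other,
        # the kind of module the abstract integrates into feature learning.
        self.attn = nn.MultiheadAttention(feat_dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(feat_dim)
        # Hypothetical 3D emotion head (three continuous coordinates).
        self.head = nn.Linear(feat_dim, 3)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        f = self.backbone(x)                   # (B, C, H, W)
        tokens = f.flatten(2).transpose(1, 2)  # (B, H*W, C) sequence of locations
        attended, _ = self.attn(tokens, tokens, tokens)
        tokens = self.norm(tokens + attended)  # residual connection + layer norm
        pooled = tokens.mean(dim=1)            # average over spatial locations
        return self.head(pooled)               # (B, 3) emotion coordinates


if __name__ == "__main__":
    model = SelfAttentionFER()
    out = model(torch.randn(2, 3, 64, 64))     # two RGB portrait crops
    print(out.shape)                           # torch.Size([2, 3])

The residual connection plus layer normalization is a common way to add attention without disturbing the pretrained or jointly trained convolutional features; it is used here only to show where such a module typically sits in the pipeline.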

Language: English

Eliciting Emotions: Investigating the Use of Generative AI and Facial Muscle Activation in Children’s Emotional Recognition
Manuel A. Solis-Arrazola, Raúl E. Sánchez-Yáñez, Ana M. S. Gonzalez-Acosta

et al.

Big Data and Cognitive Computing, Journal Year: 2025, Volume and Issue: 9(1), P. 15 - 15

Published: Jan. 20, 2025

This study explores children’s emotions through a novel approach combining Generative Artificial Intelligence (GenAI) and Facial Muscle Activation (FMA). It examines GenAI’s effectiveness in creating facial images that produce genuine emotional responses in children, alongside FMA’s analysis of muscular activation during these expressions. The aim is to determine whether AI can realistically generate and recognize emotions similar to human experiences. The study involves generating a database of 280 images (40 per emotion) of children expressing various emotions. For real faces from public databases (DEFSS and NIMH-CHEFS), five emotions were considered: happiness, anger, fear, sadness, and neutral. In contrast, for the AI-generated images, seven emotions were analyzed, including the previous five plus surprise and disgust. A feature vector is extracted indicating the lengths between reference points on the face that contract or expand based on the expressed emotion. This vector is then input into an artificial neural network for emotion recognition and classification, achieving accuracies of up to 99% in certain cases. This work offers new avenues for training and validating algorithms, enabling models to be trained with synthetic and real-world data interchangeably. The integration of both datasets in the training and validation phases enhances model performance and adaptability.
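The distance-based pipeline described above, lengths between facial reference points fed to a neural network classifier, can be sketched roughly as follows. The landmark index pairs, network size, label set, and names (LANDMARK_PAIRS, distance_features, EmotionMLP) are illustrative assumptions rather than the authors' exact configuration.

# Minimal sketch (assumed configuration): distances between facial landmark
# pairs form a feature vector classified by a small neural network.
import numpy as np
import torch
import torch.nn as nn

EMOTIONS = ["happiness", "anger", "fear", "sadness", "neutral", "surprise", "disgust"]

# Hypothetical landmark-index pairs whose distances contract or expand with
# expression (e.g., mouth corners, eyebrow-to-eye gaps) in a 68-point layout.
LANDMARK_PAIRS = [(48, 54), (51, 57), (17, 36), (26, 45), (21, 22)]


def distance_features(landmarks: np.ndarray) -> np.ndarray:
    """landmarks: (68, 2) array of (x, y) points -> vector of pairwise lengths."""
    return np.array([np.linalg.norm(landmarks[i] - landmarks[j])
                     for i, j in LANDMARK_PAIRS], dtype=np.float32)


class EmotionMLP(nn.Module):
    def __init__(self, n_features: int, n_classes: int):
        super().__init__()
        # Small fully connected classifier over the distance features.
        self.net = nn.Sequential(
            nn.Linear(n_features, 32), nn.ReLU(),
            nn.Linear(32, n_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


if __name__ == "__main__":
    landmarks = np.random.rand(68, 2) * 100     # stand-in for detected landmarks
    feats = torch.from_numpy(distance_features(landmarks)).unsqueeze(0)
    model = EmotionMLP(len(LANDMARK_PAIRS), len(EMOTIONS))
    probs = torch.softmax(model(feats), dim=-1)
    print(EMOTIONS[int(probs.argmax())])        # predicted emotion label

Distance features of this kind are largely invariant to face translation, which is one reason such geometric vectors transfer reasonably between real and synthetic faces.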

Language: English

Citations: 0

Multi-modal sentiment recognition with residual gating network and emotion intensity attention
Yadi Wang, Xiaoding Guo, Xianhong Hou

et al.

Neural Networks, Journal Year: 2025, Volume and Issue: unknown, P. 107483 - 107483

Published: April 1, 2025

Language: English

Citations: 0
