The Effect of Emotional Intelligence on the Accuracy of Facial Expression Recognition in the Valence–Arousal Space
Yubin Kim, Ayoung Cho, Hyun–Woo Lee

et al.

Electronics, Journal Year: 2025, Volume and Issue: 14(8), P. 1525 - 1525

Published: April 9, 2025

Facial expression recognition (FER) plays a pivotal role in affective computing and human–computer interaction by enabling machines to interpret human emotions. However, conventional FER models often overlook individual differences in emotional intelligence (EI), which may significantly influence how emotions are perceived and expressed. This study investigates the effect of EI on facial expression recognition accuracy within the valence–arousal space. Participants were divided into high- and low-EI groups based on a composite score derived from the Tromsø Social Intelligence Scale and performance-based emotion recognition tasks. Five deep learning models (EfficientNetV2-L/S, MaxViT-B/T, and VGG16) trained on the AffectNet dataset were evaluated using data collected from the participants. Emotional states were predicted as continuous valence and arousal values, then mapped onto discrete emotion categories for interpretability. The results indicated that individuals with higher EI achieved greater recognition accuracy, particularly for emotions requiring contextual understanding (e.g., anger, sadness, happiness), while fear was better recognized by individuals with lower EI. These findings highlight the role of EI in modulating FER performance and suggest that integrating EI-related features into valence–arousal-based models could enhance the adaptiveness of affective computing systems.
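The abstract does not specify how continuous valence–arousal predictions were discretized; a minimal illustrative sketch, assuming a simple quadrant-based mapping in the valence–arousal plane (the function name, thresholds, and category labels are assumptions, not the authors' method):

```python
def map_to_category(valence: float, arousal: float) -> str:
    """Map continuous valence/arousal values (each in [-1, 1]) to a coarse
    discrete emotion label by quadrant membership (illustrative only)."""
    if valence >= 0 and arousal >= 0:
        return "happiness"    # pleasant, activated
    if valence < 0 and arousal >= 0:
        return "anger"        # unpleasant, activated (fear also falls here)
    if valence < 0 and arousal < 0:
        return "sadness"      # unpleasant, deactivated
    return "contentment"      # pleasant, deactivated

print(map_to_category(0.7, 0.5))    # quadrant I  -> happiness
print(map_to_category(-0.6, -0.4))  # quadrant III -> sadness
```

Real systems typically use finer-grained regions or nearest-prototype matching rather than plain quadrants, since several basic emotions (e.g., fear and anger) share the same quadrant.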

Language: English


Citations

0