Multimodal emotion recognition: A comprehensive review, trends, and challenges DOI

Manju Priya Arthanarisamy Ramaswamy,

Suja Palaniswamy

Wiley Interdisciplinary Reviews Data Mining and Knowledge Discovery, Journal Year: 2024, Volume and Issue: 14(6)

Published: Oct. 8, 2024

Abstract Automatic emotion recognition is a burgeoning field of research and has its roots in psychology and cognitive science. This article comprehensively reviews multimodal emotion recognition, covering various aspects such as emotion theories, discrete and dimensional models, emotional response systems, datasets, and current trends. We reviewed 179 literature papers from 2017 to 2023 to reflect on the trends in affective computing. The review covers the modalities used, organized by emotional response system under four categories: subjective experience, comprising text and self‐report; peripheral physiology, comprising electrodermal, cardiovascular, facial muscle, and respiration activity; central physiology, comprising EEG, neuroimaging, and EOG; and behavior, comprising facial, vocal, and whole‐body behavior as well as observer ratings. The review summarizes the measures of each modality for assessing emotional states and provides an extensive list of datasets with their unique characteristics. Recent advances are grouped into focus areas: elicitation strategy, data collection and handling, impact of culture, feature extraction, feature selection, alignment of signals across modalities, and fusion strategies. Fusion strategies are detailed in this article, since extracting shared representations across different modalities, removing redundant features, and learning critical features are crucial for multimodal emotion recognition. The strengths and weaknesses of these approaches are discussed, along with challenges and future work. This review aims to serve as a lucid introduction to the field for novices. This article is categorized under: Fundamental Concepts of Data and Knowledge > Human Centricity and User Interaction; Technologies > Cognitive Computing and Artificial Intelligence
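The fusion strategies highlighted above (extracting shared representations across modalities and discarding redundant features) can be illustrated with a minimal sketch. The snippet below is not from the reviewed article; the modality names, feature dimensions, and layer sizes are assumptions chosen for illustration, showing one common pattern in which each modality is projected into a shared space before classification.

```python
# Minimal sketch (assumed architecture, not the article's method) of
# shared-representation fusion for multimodal emotion recognition.
import torch
import torch.nn as nn

class SharedRepresentationFusion(nn.Module):
    def __init__(self, modality_dims, shared_dim=64, num_emotions=6):
        super().__init__()
        # One projection head per modality maps raw features into a shared space.
        self.projections = nn.ModuleDict({
            name: nn.Sequential(nn.Linear(dim, shared_dim), nn.ReLU())
            for name, dim in modality_dims.items()
        })
        # Classifier operates on the concatenated shared representations.
        self.classifier = nn.Linear(shared_dim * len(modality_dims), num_emotions)

    def forward(self, inputs):
        # Project each modality and concatenate in a fixed (ModuleDict) order.
        shared = [proj(inputs[name]) for name, proj in self.projections.items()]
        return self.classifier(torch.cat(shared, dim=-1))

# Hypothetical feature dimensions for three modalities.
model = SharedRepresentationFusion({"face": 128, "voice": 64, "eeg": 32})
batch = {"face": torch.randn(8, 128), "voice": torch.randn(8, 64), "eeg": torch.randn(8, 32)}
logits = model(batch)  # shape: (8, 6) emotion scores per sample
```

Concatenation of the projected embeddings is the simplest fusion choice; attention-based or tensor-based fusion schemes discussed in the literature would replace the final concatenation step.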

Language: English

MGEED: A Multimodal Genuine Emotion and Expression Detection Database DOI
Yiming Wang, Hui Yu, Weihong Gao

et al.

IEEE Transactions on Affective Computing, Journal Year: 2023, Volume and Issue: 15(2), P. 606 - 619

Published: June 15, 2023

Multimodal emotion recognition has attracted increasing interest from academia and industry in recent years, since it enables emotion detection using various modalities, such as facial expression images, speech, and physiological signals. Although research in this field has grown rapidly, it is still challenging to create a multimodal database containing facial electrical information, due to the difficulty of capturing natural and subtle signals such as optomyography (OMG). To this end, we present the newly developed Multimodal Genuine Emotion and Expression Detection (MGEED) database in this paper, which is the first publicly available database containing OMG signals. MGEED consists of 17 subjects with over 150K images, 140K depth maps, and different modalities of signals including OMG, electroencephalography (EEG), and electrocardiography (ECG). The emotions of the participants are evoked by video stimuli and the data are collected by a multimodal sensing system. With the collected data, an emotion recognition method is developed based on signal synchronisation, feature extraction, fusion, and prediction. The results show that superior performance can be achieved by fusing the visual and EEG features. The database can be obtained from https://github.com/YMPort/MGEED .
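The pipeline outlined in the abstract (signal synchronisation, feature extraction, fusion, prediction) can be sketched in a few lines. The code below is not the MGEED authors' implementation; the sampling rates, window length, and random placeholder signals are assumptions, illustrating only how signals recorded at different rates might be aligned to a common time base and fused at the feature level.

```python
# Illustrative sketch (assumed parameters, not the MGEED pipeline) of
# synchronising multimodal signals and fusing simple window features.
import numpy as np

def synchronise(signal, src_rate, target_rate, duration_s):
    """Resample a 1-D signal onto a common time base by linear interpolation."""
    t_src = np.arange(signal.size) / src_rate
    t_tgt = np.arange(int(duration_s * target_rate)) / target_rate
    return np.interp(t_tgt, t_src, signal)

def window_features(signal, rate, window_s=2.0):
    """Mean and standard deviation per non-overlapping window."""
    step = int(rate * window_s)
    windows = signal[: signal.size // step * step].reshape(-1, step)
    return np.column_stack([windows.mean(axis=1), windows.std(axis=1)])

duration, common_rate = 60.0, 128            # seconds, Hz (assumed)
eeg = np.random.randn(int(duration * 256))   # placeholder EEG at 256 Hz
ecg = np.random.randn(int(duration * 512))   # placeholder ECG at 512 Hz

eeg_sync = synchronise(eeg, 256, common_rate, duration)
ecg_sync = synchronise(ecg, 512, common_rate, duration)

# Feature-level fusion: concatenate per-window features from both modalities,
# ready to be passed to any downstream emotion classifier.
fused = np.hstack([window_features(eeg_sync, common_rate),
                   window_features(ecg_sync, common_rate)])
print(fused.shape)  # (number of windows, 4 features)
```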

Language: English

Citations

8

Biosignal based emotion-oriented video summarization DOI
Seyma Derdiyok, Fatma Patlar Akbulut

Multimedia Systems, Journal Year: 2023, Volume and Issue: 29(3), P. 1513 - 1526

Published: March 14, 2023

Language: English

Citations

7

Ethical Considerations and Checklist for Affective Research With Wearables DOI Creative Commons
Maciej Behnke, Stanisław Saganowski, Dominika Kunc

et al.

IEEE Transactions on Affective Computing, Journal Year: 2022, Volume and Issue: 15(1), P. 50 - 62

Published: Nov. 16, 2022

As the popularity of wearables increases, so does their utility for studying emotions. Using these new technologies points to several ethical challenges that should be considered to improve research designs. There are recommendations for utilizing wearables to study human emotions, but they focus on emotion recognition systems and applications rather than on research design and implementation. To address this gap, we have developed an ethical perspective on affective research with wearables, especially in daily life, by adapting the ReCODE Health Digital Framework and its companion checklist. Our framework consists of four domains: (1) participation experience, (2) privacy, (3) data management, and (4) access and usability. We identified 33 primary risks of using wearables, including research-related negative emotions; collecting, processing, storing, and sharing personal and biological information; commercial technology validity and reliability; and exclusivity issues. We also proposed possible strategies for minimizing these risks. We consulted the guidelines with members of ethics committees and relevant researchers. The judges (N = 26) positively rated the proposed solutions and provided useful feedback that helped us refine the guidance. Finally, we summarized the proposals in a checklist for researchers' convenience. Our guidelines contribute to future research by providing improved protection of participants' and scientists' interests.

Language: English

Citations

12

Pathos in Natural Language Argumentation: Emotional Appeals and Reactions DOI Creative Commons
Barbara Konat, Ewelina Gajewska,

Wiktoria Rossa

et al.

Argumentation, Journal Year: 2024, Volume and Issue: 38(3), P. 369 - 403

Published: June 21, 2024

Abstract In this paper, we present a model of pathos, delineate its operationalisation, and demonstrate its utility through an analysis of natural language argumentation. We understand pathos as an interactional persuasive process in which speakers perform emotional appeals and the audience experiences emotional reactions. We analyse two strategies of such appeals in pre-election debates: pathotic Argument Schemes, based on the taxonomy proposed by Walton et al. (Argumentation schemes, Cambridge University Press, Cambridge, 2008), and emotion-eliciting language, based on psychological lexicons of emotive words (Wierzba et al., Behav Res Methods 54:2146–2161, 2021). In order to match appeals with possible reactions, we collect real-time social media reactions to the debates and apply a sentiment analysis (Alswaidan and Menai, Knowl Inf Syst 62:2937–2987, 2020) method to observe the emotions expressed in language. The results point to the importance of pathos in modern discourse: politicians refer to emotions in most of their arguments, and the audience reacts to those appeals using emotion-expressing language. Our results show that pathos is a common strategy in argumentation that can be analysed with the support of computational methods.
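The lexicon-matching step the abstract refers to can be illustrated with a toy sketch. The snippet below is hypothetical and does not use the Wierzba et al. lexicon or the authors' sentiment analysis method; the tiny lexicon and example reactions are invented only to show how emotion-expressing words in audience reactions could be tagged.

```python
# Toy sketch (hypothetical lexicon and data) of lexicon-based emotion tagging
# for audience reactions, in the spirit of the approach the abstract describes.
from collections import Counter

EMOTIVE_LEXICON = {          # invented entries for illustration only
    "outrageous": "anger",
    "afraid": "fear",
    "hope": "joy",
    "betrayed": "anger",
}

def tag_emotions(text):
    """Count lexicon emotions whose trigger words appear in the text."""
    tokens = text.lower().split()
    return Counter(EMOTIVE_LEXICON[t] for t in tokens if t in EMOTIVE_LEXICON)

reactions = [
    "this promise gives me hope",
    "we were betrayed and it is outrageous",
]
for r in reactions:
    print(r, "->", dict(tag_emotions(r)))
```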

Language: English

Citations

2
