How does three-dimensional landscape pattern affect urban residents' sentiments DOI
Wenning Li, Ranhao Sun, Hongbin He et al.

Cities, Journal Year: 2023, Volume and Issue: 143, P. 104619 - 104619

Published: Oct. 17, 2023

Language: English

Design of subject independent 3D VAD emotion detection system using EEG signals and machine learning algorithms DOI

Durgesh Nandini, Jyoti Yadav, Asha Rani et al.

Biomedical Signal Processing and Control, Journal Year: 2023, Volume and Issue: 85, P. 104894 - 104894

Published: April 5, 2023

Language: English

Citations: 23

Detection of human emotions through facial expressions using hybrid convolutional neural network-recurrent neural network algorithm DOI Creative Commons

Haposan Vincentius Manalu, Achmad Pratama Rifai

Intelligent Systems with Applications, Journal Year: 2024, Volume and Issue: 21, P. 200339 - 200339

Published: Feb. 9, 2024

Cognitive science plays a pivotal role in deciphering human behavior by understanding and interpreting the emotions prevalent in everyday life. These emotions manifest through various cues, including speech patterns, body language, and, notably, facial expressions. Facial expressions serve as a fundamental mode of human communication and interaction. Within the realm of computer vision, Facial Expression Recognition (FER) stands as a crucial field, offering diverse techniques to decode emotions from facial cues. This research aims to develop a hybrid Convolutional Neural Network–Recurrent Neural Network (CNN-RNN) model adept at detecting emotions from video data. The models are developed on the Emotional Wearable Dataset 2020. The dataset consists of several expressions, four of them - amusement, enthusiasm, awe, and liking - never before explored in previous datasets. This expansion provides a more comprehensive approach to emotion detection. Three models are developed: MobileNetV2-RNN, InceptionV3-RNN, and a custom CNN-RNN for classification. The custom CNN-RNN achieved an accuracy rate of 63%, while MobileNetV2-RNN and InceptionV3-RNN with transfer learning yield 59% and 66%, respectively. The results demonstrate enhanced efficiency in distinguishing these nuanced emotions, a significant advancement in the field of facial expression recognition. The work holds substantial implications for cognitive science and real-world applications, particularly in enhancing interactive digital systems and emotional analysis.
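The hybrid CNN-RNN idea above (per-frame convolutional features fed into a recurrent layer over time, with a classification head on the final state) can be sketched in miniature with plain NumPy. This is an illustrative toy, not the authors' implementation: the layer sizes, kernels, weights, and video dimensions are all made-up assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv_features(frame, kernel):
    """Valid 2-D convolution, ReLU, then global average pooling: one scalar per kernel."""
    kh, kw = kernel.shape
    h, w = frame.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(frame[i:i + kh, j:j + kw] * kernel)
    return np.maximum(out, 0).mean()

def cnn_rnn(video, kernels, Wh, Wx, Wo):
    """CNN per frame -> feature vector; simple tanh RNN over time; linear head on last state."""
    h = np.zeros(Wh.shape[0])
    for frame in video:
        x = np.array([conv_features(frame, k) for k in kernels])
        h = np.tanh(Wh @ h + Wx @ x)
    return Wo @ h  # one logit per emotion class

# Toy setup: 8 frames of 16x16 grayscale video, 4 conv kernels, 6 hidden units
video = rng.standard_normal((8, 16, 16))
kernels = rng.standard_normal((4, 3, 3))
Wh = rng.standard_normal((6, 6)) * 0.1
Wx = rng.standard_normal((6, 4)) * 0.1
Wo = rng.standard_normal((4, 6)) * 0.1

# Four emotions the abstract highlights as newly explored
classes = ["amusement", "enthusiasm", "awe", "liking"]
logits = cnn_rnn(video, kernels, Wh, Wx, Wo)
print(classes[int(np.argmax(logits))])
```

In a trained model the kernels and weight matrices would come from backpropagation; the point here is only the data flow: spatial features per frame, then temporal aggregation across frames.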

Language: English

Citations: 14

Image-based facial emotion recognition using convolutional neural network on emognition dataset DOI Creative Commons

Erlangga Satrio Agung, Achmad Pratama Rifai, Titis Wijayanto et al.

Scientific Reports, Journal Year: 2024, Volume and Issue: 14(1)

Published: June 23, 2024

Detecting emotions from facial images is difficult because facial expressions can vary significantly. Previous research on using deep learning models to classify emotions has been carried out on various datasets that contain a limited range of expressions. This study expands the use of deep learning for facial emotion recognition (FER) based on the Emognition dataset, which includes ten target emotions: amusement, awe, enthusiasm, liking, surprise, anger, disgust, fear, sadness, and neutral. A series of data preprocessing steps was applied to convert the video data into images and augment the data. This study proposes Convolutional Neural Network (CNN) models built through two approaches: transfer learning (fine-tuned) with pre-trained Inception-V3 and MobileNet-V2, and building from scratch using the Taguchi method to find a robust combination of hyperparameter settings. The proposed model demonstrated favorable performance over the experimental processes, with an accuracy and an average F1-score of 96% and 0.95, respectively, on the test data.
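The Taguchi method mentioned above screens hyperparameters with an orthogonal array instead of a full grid, so only a handful of runs cover the factor levels in a balanced way. A minimal sketch using the standard L4(2³) array follows; the hyperparameter names and levels are hypothetical, and a deterministic stand-in function replaces actual model training and validation.

```python
# L4 orthogonal array: 4 runs cover 3 two-level factors such that every pair of
# levels across any two factors appears equally often (a full grid would need 8 runs).
L4 = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]

# Hypothetical factors and levels (not taken from the paper)
factors = {
    "learning_rate": [1e-3, 1e-4],
    "batch_size": [16, 32],
    "dropout": [0.3, 0.5],
}

def mock_validation_accuracy(lr, bs, do):
    """Stand-in for training a CNN and measuring validation accuracy."""
    return 0.9 - abs(lr - 1e-4) * 100 - abs(do - 0.5) + bs * 1e-3

names = list(factors)
results = []
for run in L4:
    setting = {n: factors[n][level] for n, level in zip(names, run)}
    score = mock_validation_accuracy(
        setting["learning_rate"], setting["batch_size"], setting["dropout"]
    )
    results.append((score, setting))

best_score, best_setting = max(results, key=lambda r: r[0])
print(best_setting)
```

With real training in place of the mock function, the same loop yields a robust setting from far fewer experiments than exhaustive search, which is the method's appeal for expensive CNN training.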

Language: English

Citations: 10

The competitive esports physiological, affective, and video dataset DOI Creative Commons
Maciej Behnke, Wadim Krzyżaniak, Jan A. Nowak et al.

Scientific Data, Journal Year: 2025, Volume and Issue: 12(1)

Published: Jan. 11, 2025

Abstract Esports refers to competitive video gaming in which individuals compete against each other in organized tournaments for prize money. Here, we present the Competitive Esports Physiological, Affective, and Video (CEPAV) dataset, in which 300 male Counter-Strike: Global Offensive gamers participated in a study aimed at optimizing affect during an esports tournament. The CEPAV dataset includes (1) physiological data, capturing the players' cardiovascular responses from before, during, and after over 3000 CS:GO matches; (2) self-reported affective data, detailing the states experienced before gameplay; and (3) video data, providing a visual record of 552 in-laboratory gaming sessions. We also collected (affect-related) individual difference measures (e.g., well-being, ill-being) across six weeks in three waves. The data also include gamers' natural language descriptions of affective situations. CEPAV provides a comprehensive resource for researchers and analysts seeking to understand the complex interplay of physiological, affective, and behavioral factors in performance contexts.

Language: English

Citations: 1

A Review of AI Cloud and Edge Sensors, Methods, and Applications for the Recognition of Emotional, Affective and Physiological States DOI Creative Commons
Artūras Kaklauskas, Ajith Abraham, Ieva Ubartė et al.

Sensors, Journal Year: 2022, Volume and Issue: 22(20), P. 7824 - 7824

Published: Oct. 14, 2022

Affective, emotional, and physiological state (AFFECT) detection and recognition by capturing human signals is a fast-growing area that has been applied across numerous domains. The aim of this research was to review publications on how techniques that use brain and biometric sensors can be used for AFFECT recognition, consolidate the findings, provide a rationale for current methods, compare the effectiveness of existing methods, and quantify how likely they are to address the issues and challenges in the field. In the effort to achieve the key goals of Society 5.0, Industry 5.0, and human-centered design for a better world, the recognition of emotional, affective, and physiological states is progressively becoming an important matter that offers tremendous growth of knowledge and progress in these and other related fields. In this research, a review of AFFECT sensors, methods, and applications was performed, based on Plutchik's wheel of emotions. Due to the immense variety of sensing systems, the study aimed to analyze the available systems that can define AFFECT, and to classify them by type and application area and by their efficiency in real implementations. Based on a statistical and multiple-criteria analysis across 169 nations, our outcomes introduce a connection between a nation's success, its number of Web of Science articles published, and its frequency of citation on AFFECT recognition. The principal conclusions present the contribution of this review to the big picture of the field under discussion and explore forthcoming trends.

Language: English

Citations: 31

A Large Finer-grained Affective Computing EEG Dataset DOI Creative Commons
Jingjing Chen, Xiaobin Wang, Chen Huang et al.

Scientific Data, Journal Year: 2023, Volume and Issue: 10(1)

Published: Oct. 25, 2023

Affective computing based on electroencephalogram (EEG) signals has gained increasing attention for its objectivity in measuring emotional states. While positive emotions play a crucial role in various real-world applications, such as human-computer interaction, state-of-the-art EEG datasets have primarily focused on negative emotions, with less consideration given to positive emotions. Meanwhile, these datasets usually have a relatively small sample size, limiting exploration of the important issue of cross-subject affective computing. The proposed Finer-grained Affective Computing EEG Dataset (FACED) aims to address these issues by recording 32-channel EEG signals from 123 subjects. During the experiment, subjects watched 28 emotion-elicitation video clips covering nine emotion categories (amusement, inspiration, joy, tenderness; anger, fear, disgust, sadness; and neutral emotion), providing a fine-grained and balanced categorization on both the positive and negative sides of emotion. The validation results show that emotion categories can be effectively recognized at both intra-subject and cross-subject levels. The FACED dataset is expected to contribute to developing EEG-based emotion recognition algorithms for real-world applications.
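Cross-subject evaluation of the kind FACED enables is typically run with leave-one-subject-out (LOSO) splits, where every fold holds out all trials of one subject so the model is always tested on an unseen person. A minimal sketch follows; the three-subject toy data and the helper name are illustrative, not part of the dataset (FACED itself has 123 subjects and 28 clips each).

```python
import numpy as np

def loso_splits(subject_ids):
    """Leave-one-subject-out: yield (held-out subject, train indices, test indices)."""
    ids = np.asarray(subject_ids)
    for held_out in sorted(set(subject_ids)):
        test_idx = np.where(ids == held_out)[0]
        train_idx = np.where(ids != held_out)[0]
        yield held_out, train_idx, test_idx

# Toy example: 3 subjects with 4 trials each
subject_ids = [s for s in range(3) for _ in range(4)]
folds = list(loso_splits(subject_ids))
for held_out, train_idx, test_idx in folds:
    print(f"subject {held_out}: train={len(train_idx)} test={len(test_idx)}")
```

Splitting by subject rather than by trial is what makes the protocol "cross-subject": random trial-level splits would leak each subject's EEG idiosyncrasies into both train and test sets.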

Language: English

Citations: 20

Emotion Rendering for Conversational Speech Synthesis with Heterogeneous Graph-Based Context Modeling DOI Open Access
Rui Liu, Yifan Hu, Yi Ren et al.

Proceedings of the AAAI Conference on Artificial Intelligence, Journal Year: 2024, Volume and Issue: 38(17), P. 18698 - 18706

Published: March 24, 2024

Conversational Speech Synthesis (CSS) aims to accurately express an utterance with the appropriate prosody and emotional inflection within a conversational setting. While recognising the significance of the CSS task, prior studies have not thoroughly investigated emotional expressiveness, owing to the scarcity of emotional conversational datasets and the difficulty of stateful emotion modeling. In this paper, we propose a novel emotional CSS model, termed ECSS, that includes two main components: 1) to enhance emotion understanding, we introduce a heterogeneous graph-based emotional context modeling mechanism, which takes the multi-source dialogue history as input to model the dialogue context and learn emotion cues from that context; 2) to achieve emotion rendering, we employ a contrastive learning-based emotion renderer module to infer the accurate emotion style for the target utterance. To address the issue of data scarcity, we meticulously create emotional labels in terms of emotion category and intensity, and annotate additional emotional information on the existing conversational dataset (DailyTalk). Both objective and subjective evaluations suggest that our model outperforms the baseline models in understanding and rendering emotions. These evaluations also underscore the importance of comprehensive emotional annotations. Code and audio samples can be found at: https://github.com/walker-hyf/ECSS.

Language: English

Citations: 6

Ethical considerations for integrating multimodal computer perception and neurotechnology DOI Creative Commons
Meghan E. Hurley, Anika Sonig, John D. Herrington et al.

Frontiers in Human Neuroscience, Journal Year: 2024, Volume and Issue: 18

Published: Feb. 16, 2024

Artificial intelligence (AI)-based computer perception technologies (e.g., digital phenotyping and affective computing) promise to transform clinical approaches to personalized care in psychiatry and beyond by offering more objective measures of emotional states and behavior, enabling precision in treatment, diagnosis, and symptom monitoring. At the same time, the passive and continuous nature with which they often collect data from patients in non-clinical settings raises ethical issues related to privacy and self-determination. Little is known about how such concerns may be exacerbated by the integration of neural data, as parallel advances in computer perception, AI, and neurotechnology enable new insights into subjective states. Here, we present findings from a multi-site NCATS-funded study of the ethical considerations for translating these technologies, and contextualize them within the neuroethics and neurorights literatures.

Language: English

Citations: 4

Physiological data for affective computing in HRI with anthropomorphic service robots: the AFFECT-HRI data set DOI Creative Commons
Judith S. Heinisch, Jérôme Kirchhoff, Philip Busch et al.

Scientific Data, Journal Year: 2024, Volume and Issue: 11(1)

Published: April 4, 2024

Abstract In human-human and human-robot interaction, the counterpart influences the human's affective state. Contrary to humans, robots inherently cannot respond empathically, meaning that non-beneficial affective reactions cannot be mitigated. Thus, to create a responsible and empathetic human-robot interaction (HRI) involving anthropomorphic service robots, the effect of robot behavior on human affect in HRI must be understood. To contribute to this understanding, we provide the new comprehensive data set AFFECT-HRI, including, for the first time, physiological data labeled with human affect (i.e., emotions and mood) gathered from a conducted HRI study. Within the study, 146 participants interacted with an anthropomorphic service robot in a realistic and complex retail scenario. The participants' questionnaire ratings regarding affect, demographics, and socio-technical aspects are also provided in the data set. Five different conditions (i.e., neutral, transparency, liability, moral, and immoral) were considered during the study, eliciting different affective states and allowing interdisciplinary investigations (e.g., computer science, law, and psychology). Each condition includes three scenes: a consultation regarding products, a request for sensitive personal information, and a handover.

Language: English

Citations: 4

Federated learning in Emotion Recognition Systems based on physiological signals for privacy preservation: a review DOI
Neha Gahlan, Divyashikha Sethia

Multimedia Tools and Applications, Journal Year: 2024, Volume and Issue: unknown

Published: June 3, 2024
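Federated learning of the kind this review surveys keeps raw physiological signals on each participant's device and shares only model weights with a server. A minimal sketch of federated averaging (FedAvg), the canonical aggregation rule, is given below; the toy clients, layer shapes, and sample counts are made-up assumptions for illustration.

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """Federated averaging: combine client model weights without sharing raw data;
    each client's contribution is weighted by its local sample count."""
    total = sum(client_sizes)
    n_layers = len(client_weights[0])
    return [
        sum(w[i] * (n / total) for w, n in zip(client_weights, client_sizes))
        for i in range(n_layers)
    ]

# Toy example: 3 clients, each holding a 2-layer model as a list of arrays
rng = np.random.default_rng(0)
clients = [[rng.standard_normal((4, 2)), rng.standard_normal(2)] for _ in range(3)]
sizes = [100, 50, 50]  # local sample counts -> weights 0.5, 0.25, 0.25

global_model = fedavg(clients, sizes)
print(global_model[0].shape, global_model[1].shape)
```

In a full system, each round would interleave local training on-device with this aggregation step, which is what lets emotion models improve without centralizing sensitive EEG or heart-rate recordings.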

Language: English

Citations: 4