Neural Representations of Emotions in Visual, Auditory, and Modality‐Independent Regions Reflect Idiosyncratic Conceptual Knowledge
Chuanji Gao, Sewon Oh, Xuan Yang, et al.

Human Brain Mapping, Journal Year: 2024, Volume and Issue: 45(14)

Published: Oct. 1, 2024

ABSTRACT Growing evidence suggests that conceptual knowledge influences emotion perception, yet the neural mechanisms underlying this effect are not fully understood. Recent studies have shown that brain representations of facial emotion categories in visual‐perceptual areas are predicted by conceptual knowledge, but it remains to be seen whether auditory regions are similarly affected. Moreover, it is not clear whether these effects operate at a modality‐independent level. To address these questions, we conducted a functional magnetic resonance imaging study in which participants were presented with both facial and vocal emotional stimuli. This dual‐modality approach allowed us to investigate the effects of conceptual knowledge on modality‐specific and modality‐independent regions. Using univariate and representational similarity analyses, we found that emotion representations in visual (middle and lateral occipital cortices) and auditory (superior temporal gyrus) regions were predicted by each participant's conceptual understanding of emotions for faces and voices, respectively. Additionally, we discovered that conceptual knowledge also influenced the supra‐modal superior temporal sulcus. Dynamic causal modeling revealed a network with both bottom‐up and top‐down information flows, suggesting a complex interplay of modality‐specific and modality‐independent processing. These findings collectively indicate that emotion representations in sensory‐perceptual regions are likely shaped by each individual's conceptual knowledge.
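As a concrete illustration of the representational similarity analysis mentioned in this abstract, the short Python sketch below compares a neural dissimilarity matrix from a region of interest with a conceptual-knowledge dissimilarity matrix built from one participant's emotion similarity ratings. The arrays, shapes, and the choice of correlation distance are illustrative assumptions, not the authors' actual data or pipeline.

```python
# Minimal representational similarity analysis (RSA) sketch: correlate a neural
# dissimilarity matrix from an ROI with a conceptual-knowledge dissimilarity
# matrix derived from one participant's emotion similarity ratings.
# All data below are synthetic placeholders.
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

# Hypothetical data: 6 emotion categories x 200 voxels (ROI activity patterns),
# and a symmetric 6 x 6 conceptual similarity matrix from behavioral ratings.
roi_patterns = rng.normal(size=(6, 200))
conceptual_similarity = rng.uniform(0, 1, size=(6, 6))
conceptual_similarity = (conceptual_similarity + conceptual_similarity.T) / 2
np.fill_diagonal(conceptual_similarity, 1.0)

# Neural RDM: correlation distance (1 - Pearson r) between category patterns.
neural_rdm = pdist(roi_patterns, metric="correlation")

# Conceptual RDM: dissimilarity = 1 - rated similarity, as a condensed vector.
conceptual_rdm = squareform(1.0 - conceptual_similarity, checks=False)

# Second-order (Spearman) correlation between the two RDMs.
rho, p = spearmanr(neural_rdm, conceptual_rdm)
print(f"RSA correlation: rho = {rho:.3f}, p = {p:.3f}")
```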

Language: English

Affect in the dark: Navigating the complex landscape of social cognition in blindness

Veronica Domenici, Olivier Collignon, Giada Lettieri, et al.

Progress in brain research, Journal Year: 2025, Volume and Issue: unknown

Published: Jan. 1, 2025

Language: English

Citations: 0

Brain Networks that Experience Virtual Nature: Cognitive Pre-tuning Due to Emotional Intelligence
О. М. Разумникова, Artem Davidov, Maxim Bakaev, et al.

Studies in computational intelligence, Journal Year: 2025, Volume and Issue: unknown, P. 232 - 243

Published: Jan. 1, 2025

Language: English

Citations: 0

Lights, Camera, Emotion: REELMO’s 1060 Hours of Affective Reports to Explore Emotions in Naturalistic Contexts
Erika Sampaolo, Giacomo Handjaras, Giada Lettieri, et al.

Scientific Data, Journal Year: 2025, Volume and Issue: 12(1)

Published: May 15, 2025

Emotions are central to human experience, yet their complexity and context-dependent nature challenge traditional laboratory studies. We present REELMO (REal-time EmotionaL responses MOvies), a novel dataset bridging controlled experiments and naturalistic affective experiences. REELMO includes 1,060 hours of moment-by-moment emotional reports across 20 affective states, collected during the viewing of 60 full-length movies, along with additional measures of personality traits, empathy, movie synopses, and overall liking from 161 participants. It also features fMRI data from volunteers recorded while watching the movie Jojo Rabbit. Complemented by visual and acoustic features as well as semantic content derived from deep-learning models, REELMO provides a comprehensive platform for advancing emotion research. Its high temporal resolution, rich annotations, and multimodal integration enable investigations into the interplay between sensory information, narrative structures, and contextual factors in shaping emotional experiences, as well as the study of emotion chronometry, mixed-valence states, psychological trait influences, and machine learning applications in affective (neuro)science.
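To give a sense of how moment-by-moment affective reports of the kind described here can be handled, the sketch below builds a long-format table of time-resolved emotion ratings, bins it onto a one-second grid, and averages one emotion's time course across participants. The synthetic data, column names, and the emotion label "joy" are placeholders; REELMO's actual file layout and labels should be taken from the dataset documentation.

```python
# Illustrative handling of moment-by-moment emotion reports (synthetic data).
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)

# Synthetic long-format table: one row per (participant, time, emotion).
n_participants, duration_s = 5, 120
records = []
for sub in range(1, n_participants + 1):
    for t in np.arange(0, duration_s, 0.5):          # a report every 0.5 s
        for emotion in ["joy", "fear", "sadness"]:
            records.append(
                {"participant": sub, "time_s": t,
                 "emotion": emotion, "rating": rng.uniform(0, 1)}
            )
ratings = pd.DataFrame.from_records(records)

# Bin onto a 1 s grid and average the "joy" time course across participants.
joy = (
    ratings[ratings["emotion"] == "joy"]
    .assign(second=lambda d: d["time_s"].astype(int))
    .groupby("second")["rating"]
    .mean()
)
print(joy.head())
```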

Language: English

Citations: 0

Identifying the hierarchical emotional areas in the human brain through information fusion
Zhongyu Huang, Changde Du, Chaozhuo Li, et al.

Information Fusion, Journal Year: 2024, Volume and Issue: 113, P. 102613 - 102613

Published: Aug. 2, 2024

Language: English

Citations: 3

Cognition, emotion, and the default mode network
Nicola Sambuco

Brain and Cognition, Journal Year: 2024, Volume and Issue: 182, P. 106229 - 106229

Published: Oct. 31, 2024

Language: English

Citations: 3

Awe is characterized as an ambivalent experience in the human behavior and cortex: integrated virtual reality-electroencephalogram study
Jinwoo Yi, Danny Dongyeop Han, S. Oh, et al.

bioRxiv (Cold Spring Harbor Laboratory), Journal Year: 2024, Volume and Issue: unknown

Published: Aug. 19, 2024

Abstract Ambivalent feelings are a defining feature of awe, which has been understood as a possible source of its psychosocial benefits. However, due to the conventional unidimensional model of affective valence, the behavioral and neural representation of ambivalent feelings during awe remains elusive. To address this gap, we combined awe-inducing virtual reality clips, electroencephalogram recordings, and a deep learning-based dimensionality reduction technique (N = 43). Behaviorally, awe ratings were precisely predicted by the duration and intensity of ambivalent feelings, but not by single valence-related metrics. In the electrophysiological analysis, we identified a latent valence space for each participant sharing common valence structures across individuals and stimuli. In these spaces, ambivalent feelings were represented distinctly from positive and negative ones, and the variability in their distinctiveness specifically predicted awe ratings. Additionally, frontal delta oscillations were mainly engaged in differentiating these valence representations. Our findings demonstrate that awe is fundamentally an ambivalent experience reflected in both behavioral and neural activities. This work provides a new framework for understanding complex emotions and their neural underpinnings, with potential implications for affective neuroscience and relevant fields.
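The sketch below mimics the core analysis idea at a much simpler level: project trial-wise EEG features into a low-dimensional latent valence space and measure how far ambivalent trials sit from positive and negative ones. PCA is used only as a stand-in for the paper's deep learning-based dimensionality reduction, and all data, labels, and dimensions are synthetic placeholders.

```python
# Latent valence space sketch with PCA as a simple substitute for the
# deep learning-based dimensionality reduction described above.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)

# Hypothetical trial x feature matrix (e.g., band power across channels)
# and a valence label per trial.
n_trials, n_features = 90, 64
features = rng.normal(size=(n_trials, n_features))
labels = np.repeat(["positive", "negative", "ambivalent"], n_trials // 3)

# Two-dimensional latent space for one participant.
latent = PCA(n_components=2).fit_transform(features)

# Distinctiveness of ambivalent trials: distance of their centroid from the
# positive and negative centroids in the latent space.
centroids = {lab: latent[labels == lab].mean(axis=0) for lab in np.unique(labels)}
dist_pos = np.linalg.norm(centroids["ambivalent"] - centroids["positive"])
dist_neg = np.linalg.norm(centroids["ambivalent"] - centroids["negative"])
print(f"ambivalent-positive distance: {dist_pos:.2f}")
print(f"ambivalent-negative distance: {dist_neg:.2f}")
```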

Language: English

Citations: 0
