Social touch modulates endogenous μ-opioid system activity in humans
Lauri Nummenmaa, Lauri Tuominen, R. I. M. Dunbar

et al.

NeuroImage, Journal Year: 2016, Volume and Issue: 138, P. 242 - 247

Published: May 28, 2016

Language: English

Emotional Expressions Reconsidered: Challenges to Inferring Emotion From Human Facial Movements
Lisa Feldman Barrett, Ralph Adolphs, Stacy Marsella

et al.

Psychological Science in the Public Interest, Journal Year: 2019, Volume and Issue: 20(1), P. 1 - 68

Published: July 1, 2019

It is commonly assumed that a person's emotional state can be readily inferred from his or her facial movements, typically called emotional expressions or facial expressions.

Language: English

Citations: 1339

The theory of constructed emotion: an active inference account of interoception and categorization
Lisa Feldman Barrett

Social Cognitive and Affective Neuroscience, Journal Year: 2016, Volume and Issue: unknown, P. nsw154 - nsw154

Published: Oct. 12, 2016

The science of emotion has been using folk psychology categories derived from philosophy to search for the brain basis of emotion. The last two decades of neuroscience research have brought us to the brink of a paradigm shift in understanding the workings of the brain, however, setting the stage to revolutionize our understanding of what emotions are and how they work. In this article, we begin with the structure and function of the brain, and from there deduce what the biological basis of emotions might be. The answer is a brain-based, computational account called the theory of constructed emotion.

Language: English

Citations: 1122

EmotionMeter: A Multimodal Framework for Recognizing Human Emotions
Wei‐Long Zheng, Wei Liu, Yifei Lu

et al.

IEEE Transactions on Cybernetics, Journal Year: 2018, Volume and Issue: 49(3), P. 1110 - 1122

Published: Feb. 7, 2018

In this paper, we present a multimodal emotion recognition framework called EmotionMeter that combines brain waves and eye movements. To increase the feasibility and wearability of EmotionMeter in real-world applications, we design a six-electrode placement above the ears to collect electroencephalography (EEG) signals. We combine EEG and eye movements to integrate the internal cognitive states and external subconscious behaviors of users and improve the recognition accuracy of EmotionMeter. The experimental results demonstrate that modality fusion with deep neural networks can significantly enhance performance compared with a single modality, and the best mean accuracy of 85.11% is achieved for four emotions (happy, sad, fear, neutral). We explore the complementary characteristics of EEG and eye movements for their representational capacities and identify that EEG has the advantage in classifying the happy emotion, whereas eye movements outperform EEG in recognizing the fear emotion. To investigate the stability of EmotionMeter over time, each subject performs the experiments three times on different days. EmotionMeter obtains a mean accuracy of 72.39% across sessions with EEG and eye movement features. These results demonstrate the effectiveness of EmotionMeter both within and between sessions.

Language: English

Citations: 796

Naturalistic Stimuli in Neuroscience: Critically Acclaimed
Saurabh Sonkusare, Michael Breakspear, Christine C. Guo

et al.

Trends in Cognitive Sciences, Journal Year: 2019, Volume and Issue: 23(8), P. 699 - 714

Published: June 27, 2019

Language: English

Citations: 508

Emotion fingerprints or emotion populations? A meta-analytic investigation of autonomic features of emotion categories.
Erika Siegel, Molly Sands, Wim Van Den Noortgate

et al.

Psychological Bulletin, Journal Year: 2018, Volume and Issue: 144(4), P. 343 - 393

Published: Feb. 1, 2018

The classical view of emotion hypothesizes that certain emotion categories have a specific autonomic nervous system (ANS) "fingerprint" that is distinct from other categories. Substantial ANS variation within a category is presumed to be epiphenomenal. The theory of constructed emotion hypothesizes that an emotion category is a population of context-specific, highly variable instances that need not share an ANS fingerprint. Instead, ANS variation within a category is a meaningful part of the nature of emotion. We present a meta-analysis of 202 studies measuring ANS reactivity during lab-based emotion inductions in nonclinical samples of adults, using random effects, multilevel modeling and multivariate pattern classification analysis to test our hypotheses. We found increases from baseline in the mean effect size for 59.4% of ANS variables across emotion categories, but the effect sizes did not clearly distinguish 1 emotion category from another. We also observed significant variation within emotion categories; heterogeneity accounted for a moderate to substantial percentage (i.e., I² ≥ 30%) of the variability in 54% of these effect sizes. Experimental moderators that are epiphenomenal to emotion, such as induction type (e.g., films vs. imagery), did not explain a large portion of the variability. Correction for publication bias reduced the estimated effect sizes even further, increasing the observed variability. These findings, when considered in the broader empirical literature, are more consistent with population thinking and other principles from evolutionary biology, and offer insights for developing new hypotheses to understand the nature of emotion.

Language: English

Citations: 418

Decoding the Nature of Emotion in the Brain
Philip A. Kragel, Kevin S. LaBar

Trends in Cognitive Sciences, Journal Year: 2016, Volume and Issue: 20(6), P. 444 - 455

Published: May 17, 2016

Language: English

Citations: 334

Emotion words, emotion concepts, and emotional development in children: A constructionist hypothesis.
Katie Hoemann, Fei Xu, Lisa Feldman Barrett

et al.

Developmental Psychology, Journal Year: 2019, Volume and Issue: 55(9), P. 1830 - 1849

Published: Aug. 29, 2019

In this article, we integrate two constructionist approaches, the theory of constructed emotion and rational constructivism, to introduce several novel hypotheses for understanding emotional development. We first discuss the hypothesis that emotion categories are abstract and conceptual, whose instances share a goal-based function in a particular context but are highly variable in their affective, physical, and perceptual features. Next, we discuss the possibility that emotional development is the process of developing emotion concepts, and that emotion words may be a critical part of that process. We hypothesize that infants and children learn emotion categories the way they learn other abstract conceptual categories: by observing others use the same emotion word to label highly variable events. Finally, we hypothesize that emotional development can be understood as a concept construction problem: a child becomes capable of experiencing and perceiving emotion only when her brain develops the capacity to assemble ad hoc, situated emotion concepts for the purposes of guiding behavior and giving meaning to sensory inputs. Specifically, we offer a predictive processing account of emotional development.

Language: English

Citations: 270

Perceptual and affective mechanisms in facial expression recognition: An integrative review
Manuel G. Calvo, Lauri Nummenmaa

Cognition & Emotion, Journal Year: 2015, Volume and Issue: 30(6), P. 1081 - 1106

Published: July 25, 2015

Facial expressions of emotion involve a physical component of morphological changes in a face and an affective component conveying information about the expresser's internal feelings. It remains unresolved how much the recognition and discrimination of expressions rely on the perception of morphological patterns or the processing of affective content. This review of research on the role of visual and emotional factors in expression recognition reached three major conclusions. First, behavioral, neurophysiological, and computational measures indicate that basic expressions are reliably recognized and discriminated from one another, albeit the effect may be inflated by the use of prototypical expression stimuli and forced-choice responses. Second, affective content along the dimensions of valence and arousal is extracted early from facial expressions, although this coarse affective representation contributes minimally to the categorical recognition of specific expressions. Third, the physical configuration and visual saliency of facial features contribute significantly to expression recognition, with "emotionless" computational models being able to reproduce some of the phenomena demonstrated by human observers. We conclude that facial expression recognition, as it has been investigated in conventional laboratory tasks, depends to a greater extent on perceptual than affective mechanisms.

Language: English

Citations: 242

Social, self, (situational), and affective processes in medial prefrontal cortex (MPFC): Causal, multivariate, and reverse inference evidence
Matthew D. Lieberman, Mark A. Straccia, Meghan L. Meyer

et al.

Neuroscience & Biobehavioral Reviews, Journal Year: 2019, Volume and Issue: 99, P. 311 - 328

Published: Jan. 2, 2019

Language: English

Citations: 237

The Brain Basis for Misophonia
Sukhbinder Kumar, Olana Tansley-Hancock, William Sedley

et al.

Current Biology, Journal Year: 2017, Volume and Issue: 27(4), P. 527 - 533

Published: Feb. 1, 2017

Misophonia is an affective sound-processing disorder characterized by the experience of strong negative emotions (anger and anxiety) in response to everyday sounds, such as those generated by other people eating, drinking, chewing, and breathing [1-8]. The commonplace nature of these sounds (often referred to as "trigger sounds") makes misophonia a devastating disorder for sufferers and their families, yet nothing is known about the underlying mechanism. Using functional and structural MRI coupled with physiological measurements, we demonstrate that misophonic subjects show specific trigger-sound-related responses in brain and body. Specifically, fMRI showed that in misophonic subjects, trigger sounds elicit greatly exaggerated blood-oxygen-level-dependent (BOLD) responses in the anterior insular cortex (AIC), a core hub of the "salience network" that is critical for the perception of interoceptive signals and for emotion processing. Trigger sounds in misophonics were associated with abnormal functional connectivity between the AIC and a network of regions responsible for the processing and regulation of emotions, including ventromedial prefrontal cortex (vmPFC), posteromedial cortex (PMC), hippocampus, and amygdala. Trigger sounds elicited heightened heart rate (HR) and galvanic skin response (GSR) in misophonic subjects, which were mediated by AIC activity. Questionnaire analysis showed that misophonic subjects perceived their bodies differently: they scored higher on interoceptive sensibility than controls, consistent with altered functioning of the AIC. Finally, brain structural measurements implied greater myelination within the vmPFC in misophonic individuals. Overall, our results show that misophonia involves abnormal salience attributed to particular sounds, based on abnormal activation and functional connectivity of the AIC.

Language: English

Citations: 225