The priming effect of emotional words on body expressions: Two ERP studies

Bixuan Du, Shuxin Jia, Xing Zhou, et al.

International Journal of Psychophysiology, Year: 2024, Issue: 202, pp. 112370–112370

Published: May 25, 2024

Language: English

Decoding spatiotemporal features of emotional body language in social interactions
Johannes Keck, Adam Zabicki, Julia Bachmann, et al.

Scientific Reports, Year: 2022, Issue: 12(1)

Published: Sep. 5, 2022

Abstract How are emotions perceived through human body language in social interactions? This study used point-light displays of interactions portraying emotional scenes (1) to examine quantitative intrapersonal kinematic and postural configurations, (2) to calculate interaction-specific parameters of these interactions, and (3) to analyze how far both contribute to the perception of an emotion category (i.e., anger, sadness, happiness or affection) as well as of valence. Using ANOVA and classification trees, we investigated emotion-specific differences in the calculated parameters. We further applied representational similarity analyses to determine how perceptual ratings relate to the intra- and interpersonal features of the observed scene. Results showed that, within an interaction, intrapersonal and interpersonal cues corresponded differentially to emotion category and valence ratings. Perception was also driven by interpersonal orientation, proxemics, the time spent in the personal space of the counterpart, and the motion–energy balance between the interacting people. Thus, interpersonal parameters are connected with the emotional content of a scene, and observers make use of emotionally expressive coordination to infer the emotion of interactions.
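
For orientation, the representational similarity step described above can be sketched as follows; the scene count, feature matrices, and rating dimensions are invented placeholders, not the study's actual data or pipeline.

```python
# Minimal representational-similarity sketch: relate kinematic/postural
# features of interaction scenes to perceptual emotion ratings.
# All shapes and values below are hypothetical placeholders.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_scenes = 40                                  # hypothetical number of point-light scenes
kinematic = rng.normal(size=(n_scenes, 12))    # e.g. velocity / posture parameters per scene
ratings = rng.normal(size=(n_scenes, 4))       # e.g. anger/sadness/happiness/affection ratings

# Representational dissimilarity matrices in condensed form: one entry per scene pair.
rdm_features = pdist(kinematic, metric="euclidean")
rdm_ratings = pdist(ratings, metric="euclidean")

# Second-order (rank) correlation between the feature RDM and the rating RDM.
rho, p = spearmanr(rdm_features, rdm_ratings)
print(f"feature-rating RDM correlation: rho={rho:.3f}, p={p:.3f}")
```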

Language: English

Cited by

15

A large-scale brain network of species-specific dynamic human body perception
Baichen Li, Marta Poyo Solanas, Giuseppe Marrazzo, et al.

Progress in Neurobiology, Year: 2022, Issue: 221, pp. 102398–102398

Published: Dec. 21, 2022

This ultrahigh-field 7 T fMRI study addressed the question of whether there exists a core network of brain areas at the service of different aspects of body perception. Participants viewed naturalistic videos of monkey and human faces, bodies, and objects, along with mosaic-scrambled versions to control for low-level features. Independent component analysis (ICA) was conducted to find body and species modulations at both the network and voxel levels. Among the areas, the highest selectivity was found in the middle frontal gyrus and the amygdala. Two large-scale networks were highly body-selective, dominated by the lateral occipital cortex and the right superior temporal sulcus (STS), respectively. The STS network showed high selectivity, and its significant body-induced node connectivity was focused around the extrastriate body area (EBA), STS, temporoparietal junction (TPJ), premotor cortex, and inferior frontal gyrus (IFG). The body-specific networks discovered here may serve as a brain-wide internal model of the body, serving as an entry point for a variety of processes that rely on body descriptions as part of their more specific categorization, action, or expression recognition functions.
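
As a rough illustration of the ICA-based decomposition mentioned above, the following sketch runs spatial ICA on a simulated time-by-voxel matrix; the component count, dimensions, and data are arbitrary assumptions, not the study's parameters.

```python
# Illustrative spatial ICA on a (time x voxel) data matrix. Shapes and the
# number of components are placeholders, not the study's actual settings.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)
n_timepoints, n_voxels = 200, 5000            # hypothetical dimensions
data = rng.normal(size=(n_timepoints, n_voxels))

ica = FastICA(n_components=20, random_state=1, max_iter=1000)
# Fit with voxels as samples to obtain spatial maps; mixing_ holds the
# corresponding component time courses.
spatial_maps = ica.fit_transform(data.T).T    # (components x voxels)
timecourses = ica.mixing_                     # (timepoints x components)
print(spatial_maps.shape, timecourses.shape)
```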

Language: English

Cited by

15

Connectivity and functional diversity of different temporo-occipital nodes for action perception
Baichen Li, Marta Poyo Solanas, Giuseppe Marrazzo, et al.

bioRxiv (Cold Spring Harbor Laboratory), Year: 2024, Issue: unknown

Published: Jan. 15, 2024

The temporo-occipital cortex (TOC) plays a key role in body and action perception, but current understanding of its functions is still limited. TOC regions are heterogeneous, and their contributions to action perception are poorly understood. This study adopted data-driven approaches to map region selectivity and investigated the connectivity of TOC nodes and the sensitivity of their functional networks to different whole-body action videos. In two human 7T fMRI experiments using independent component analysis, four adjacent action-selective nodes were detected within the TOC, with distinct connectivity profiles and functional roles. Action-type sensitivity was observed for a posterior-ventral node connected with visual cortex and for a posterior-dorsal node connected with the precuneus and anterior frontal cortex. Condition-specific modulations were found in connectivity with the middle frontal gyrus, which increased or decreased in the aggressive condition depending on the node, whereas the defensive condition showed a node-nonspecific enhancement of TOC–cingulate connectivity. By addressing the issue of multiple adjacent selective regions, we show a dissociation of action-perception centres and a potential hierarchy within this network.
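
A toy illustration of node connectivity of the kind described above; the node and region names and the simulated time courses are invented for illustration only.

```python
# Toy functional-connectivity sketch: correlate the time course of a TOC node
# with time courses of other regions. All names and data are placeholders.
import numpy as np

rng = np.random.default_rng(2)
n_timepoints = 300
timecourses = {
    "TOC_node": rng.normal(size=n_timepoints),
    "precuneus": rng.normal(size=n_timepoints),
    "cingulate": rng.normal(size=n_timepoints),
    "middle_frontal": rng.normal(size=n_timepoints),
}

seed = timecourses["TOC_node"]
for region, ts in timecourses.items():
    if region == "TOC_node":
        continue
    r = np.corrcoef(seed, ts)[0, 1]    # Pearson correlation between time courses
    print(f"TOC_node - {region}: r = {r:.3f}")
```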

Language: English

Cited by

2

Piecing together the puzzle of emotional consciousness
Tahnée Engelen, Rocco Mennella

Neuroscience of Consciousness, Year: 2023, Issue: 2023(1)

Published: Jan. 1, 2023

Abstract The search for the neural correlates of emotional consciousness has gained momentum in the last decades. Nonetheless, disagreements concerning the mechanisms that determine the experiential qualities of consciousness—the "what is it like" to feel an emotion—as well as on their neural correlates, have far-reaching consequences for how researchers study and measure emotion, sometimes leading to seemingly irresolvable impasses. The current paper lays out, in a balanced way, the viewpoints of both cognitive and precognitive approaches on the basis of the commonalities and differences between the claims of some relevant theories of emotions. We examine the sufficiency of the existing evidence in support of the proposed views by going through the methodological specificity of emotional consciousness research and its unique challenges, highlighting what can and cannot be imported from advances in research on perceptual consciousness. We propose that there are three key experimental contrasts, each equally necessary, with each contrast alone coming with its own limitations. We conclude by acknowledging the most promising avenues in the field, which may help go beyond these limitations and collaboratively piece together the puzzle of emotional consciousness.

Language: English

Cited by

6

Voxelwise encoding models of body stimuli reveal a representational gradient from low-level visual features to postural features in occipitotemporal cortex
Giuseppe Marrazzo, Federico De Martino, Agustín Lage‐Castellanos, et al.

NeuroImage, Year: 2023, Issue: 277, pp. 120240–120240

Published: June 21, 2023

Previous research on body representation in the brain has focused on category-specific representation, using fMRI to investigate the response pattern to body stimuli in occipitotemporal cortex. But the central question of the specific computations involved in body-selective regions has not been addressed so far. This study used ultra-high field fMRI and banded ridge regression to investigate the computational mechanisms of body coding in images, by comparing the performance of three encoding models in predicting activity in occipitotemporal cortex and specifically the extrastriate body area (EBA). Our results indicate that bodies are encoded in EBA according to a combination of low-level visual features and postural features.
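
The encoding-model comparison can be illustrated with a simplified sketch using ordinary (not banded) ridge regression on simulated data; the feature spaces, dimensions, and regularization value below are assumptions, not the study's models.

```python
# Simplified voxelwise-encoding sketch: fit ridge models from two hypothetical
# feature spaces (low-level vs. postural) and compare held-out prediction
# accuracy per voxel on simulated responses.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n_stimuli, n_voxels = 120, 50
low_level = rng.normal(size=(n_stimuli, 30))    # e.g. image-statistics features (placeholder)
postural = rng.normal(size=(n_stimuli, 15))     # e.g. joint-configuration features (placeholder)
bold = rng.normal(size=(n_stimuli, n_voxels))   # simulated voxel responses

def model_accuracy(features, responses):
    """Mean per-voxel correlation between predicted and observed held-out responses."""
    X_tr, X_te, y_tr, y_te = train_test_split(features, responses, random_state=0)
    pred = Ridge(alpha=10.0).fit(X_tr, y_tr).predict(X_te)
    corrs = [np.corrcoef(pred[:, v], y_te[:, v])[0, 1] for v in range(responses.shape[1])]
    return float(np.mean(corrs))

print("low-level model:", model_accuracy(low_level, bold))
print("postural model: ", model_accuracy(postural, bold))
```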

Language: English

Cited by

6

Multi-view emotional expressions dataset using 2D pose estimation
Mingming Zhang, Yanan Zhou, Xinye Xu, et al.

Scientific Data, Year: 2023, Issue: 10(1)

Published: Sep. 22, 2023

Human body expressions convey emotional shifts and intentions of action and, in some cases, are even more effective than other emotion models. Despite the many available datasets incorporating motion capture, there is a lack of widely distributed datasets of naturalized body expressions based on 2D video. In this paper, therefore, we report a multi-view emotional expressions dataset (MEED) using 2D pose estimation. Twenty-two actors presented six emotional (anger, disgust, fear, happiness, sadness, surprise) and neutral body movements from three viewpoints (left, front, right). A total of 4102 videos were captured. The MEED consists of the corresponding pose estimation results (i.e., 397,809 PNG files and JSON files). The dataset size exceeds 150 GB. We believe MEED will benefit research in various fields, including affective computing, human-computer interaction, social neuroscience, and psychiatry.
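
Pose-estimation JSON files in such a dataset can be read along the lines below; the field names follow the common OpenPose convention and the file name is hypothetical, so the actual MEED schema may differ.

```python
# Example reader for 2D pose-estimation output of the kind distributed in such
# datasets. The JSON layout assumed here is the common OpenPose convention
# (people -> pose_keypoints_2d as flat x, y, confidence triplets); the actual
# MEED field names may differ, so treat this as a schema assumption.
import json
import numpy as np

def load_keypoints(path):
    with open(path) as f:
        frame = json.load(f)
    people = frame.get("people", [])
    if not people:
        return np.empty((0, 3))
    flat = people[0]["pose_keypoints_2d"]
    return np.array(flat).reshape(-1, 3)    # rows: (x, y, confidence) per joint

# keypoints = load_keypoints("actor01_anger_front_frame0001.json")  # hypothetical file name
# print(keypoints.shape)
```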

Language: English

Cited by

6

Awareness is determined by emotion and gender

Ema Jugović, Marta Poyo Solanas, Béatrice de Gelder, et al.

bioRxiv (Cold Spring Harbor Laboratory), Year: 2024, Issue: unknown

Published: Feb. 29, 2024

Summary Traditionally, consciousness studies focus on domain-general cognitive processes rather than on the specific information reaching subjective awareness. The present study (N = 45) used visual masking and whole-body images to investigate whether the emotional expression, as well as the gender of the stimuli and of the participants, impacts awareness. Our results show that participants' awareness responses reflect differences in the emotion of the stimuli, that these differences are a function of gender, and that they may be associated with minimal features of the body images. Overall, we observed that threatening expressions were more easily detected than fearful ones, especially by males presented with male stimuli. These findings underscore the importance of affective factors for theories of consciousness and the significance of body processing, which has often been overlooked in past studies focused on face recognition.
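
As a generic illustration of how detection sensitivity per emotion condition might be quantified in a masked-detection design, here is a toy d' computation; the counts, conditions, and correction are invented and are not the study's analysis.

```python
# Toy sensitivity (d') calculation per emotion condition for a masked-detection
# design. Hit/false-alarm counts below are invented placeholders.
from scipy.stats import norm

def d_prime(hits, misses, fas, crs):
    """d' with a simple log-linear correction to avoid infinite z-scores."""
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (fas + 0.5) / (fas + crs + 1)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

conditions = {                      # (hits, misses, false alarms, correct rejections)
    "threatening": (42, 8, 10, 40),
    "fearful":     (35, 15, 12, 38),
}
for name, counts in conditions.items():
    print(f"{name}: d' = {d_prime(*counts):.2f}")
```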

Language: English

Cited by

2

Neural Encoding of Bodies for Primate Social Perception
Etienne Abassi, Anna Bognár, Béatrice de Gelder, et al.

Journal of Neuroscience, Year: 2024, Issue: 44(40), pp. e1221242024–e1221242024

Published: Oct. 2, 2024

Primates, as social beings, have evolved complex brain mechanisms to navigate intricate social environments. This review explores the neural bases of body perception in both human and nonhuman primates, emphasizing the processing of signals conveyed by body postures, movements, and interactions. Early studies identified selective responses to body stimuli in macaques, particularly within the ventral superior temporal sulcus (STS). These regions, known as body patches, represent visual features that are present in bodies but do not appear to be semantic detectors; they provide information about the posture and viewpoint of the body. Recent research using dynamic stimuli has expanded the understanding of the body-selective network, highlighting its complexity and the interplay between static and dynamic processing. In humans, areas such as the extrastriate body area (EBA) and the fusiform body area (FBA) have been implicated in body perception. Moreover, studies on social interactions reveal that regions of the STS are also tuned to dyadic interactions, suggesting a specialized lateral pathway. Computational work has developed models of body recognition and interaction perception, providing insights into the underlying mechanisms. Despite these advances, significant gaps remain in our understanding of social interaction perception. Overall, this review underscores the importance of integrating findings across species to comprehensively understand the neural foundations of body and interaction perception, combining computational modeling and neural recording.

Language: English

Cited by

2

The characterization of actions at the superordinate, basic and subordinate level
Tonghe Zhuang, Angelika Lingnau

Psychological Research, Year: 2021, Issue: 86(6), pp. 1871–1891

Published: Dec. 14, 2021

Objects can be categorized at different levels of abstraction, ranging from the superordinate (e.g., fruit) and basic (e.g., apple) to the subordinate level (e.g., golden delicious). The basic level is assumed to play a key role in categorization, e.g., in terms of the number of features used to describe these categories and the speed of processing. To which degree do these principles also apply to the categorization of observed actions? To address this question, we first selected a range of actions at the superordinate (e.g., locomotion), basic (e.g., to swim) and subordinate level (e.g., to swim breaststroke), using verbal material (Experiments 1-3). Experiments 4-6 aimed to determine the characteristics of these actions across the three taxonomic levels. Using a feature listing paradigm (Experiment 4), we determined the features that were provided by at least six out of twenty participants (common features), separately for each level. In addition, we examined shared features (i.e., provided for more than one category) and distinct features (i.e., provided for one category only), and how many features actions shared with other actions of the same level; the number of common and shared features produced differed across levels. In an auditory priming experiment (Experiment 5), participants responded faster to action images preceded by a matching superordinate-level cue, but not by matching cues at the other levels, suggesting that the most abstract cues facilitate the processing of an upcoming action. In a verification task (Experiment 6), accuracy in verifying action categories (depicted as images) likewise differed between levels. Together, in line with the object literature, our results suggest that the information provided about observed actions is maximized at specific levels of the action taxonomy.
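
The feature-listing bookkeeping (common, shared, and distinct features) can be sketched as below; the listings and the lowered threshold in the toy call are invented, with the study's actual criterion (six of twenty participants) noted in a comment.

```python
# Toy version of the feature-listing bookkeeping: a feature counts as "common"
# if listed by at least a criterion number of participants, "shared" if common
# to more than one action category, and "distinct" otherwise.

listings = {   # action category -> (participant, feature) pairs (toy data)
    "swim": [("p1", "in water"), ("p2", "in water"), ("p1", "uses arms"), ("p3", "uses arms")],
    "run":  [("p1", "fast"), ("p2", "fast"), ("p3", "uses legs"), ("p2", "uses arms")],
}

def common_features(pairs, threshold=2):    # the study's criterion was 6 of 20 participants
    by_feature = {}
    for participant, feature in pairs:
        by_feature.setdefault(feature, set()).add(participant)
    return {feat for feat, people in by_feature.items() if len(people) >= threshold}

common = {category: common_features(pairs) for category, pairs in listings.items()}
for category, feats in common.items():
    other = set().union(*(f for c, f in common.items() if c != category))
    shared, distinct = feats & other, feats - other
    print(category, "| common:", sorted(feats), "| shared:", sorted(shared), "| distinct:", sorted(distinct))
```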

Language: English

Cited by

12

Correlated expression of the body, face, and voice during character portrayal in actors
Matthew Berry, Sarah Lewin, Steven Brown, et al.

Scientific Reports, Year: 2022, Issue: 12(1)

Published: May 18, 2022

Abstract Actors are required to engage in multimodal modulations of their body, face, and voice in order to create a holistic portrayal of a character during performance. We present here the first trimodal analysis, to our knowledge, of this process in professional actors. The actors portrayed a series of stock characters (e.g., king, bully) that were organized according to a predictive scheme based on the two orthogonal personality dimensions of assertiveness and cooperativeness. We used 3D motion capture technology to analyze the relative expansion/contraction of 6 body segments across the head, torso, arms, and hands. We compared this with previous results for these portrayals for 4 facial expression parameters and the vocal parameters of pitch and loudness. The results demonstrated significant cross-modal correlations for assertiveness (but not cooperativeness), as manifested collectively in a straightening of the head, expansion of the arms and hands, lowering of the jaw, and a rise in vocal pitch and loudness. These results demonstrate what communication theorists refer to as "multichannel reinforcement". We discuss this reinforcement in light of both acting theories and human communication more generally.
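
A minimal sketch of a cross-modal correlation of this kind, relating a body parameter to a vocal parameter across portrayals; the character set and values are invented placeholders, not the study's measurements.

```python
# Toy cross-modal correlation: relate a body parameter (e.g., arm expansion)
# to a vocal parameter (e.g., pitch change) across character portrayals.
# All characters and values below are invented placeholders.
import numpy as np
from scipy.stats import pearsonr

portrayals = ["king", "bully", "hero", "recluse", "lover", "villain"]   # hypothetical set
arm_expansion = np.array([0.8, 0.6, 0.7, -0.5, 0.2, 0.4])   # normalized expansion scores
vocal_pitch = np.array([0.7, 0.5, 0.6, -0.4, 0.1, 0.3])     # normalized pitch-change scores

r, p = pearsonr(arm_expansion, vocal_pitch)
print(f"body-voice correlation across {len(portrayals)} portrayals: r={r:.2f}, p={p:.3f}")
```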

Language: English

Cited by

7