Examining speech-brain tracking during early bidirectional, free-flowing caregiver-infant interactions DOI Open Access
Emily Phillips, Louise Goupil, James Ives et al.

bioRxiv (Cold Spring Harbor Laboratory), Journal Year: 2024, Volume and Issue: unknown

Published: May 17, 2024

Abstract Neural entrainment to slow modulations in the amplitude envelope of infant-directed speech is thought to drive early language learning. Most previous research examining speech-brain tracking in infants has been conducted in controlled, experimental settings, which are far from the complex environments of everyday interactions. Whilst recent work has begun to investigate tracking of naturalistic speech, this has used semi-structured paradigms, where infants listen to live adult speakers without engaging in free-flowing social interaction. Here, we test the applicability of mTRF modelling to measure speech-brain tracking during bidirectional free-play interactions between 9-12-month-olds and their caregivers. Using a backwards modelling approach, with individual and generic training procedures, we examine the effects of data quantity and quality on model fitting. We show that model fitting is most optimal using models trained on continuous segments of interaction data. Corresponding to these findings, models showed significant speech-brain tracking at delta modulation frequencies, but not in alpha and theta bands. These findings open new methods for studying the interpersonal micro-processes that support early language learning. In future work, it will be important to develop a mechanistic framework for understanding how our brains track speech during infancy.
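The backwards (stimulus-reconstruction) mTRF idea described in this abstract can be illustrated with plain ridge regression over time-lagged EEG channels. The sketch below is a minimal illustration on synthetic data; the sampling rate, lag window, and ridge parameter are assumptions for demonstration, not the authors' pipeline.

    # Minimal sketch of a backwards (stimulus-reconstruction) mTRF:
    # reconstruct a speech amplitude envelope from time-lagged EEG channels
    # with ridge regression. Shapes and parameters are illustrative only.
    import numpy as np
    from numpy.linalg import solve

    rng = np.random.default_rng(0)
    fs = 100                      # Hz, EEG/envelope sampling rate (assumed)
    n_samples, n_channels = 6000, 32
    eeg = rng.standard_normal((n_samples, n_channels))
    envelope = rng.standard_normal(n_samples)   # stand-in for the speech envelope

    def lagged_design(x, max_lag):
        """Stack copies of x shifted by 0..max_lag samples (EEG lags behind speech)."""
        n, c = x.shape
        X = np.zeros((n, c * (max_lag + 1)))
        for lag in range(max_lag + 1):
            X[lag:, lag * c:(lag + 1) * c] = x[:n - lag]
        return X

    max_lag = int(0.4 * fs)       # decoder integrates 0-400 ms of EEG (assumed)
    X = lagged_design(eeg, max_lag)

    # Train/test split into continuous segments (cf. continuous-segment training)
    split = int(0.8 * n_samples)
    Xtr, Xte, ytr, yte = X[:split], X[split:], envelope[:split], envelope[split:]

    lam = 1e2                     # ridge parameter, normally tuned by cross-validation
    w = solve(Xtr.T @ Xtr + lam * np.eye(X.shape[1]), Xtr.T @ ytr)

    recon = Xte @ w
    r = np.corrcoef(recon, yte)[0, 1]   # reconstruction accuracy (near 0 for random data)
    print(f"reconstruction r = {r:.3f}")

With real data, the reconstruction accuracy on held-out segments is the speech-brain tracking measure, and the generic versus individual training procedures mentioned above correspond to fitting the decoder on pooled versus single-dyad data.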

Language: English

Neural synchronization is strongest to the spectral flux of slow music and depends on familiarity and beat salience DOI Creative Commons
Kristin Weineck, Olivia Xin Wen, Molly J. Henry et al.

eLife, Journal Year: 2022, Volume and Issue: 11

Published: Sept. 12, 2022

Neural activity in the auditory system synchronizes to sound rhythms, and brain-environment synchronization is thought to be fundamental to successful auditory perception. Sound rhythms are often operationalized in terms of the sound's amplitude envelope. We hypothesized that, especially for music, the envelope might not best capture the complex spectro-temporal fluctuations that give rise to beat perception and synchronized neural activity. This study investigated (1) neural synchronization to different musical features, (2) the tempo-dependence of neural synchronization, and (3) its dependence on familiarity, enjoyment, and ease of beat perception. In this electroencephalography study, 37 human participants listened to tempo-modulated music (1-4 Hz). Independent of whether the analysis approach was based on temporal response functions (TRFs) or reliable components analysis (RCA), spectral flux, as opposed to the amplitude envelope, evoked the strongest neural synchronization. Moreover, music with slower beat rates, high familiarity, and easy-to-perceive beats elicited the strongest response. Our results demonstrate the importance of spectral flux in driving neural synchronization and highlight its sensitivity to musical tempo, familiarity, and beat salience.
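Spectral flux, which this study finds to drive synchronization more strongly than the amplitude envelope, can be computed from a short-time magnitude spectrogram as the summed positive frame-to-frame spectral change. A minimal sketch on a toy signal follows; the window, hop, and sampling rate are assumptions, not the authors' exact settings.

    # Minimal sketch: amplitude envelope vs. spectral flux of an audio signal.
    # Spectral flux = half-wave-rectified frame-to-frame change in the magnitude spectrogram.
    import numpy as np

    fs = 22050
    t = np.arange(0, 10, 1 / fs)
    audio = np.sin(2 * np.pi * 220 * t) * (1 + 0.5 * np.sin(2 * np.pi * 2 * t))  # toy 2 Hz modulated tone

    win, hop = 1024, 256
    frames = np.lib.stride_tricks.sliding_window_view(audio, win)[::hop]
    spec = np.abs(np.fft.rfft(frames * np.hanning(win), axis=1))

    # Amplitude envelope: RMS per frame
    envelope = np.sqrt((frames ** 2).mean(axis=1))

    # Spectral flux: summed positive change between consecutive frames
    diff = np.diff(spec, axis=0)
    flux = np.maximum(diff, 0).sum(axis=1)
    flux = np.concatenate([[0.0], flux])      # align length with the envelope

    print(envelope.shape, flux.shape)         # both sampled at fs/hop, roughly 86 Hz

Either feature time series can then serve as the stimulus regressor in a TRF or RCA analysis of the EEG.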

Language: English

Citations

41

Attention, musicality, and familiarity shape cortical speech tracking at the musical cocktail party DOI
Jane A. Brown, Gavin M. Bidelman

Brain and Language, Journal Year: 2025, Volume and Issue: 266, P. 105581 - 105581

Published: April 25, 2025

Language: English

Citations

1

Neural encoding of musical expectations in a non-human primate DOI Creative Commons
Roberta Bianco, Nathaniel J. Zuk, Félix Bigand et al.

Current Biology, Journal Year: 2024, Volume and Issue: 34(2), P. 444 - 450.e5

Published: Jan. 1, 2024

The appreciation of music is a universal trait of humankind [1,2,3]. Evidence supporting this notion includes the ubiquity of music across cultures [4,5,6,7] and the natural predisposition toward music that humans display early in development [8,9,10]. Are we musical animals because of species-specific predispositions? This question cannot be answered by relying on cross-cultural or developmental studies alone, as these cannot rule out enculturation [11]. Instead, it calls for cross-species experiments testing whether homologous mechanisms underlying music perception are present in non-human primates. We tested two rhesus monkeys, reared without music exposure, while recording electroencephalography (EEG) and pupillometry. Monkeys exhibit higher engagement and neural encoding of expectations based on the previously seeded musical context when passively listening to real music as opposed to shuffled controls. We then compare human and monkey responses to the same stimuli and find a species-dependent contribution of two fundamental musical features, pitch and timing [12], in generating expectations: whereas timing- and pitch-based expectations [13] are similarly weighted in humans, monkeys rely on timing rather than pitch. Together, these results shed light on the phylogeny of music perception. They highlight monkeys' capacity for processing temporal structure beyond plain acoustic processing, and they identify a species-dependent contribution of time- and pitch-related features to the generation of musical expectations.

Language: English

Citations

6

Audio‐visual concert performances synchronize audience's heart rates DOI Creative Commons
Anna Czepiel, Lauren K. Fink, Mathias Scharinger et al.

Annals of the New York Academy of Sciences, Journal Year: 2025, Volume and Issue: unknown

Published: Jan. 3, 2025

Abstract People enjoy engaging with music. Live music concerts provide an excellent option to investigate real-world musical experiences and, at the same time, to use neurophysiological synchrony to assess dynamic engagement. In the current study, we assessed engagement in a live concert setting using synchrony of cardiorespiratory measures, comparing inter-subject correlation, stimulus-response correlation, and phase coherence. As engagement might be enhanced by seeing musicians perform, we presented audiences with audio-only (AO) and audio-visual (AV) piano performances. Only the correlation measures were above chance level. When time-averaged across conditions, AV performances evoked higher inter-subject correlation of heart rate (ISC-HR). However, synchrony averaged across pieces did not correspond to self-reported engagement. On the other hand, time-resolved analyses show that synchronized heart rate (HR) deceleration-acceleration patterns, typical of an "orienting response" (an index of directed attention), occurred within salient events at section boundaries. That is, seeing musicians perform heightened audience responses at structurally important moments in Western classical music. Overall, we show that multisensory information shapes dynamic engagement. By comparing different synchrony measures, we further highlight the advantages of time series analysis, specifically of ISC-HR, as a robust measure of holistic musical listening experiences in naturalistic concert settings.
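Inter-subject correlation of heart rate (ISC-HR) as used above can be illustrated as the average pairwise Pearson correlation of audience members' heart-rate time series during the same performance. The toy sketch below uses simulated data; the number of listeners, sampling rate, and window length are assumptions.

    # Minimal sketch: inter-subject correlation of heart rate (ISC-HR).
    # Each row is one listener's heart-rate series (e.g., 1 Hz samples) recorded
    # during the same piece; ISC is the mean pairwise Pearson correlation.
    import numpy as np
    from itertools import combinations

    rng = np.random.default_rng(1)
    n_subjects, n_samples = 20, 600            # 20 listeners, 10 minutes at 1 Hz (assumed)
    shared = rng.standard_normal(n_samples)    # stimulus-driven component
    hr = 70 + 0.5 * shared + rng.standard_normal((n_subjects, n_samples))

    def isc(data):
        pairs = [np.corrcoef(data[i], data[j])[0, 1]
                 for i, j in combinations(range(len(data)), 2)]
        return float(np.mean(pairs))

    print(f"ISC-HR = {isc(hr):.3f}")

    # Time-resolved variant: ISC in successive windows, used above to localize
    # orienting-response-like HR patterns around events such as section boundaries.
    win = 30
    isc_t = [isc(hr[:, s:s + win]) for s in range(0, n_samples - win, win)]
    print(np.round(isc_t[:5], 3))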

Language: English

Citations

0

Atypical audio-visual neural synchrony and speech processing in early autism DOI Creative Commons
Xiaoyue Wang, Sophie Bouton, Nada Kojovic et al.

Journal of Neurodevelopmental Disorders, Journal Year: 2025, Volume and Issue: 17(1)

Published: Feb. 18, 2025

Abstract Background: Children with Autism Spectrum Disorder (ASD) often exhibit communication difficulties that may stem from a basic auditory temporal integration impairment but may also be aggravated by an audio-visual integration deficit, resulting in a lack of interest in face-to-face communication. This study addresses whether speech processing anomalies in young autistic children (mean age 3.09 years) are associated with alterations of audio-visual temporal integration. Methods: We used high-density electroencephalography (HD-EEG) and eye tracking to record brain activity and gaze patterns in 31 children with ASD (6 females) and 33 typically developing (TD) children (11 females) while they watched cartoon videos. Neural responses to the stimuli were analyzed using Temporal Response Functions modelling, and phase analyses were used to assess audiovisual temporal coordination. Results: The reconstructability of speech signals from neural activity was reduced in children with ASD compared to TD children, and despite more restricted gaze patterns in ASD, it was similar for visual signals in both groups. Speech reception was most strongly affected when visual information was also present, an interference not seen in TD children. These differences were associated with a broader phase angle distribution (exceeding pi/2) in the EEG theta range in ASD, signaling reduced reliability of audio-visual temporal alignment. Conclusion: These findings show that speech processing anomalies in autism do not stand alone: already at a very early developmental stage they are accompanied by an audio-visual imbalance, with poor auditory response encoding and disrupted audio-visual temporal coordination.
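The theta-band phase-angle analysis mentioned above can be illustrated generically: band-pass filter two signals, extract instantaneous phase with the Hilbert transform, and summarize the distribution of phase differences, where a circular spread exceeding pi/2 indicates unreliable alignment. This is a minimal sketch on synthetic signals; the filter band, sampling rate, and circular summary are assumptions rather than the study's exact parameters.

    # Minimal sketch: theta-band phase alignment between two signals.
    # Filter 4-8 Hz, take instantaneous phase via the Hilbert transform, and
    # summarize the phase-difference distribution (circular mean and spread).
    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    fs = 250
    t = np.arange(0, 60, 1 / fs)
    rng = np.random.default_rng(2)
    sig_a = np.sin(2 * np.pi * 6 * t) + 0.5 * rng.standard_normal(t.size)
    sig_b = np.sin(2 * np.pi * 6 * t + 0.4) + 0.5 * rng.standard_normal(t.size)  # ~0.4 rad lag

    b, a = butter(4, [4, 8], btype="bandpass", fs=fs)
    phase_a = np.angle(hilbert(filtfilt(b, a, sig_a)))
    phase_b = np.angle(hilbert(filtfilt(b, a, sig_b)))

    dphi = np.angle(np.exp(1j * (phase_a - phase_b)))   # wrapped phase differences
    R = np.abs(np.mean(np.exp(1j * dphi)))              # resultant length (1 = perfect locking)
    circ_std = np.sqrt(-2 * np.log(R))                  # circular standard deviation (rad)

    print(f"mean phase lag = {np.angle(np.mean(np.exp(1j * dphi))):.2f} rad")
    print(f"circular SD = {circ_std:.2f} rad (values > pi/2 would signal poor alignment)")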

Language: English

Citations

0

Minimal background noise enhances neural speech tracking: Evidence of stochastic resonance DOI Open Access

Björn Herrmann

Published: March 10, 2025

Neural activity in auditory cortex tracks the amplitude-onset envelope of continuous speech, but recent work counter-intuitively suggests that neural tracking increases when speech is masked by background noise, despite reduced intelligibility. Noise-related amplification could indicate that stochastic resonance – response facilitation through noise – supports neural tracking, but a comprehensive account is lacking. In five human electroencephalography (EEG) experiments, the current study demonstrates a generalized enhancement of neural speech tracking due to minimal background noise. Results show that a) neural speech tracking is enhanced for speech masked by noise at very high SNRs (∼30 dB SNR), where speech is highly intelligible; b) this enhancement is independent of attention; c) it generalizes across different stationary maskers, but is strongest for 12-talker babble; and d) it is present for both headphone and free-field listening, suggesting that the neural-tracking enhancement generalizes to real-life listening. The study paints a clear picture that minimal background noise enhances the neural representation of the speech onset-envelope, which contributes to neural tracking. The study further highlights that the non-linearities in neural tracking induced by background noise make its use as a biological marker for speech processing challenging.
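The very high SNR conditions (around 30 dB) described above amount to adding a background masker scaled relative to the speech power. Below is a minimal sketch of mixing a signal with noise at a target SNR; the Gaussian noise is a stand-in for a stationary masker, not the study's 12-talker babble.

    # Minimal sketch: mix a speech signal with a background masker at a target SNR (dB).
    import numpy as np

    def mix_at_snr(speech, noise, snr_db):
        """Scale noise so that 10*log10(P_speech / P_noise) equals snr_db, then add."""
        p_speech = np.mean(speech ** 2)
        p_noise = np.mean(noise ** 2)
        scale = np.sqrt(p_speech / (p_noise * 10 ** (snr_db / 10)))
        return speech + scale * noise

    rng = np.random.default_rng(3)
    fs = 16000
    speech = rng.standard_normal(fs * 5)      # stand-in for 5 s of speech
    noise = rng.standard_normal(fs * 5)       # stand-in for a stationary masker

    mixture = mix_at_snr(speech, noise, snr_db=30)   # minimal, highly intelligible masking
    achieved = 10 * np.log10(np.mean(speech ** 2) /
                             np.mean((mixture - speech) ** 2))
    print(f"achieved SNR = {achieved:.1f} dB")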

Language: English

Citations

0

Enhanced neural speech tracking through noise indicates stochastic resonance in humans DOI Creative Commons
Björn Herrmann

eLife, Journal Year: 2025, Volume and Issue: 13

Published: March 18, 2025

Neural activity in auditory cortex tracks the amplitude-onset envelope of continuous speech, but recent work counterintuitively suggests that neural tracking increases when speech is masked by background noise, despite reduced intelligibility. Noise-related amplification could indicate that stochastic resonance – response facilitation through noise – supports neural tracking, but a comprehensive account is lacking. In five human electroencephalography experiments, the current study demonstrates a generalized enhancement of neural speech tracking due to minimal background noise. Results show that (1) neural speech tracking is enhanced for speech masked by noise at very high signal-to-noise ratios (~30 dB SNR), where speech is highly intelligible; (2) this enhancement is independent of attention; (3) it generalizes across different stationary maskers, but is strongest for 12-talker babble; and (4) it is present for both headphone and free-field listening, suggesting that the neural-tracking enhancement generalizes to real-life listening. The study paints a clear picture that minimal background noise enhances the neural representation of the speech onset-envelope, which contributes to neural tracking. The study further highlights that the non-linearities in neural tracking induced by background noise make its use as a biological marker for speech processing challenging.
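Both versions of this work model tracking of the amplitude-onset envelope, i.e. the half-wave-rectified derivative of the amplitude envelope. A minimal sketch of deriving it from an audio waveform is shown below; the smoothing cutoff and sampling rates are assumptions, not the author's exact preprocessing.

    # Minimal sketch: amplitude-onset envelope of an audio signal
    # (Hilbert envelope, low-pass smoothed, downsampled, half-wave-rectified derivative).
    import numpy as np
    from scipy.signal import hilbert, butter, filtfilt, resample_poly

    fs_audio, fs_env = 16000, 100
    rng = np.random.default_rng(4)
    audio = rng.standard_normal(fs_audio * 5)          # stand-in for 5 s of speech

    env = np.abs(hilbert(audio))                       # amplitude envelope
    b, a = butter(4, 30, btype="low", fs=fs_audio)     # smooth below ~30 Hz (assumed)
    env = filtfilt(b, a, env)
    env = resample_poly(env, fs_env, fs_audio)         # downsample to 100 Hz

    onset_env = np.maximum(np.diff(env, prepend=env[0]), 0)   # half-wave-rectified derivative
    print(onset_env.shape)                             # 500 samples at 100 Hz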

Language: English

Citations

0

Sing to me, baby: Infants show neural tracking and rhythmic movements to live and dynamic maternal singing DOI Creative Commons
Trinh Nguyen, Susanne Reisner, Anja Lueger et al.

Developmental Cognitive Neuroscience, Journal Year: 2023, Volume and Issue: 64, P. 101313 - 101313

Published: Oct. 24, 2023

Infant-directed singing has unique acoustic characteristics that may allow even very young infants to respond to the rhythms carried through the caregiver's voice. The goal of this study was to examine neural and movement responses to live and dynamic maternal singing in 7-month-old infants and their relation to linguistic development. In total, 60 mother-infant dyads were observed during two singing conditions (playsong and lullaby). In Study 1 (n = 30), we measured infant EEG and used an encoding approach utilizing ridge regressions to measure neural tracking. In Study 2 (n = 40), we coded infant rhythmic movements. In both studies, we assessed children's vocabulary when they were 20 months old. In Study 1, we found above-threshold neural tracking of maternal singing, with superior tracking of lullabies than playsongs. We also found that acoustic features of infant-directed singing modulated neural tracking. In Study 2, infants showed more rhythmic movements to playsongs than lullabies. Importantly, neural coordination (Study 1) and rhythmic movements (Study 2) were positively related to infants' expressive vocabulary at 20 months. These results highlight the importance of infants' brain and movement coordination to live musical presentations, potentially as a function of their rhythmic variability.
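The encoding approach with ridge regression mentioned above maps a time-lagged stimulus feature (for example, the envelope of the singing) onto each EEG channel, and the prediction accuracy on held-out data indexes neural tracking. Below is a minimal sketch on synthetic data; the lag window and regularization are illustrative assumptions.

    # Minimal sketch of a forward (encoding) model: predict each EEG channel from
    # time-lagged stimulus features with ridge regression, then score on held-out data.
    import numpy as np
    from sklearn.linear_model import Ridge

    rng = np.random.default_rng(5)
    fs = 100
    n_samples, n_channels = 12000, 32
    stimulus = rng.standard_normal(n_samples)          # e.g., envelope of maternal singing
    eeg = rng.standard_normal((n_samples, n_channels)) # stand-in infant EEG

    lags = np.arange(0, int(0.5 * fs))                 # 0-500 ms stimulus lags (assumed)
    X = np.zeros((n_samples, lags.size))
    for i, lag in enumerate(lags):
        X[lag:, i] = stimulus[:n_samples - lag]

    split = int(0.8 * n_samples)
    model = Ridge(alpha=1.0).fit(X[:split], eeg[:split])
    pred = model.predict(X[split:])

    # Neural tracking score: per-channel correlation between predicted and observed EEG
    r = [np.corrcoef(pred[:, c], eeg[split:, c])[0, 1] for c in range(n_channels)]
    print(f"mean tracking r = {np.mean(r):.3f}")       # near 0 for random data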

Language: English

Citations

10

Imaging the dancing brain: Decoding sensory, motor and social processes during dyadic dance DOI Creative Commons
Félix Bigand, Roberta Bianco, Sara F. Abalde et al.

bioRxiv (Cold Spring Harbor Laboratory), Journal Year: 2024, Volume and Issue: unknown

Published: Dec. 17, 2024

Abstract Real-world social cognition requires processing and adapting to multiple dynamic information streams. Interpreting neural activity in such ecological conditions remains a key challenge for neuroscience. This study leverages advancements in de-noising techniques and multivariate modeling to extract interpretable EEG signals from pairs of participants engaged in spontaneous dyadic dance. Using multivariate temporal response functions (mTRFs), we investigated how music acoustics, self-generated kinematics, other-generated kinematics, and social coordination each uniquely contributed to EEG activity. Electromyogram recordings from ocular, face, and neck muscles were also modelled to control for muscle artifacts. The mTRFs effectively disentangled neural signals associated with four processes: (I) auditory tracking of music, (II) control of self-generated movements, (III) visual monitoring of partner movements, and (IV) social coordination accuracy. We show that the first three are driven by event-related potentials: the P50-N100-P200 triggered by acoustic events, the central lateralized readiness potential triggered by movement initiation, and the occipital N170 triggered by movement observation. Notably, a (previously unknown) neural marker encodes the spatiotemporal alignment between the dancers, surpassing the encoding of self- or partner-related kinematics taken alone. This marker emerges when partners make visual contact, relies on visual cortical areas, and is specifically driven by movement observation rather than initiation. Using a data-driven kinematic decomposition, we further show that vertical movements best drive observers' neural activity. These findings highlight how real-world neuroimaging, combined with multivariate modelling, can uncover the mechanisms underlying complex yet natural social behaviors.

Significance statement: Real-world brain function involves integrating multiple information streams simultaneously. However, due to a shortfall of computational methods, laboratory-based neuroscience often examines such processes in isolation. Using multivariate modelling of EEG data from pairs of participants freely dancing, we demonstrate that it is possible to tease apart physiologically-established processes associated with music perception, motor control, and the observation of movements produced by the dance partner. Crucially, we identify a previously unknown neural marker of social coordination accuracy beyond the contributions of these biological behaviors, advancing our understanding of how the brain supports interactive social activities.
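One way to illustrate the idea of unique contributions from several information streams is variance partitioning: fit a forward model on concatenated lagged feature banks and compare it with reduced models in which one bank is left out, taking the drop in held-out prediction as that stream's unique contribution. The sketch below uses this simplified approach on synthetic data under assumed lags and regularization; it is not the authors' exact mTRF pipeline.

    # Minimal sketch: estimate the unique EEG contribution of several feature streams
    # (music acoustics, self kinematics, partner kinematics) by comparing a full
    # lagged ridge model against reduced models with one stream left out.
    import numpy as np
    from sklearn.linear_model import Ridge

    rng = np.random.default_rng(6)
    fs, n = 100, 12000
    streams = {
        "acoustics": rng.standard_normal(n),
        "self_kinematics": rng.standard_normal(n),
        "partner_kinematics": rng.standard_normal(n),
    }
    eeg = rng.standard_normal(n)                       # one EEG channel, stand-in data
    lags = np.arange(0, int(0.3 * fs))                 # 0-300 ms lags (assumed)

    def lag_bank(x):
        X = np.zeros((n, lags.size))
        for i, lag in enumerate(lags):
            X[lag:, i] = x[:n - lag]
        return X

    def heldout_r(feature_names):
        X = np.hstack([lag_bank(streams[f]) for f in feature_names])
        split = int(0.8 * n)
        model = Ridge(alpha=1.0).fit(X[:split], eeg[:split])
        pred = model.predict(X[split:])
        return np.corrcoef(pred, eeg[split:])[0, 1]

    full = heldout_r(list(streams))
    for left_out in streams:
        reduced = heldout_r([f for f in streams if f != left_out])
        print(f"unique contribution of {left_out}: {full - reduced:+.3f}")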

Language: English

Citations

3

Measuring self‐similarity in empirical signals to understand musical beat perception DOI Creative Commons
Tomas Lenc, Cédric Lenoir, Peter E. Keller et al.

European Journal of Neuroscience, Journal Year: 2025, Volume and Issue: 61(2)

Published: Jan. 1, 2025

Abstract Experiencing music often entails the perception of a periodic beat. Despite being a widespread phenomenon across cultures, the nature and neural underpinnings of beat perception remain largely unknown. In the last decade, there has been growing interest in developing methods to probe these processes, particularly to measure the extent to which beat-related information is contained in neural and behavioral responses. Here, we propose a theoretical framework and practical implementation of an analytic approach to capture beat-related periodicity in empirical signals using frequency-tagging. We highlight its sensitivity in measuring the extent to which the perceived beat is represented in a range of continuous time-varying signals with minimal assumptions. We also discuss a limitation of this approach with respect to specificity when restricted to measures obtained only from the magnitude spectrum of a signal, and we introduce a novel extension based on autocorrelation to overcome this issue. We test the new autocorrelation-based method on simulated signals and by re-analyzing previously published data, and we show how it can be used to process measurements of brain activity as captured with surface EEG in adults and infants in response to rhythmic inputs. Taken together, these related methodological advances confirm and elaborate frequency-tagging as a promising window into the processes underlying beat perception and, more generally, temporally coordinated behaviors.
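The frequency-tagging logic above amounts to asking how much of a response's energy concentrates at beat-related frequencies in the magnitude spectrum, while the autocorrelation-based extension instead asks how self-similar the signal is at beat-related time lags. A minimal sketch of both measures on a simulated signal follows; the sampling rate, the chosen beat-related and beat-unrelated frequencies, and the comparison lags are illustrative assumptions.

    # Minimal sketch: beat-related periodicity from (1) the magnitude spectrum
    # (frequency-tagging) and (2) the autocorrelation at beat-related lags.
    import numpy as np

    fs = 50                                       # Hz, response sampling rate (assumed)
    t = np.arange(0, 60, 1 / fs)
    rng = np.random.default_rng(7)
    beat_f = 2.0                                  # 2 Hz beat
    signal = np.sin(2 * np.pi * beat_f * t) + rng.standard_normal(t.size)

    # (1) Frequency-tagging: energy at the beat frequency and its harmonics
    #     relative to neighboring, beat-unrelated frequencies.
    spec = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(signal.size, 1 / fs)
    beat_bins = [np.argmin(np.abs(freqs - f)) for f in (2.0, 4.0, 6.0)]
    other_bins = [np.argmin(np.abs(freqs - f)) for f in (1.3, 2.7, 3.4)]
    ft_index = spec[beat_bins].mean() - spec[other_bins].mean()

    # (2) Autocorrelation: self-similarity at the beat period vs. an off-beat lag.
    x = signal - signal.mean()
    ac = np.correlate(x, x, mode="full")[x.size - 1:]
    ac /= ac[0]
    beat_lag = int(fs / beat_f)                   # 0.5 s
    ac_index = ac[beat_lag] - ac[int(0.7 * beat_lag)]

    print(f"frequency-tagging index = {ft_index:.2f}, autocorrelation index = {ac_index:.2f}")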

Language: English

Citations

0