Resting-state functional connectivity changes following audio-tactile speech training
Frontiers in Neuroscience, Journal Year: 2025, Volume and Issue: 19
Published: April 29, 2025
Understanding speech in background noise is a challenging task, especially when the signal is also distorted. In a series of previous studies, we have shown that comprehension can improve if, simultaneously with auditory speech, a person receives speech-extracted low-frequency signals on their fingertips. The effect increases after short audio-tactile training. In this study, we used resting-state functional magnetic resonance imaging (rsfMRI) to measure spontaneous oscillations of the brain at rest and to assess training-induced changes in functional connectivity.
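As a rough illustration of how such training-induced changes in resting-state functional connectivity are commonly quantified, the sketch below shows a generic ROI-to-ROI approach: regional BOLD time courses are correlated, the correlations are Fisher z-transformed, and pre- and post-training values are compared with a paired test. The ROI labels, data shapes, and choice of test are illustrative assumptions, not the analysis pipeline used in the study.

```python
# Minimal sketch of an ROI-to-ROI resting-state functional connectivity
# comparison (pre- vs post-training) with generic numpy/scipy routines.
# ROI labels, data shapes, and the paired t-test are illustrative
# assumptions, not the authors' actual pipeline.
import numpy as np
from scipy import stats

def fc_matrix(timeseries):
    # Pearson correlation between ROI time courses.
    # timeseries: (n_timepoints, n_rois) array of BOLD signals averaged
    # within each region of interest after standard preprocessing.
    return np.corrcoef(timeseries.T)

def fisher_z(r):
    # Fisher r-to-z transform so correlation values can be averaged and tested.
    return np.arctanh(np.clip(r, -0.999999, 0.999999))

# Hypothetical data: 20 subjects, 200 volumes, 4 ROIs (e.g. MT, EBA, LOC, insula).
rng = np.random.default_rng(0)
pre_scans  = [rng.standard_normal((200, 4)) for _ in range(20)]
post_scans = [rng.standard_normal((200, 4)) for _ in range(20)]

pre_z  = np.array([fisher_z(fc_matrix(ts)) for ts in pre_scans])
post_z = np.array([fisher_z(fc_matrix(ts)) for ts in post_scans])

# Paired comparison of one connection (ROI 0 vs ROI 3) across subjects.
t, p = stats.ttest_rel(post_z[:, 0, 3], pre_z[:, 0, 3])
print(f"illustrative edge (ROI 0 - ROI 3): t = {t:.2f}, p = {p:.3f}")
```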
We observed enhanced functional connectivity (FC) within a right-hemisphere cluster corresponding to the middle temporal motion area (MT), the extrastriate body area (EBA), and the lateral occipital cortex (LOC), which, before training, was found to be more connected to the bilateral dorsal anterior insula. Furthermore, early visual areas demonstrated a switch in connectivity around training, becoming more connected after training to a sensory/multisensory association parietal hub contralateral to the palm receiving the vibrotactile inputs. In addition, the right sensorimotor cortex, including the finger representations, showed increased internal connectivity. Altogether, the results can be interpreted within two main, complementary frameworks. The first, speech-specific, factor relates to pre-existing networks for audio-visual speech processing, including visual and motion regions involved in lip-reading and gesture analysis under difficult acoustic conditions, upon which a new network might be built. The other framework refers to spatial/body awareness and integration, both of which are necessary for performing the task, as revealed in the insular regions. It is possible that an extended training period is needed to directly strengthen connections between the regions engaged in this utterly novel multisensory task. The results contribute to a better understanding of the largely unknown neuronal mechanisms underlying tactile benefits and may be relevant for rehabilitation in the hearing-impaired population.
Language: English
Resting-state functional connectivity changes following audio-tactile speech training
bioRxiv (Cold Spring Harbor Laboratory), Journal Year: 2024, Volume and Issue: unknown
Published: Oct. 26, 2024
Abstract
Understanding speech in background noise is a challenging task, especially if the signal is also distorted. In a series of previous studies we have shown that comprehension can improve if, simultaneously with the auditory speech, a person receives speech-extracted low-frequency signals on the fingertips. The effect increases after short audio-tactile training. Here we use resting-state functional magnetic resonance imaging, measuring spontaneous oscillations of the brain at rest, to assess training-induced changes in functional connectivity. We show enhanced connectivity within a right-hemisphere cluster encompassing the middle temporal motion area (MT), the extrastriate body area (EBA), and the lateral occipital cortex (LOC), which before training was found to be more connected to the bilateral dorsal anterior insula. Furthermore, early visual areas switch their connectivity pattern around training, becoming associated after training with a sensory/multisensory parietal hub contralateral to the palm receiving the vibrotactile inputs. The right sensorimotor cortex, including the finger representations, also shows increased internal connectivity. Altogether, the results can be interpreted within two main, complementary frameworks. One, speech-specific, relates to pre-existing networks for audio-visual speech processing, with visual regions involved in lip-reading and gesture analysis under difficult acoustic conditions, which the new network might be built upon. The other refers to spatial/body awareness and integration, revealed in the insular regions. It is possible that an extended training period may be necessary to effectively strengthen direct connections between the regions engaged in this utterly novel task. The outcomes of the study are relevant both for basic neuroscience and for the development of rehabilitation tools for the hearing-impaired population.
Language: English