bioRxiv (Cold Spring Harbor Laboratory),
Journal Year: 2022, Volume and Issue: unknown
Published: Dec. 5, 2022
Abstract
Auditory perception can benefit from stimuli in non-auditory sensory modalities, as for example in lip-reading. Compared with such visual influences, tactile influences are still poorly understood. It has been shown that single tactile pulses can enhance the perception of auditory stimuli depending on their relative timing, but whether and how such brief enhancements can be stretched in time by more sustained, phase-specific periodic tactile stimulation is unclear. To address this question, we presented tactile stimulation that fluctuated coherently and continuously at 4 Hz with an auditory noise (either in-phase or anti-phase) and assessed its effect on the cortical processing of a signal embedded in the noise. Scalp-electroencephalography recordings revealed an enhancing effect of in-phase tactile stimulation on responses phase-locked to the noise fluctuations, and a suppressive effect of anti-phase tactile stimulation on responses evoked by the signal. Although these effects appeared to follow well-known principles of multisensory integration of discrete audio-tactile events, they were not accompanied by corresponding effects in behavioral measures of perception. Our results indicate that continuous tactile stimulation can modulate the cortical processing of acoustically-induced fluctuations in an ongoing masking noise. They further suggest that such sustained modulations are insufficient for inducing bottom-up perceptual benefits.
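For readers who want a concrete picture of the stimulation scheme described in this abstract, the sketch below generates a 4 Hz amplitude-modulated noise together with a tactile envelope that fluctuates coherently either in-phase or anti-phase. It is a minimal illustration, not the authors' stimulus code; the sampling rates, modulation depth, and duration are assumptions.

```python
# Illustrative sketch of a 4 Hz amplitude-modulated noise with a tactile
# envelope that is either in-phase or anti-phase (shifted by half a cycle).
# Sampling rates, modulation depth, and duration are assumed values, not the
# parameters used in the study.
import numpy as np

def make_stimuli(duration_s=2.0, fs_audio=44100, fs_tactile=1000,
                 mod_freq=4.0, phase="in-phase", seed=0):
    rng = np.random.default_rng(seed)

    # Auditory noise whose amplitude fluctuates at the modulation rate
    t_a = np.arange(int(duration_s * fs_audio)) / fs_audio
    noise = rng.standard_normal(t_a.size)
    audio_env = 0.5 * (1.0 + np.sin(2.0 * np.pi * mod_freq * t_a))
    audio = audio_env * noise

    # Tactile envelope fluctuating coherently at the same rate,
    # shifted by half a cycle for the anti-phase condition
    phi = 0.0 if phase == "in-phase" else np.pi
    t_t = np.arange(int(duration_s * fs_tactile)) / fs_tactile
    tactile_env = 0.5 * (1.0 + np.sin(2.0 * np.pi * mod_freq * t_t + phi))

    return audio, tactile_env

audio, tactile = make_stimuli(phase="anti-phase")
```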
Neuromodulation: Technology at the Neural Interface,
Journal Year: 2023, Volume and Issue: 26(8), P. 1549 - 1584
Published: Jan. 31, 2023
Transcranial alternating current stimulation (tACS) has been one of numerous investigational methods used for their potential to modulate brain oscillations; however, such investigations have given contradictory results and lack standardization.

In this systematic review, we aimed to assess the effect of tACS on alpha spectral power. The secondary outcome was the identification of key methodologic parameters, adverse effects, and sensations.

Studies in healthy adults who were receiving active and sham tACS intervention, or any differential condition, were included. The main outcome assessed was the increase/decrease of alpha spectral power measured through either electroencephalography or magnetoencephalography. Secondary outcomes were sensation reporting and adverse effects. Risk of bias and study quality were assessed with the Cochrane assessment tool.

We obtained 1429 references, and 20 met the selection criteria. A statistically significant alpha-power increase was observed in nine studies using continuous tACS and two using intermittent tACS set at a frequency within the alpha range, and in three more studies using a stimulation frequency outside the alpha range. Heterogeneity among stimulation parameters was recognized. Reported adverse effects were mild. The implementation of double blinding was identified as challenging with tACS, in part owing to electrical artifacts generated by the stimulation on the recorded signal.

Most studies reported that tACS increased alpha spectral power. Optimization of this noninvasive method is of interest, mostly for its potential clinical applications in neurological conditions associated with perturbations of alpha activity. However, research efforts are needed to standardize optimal stimulation parameters, achieve lasting modulation, develop alternatives to reduce experimental bias, and improve blinding.
Scientific Reports,
Journal Year: 2024, Volume and Issue: 14(1)
Published: Feb. 28, 2024
Abstract
Haptic hearing aids, which provide speech information through tactile stimulation, could substantially improve outcomes both for cochlear implant users and for those unable to access cochlear implants. Recent advances in wide-band haptic actuator technology have made new audio-to-tactile conversion strategies viable for wearable devices. One such strategy filters the audio into eight frequency bands, which are evenly distributed across the speech frequency range. The amplitude envelopes of the audio bands modulate the amplitudes of eight low-frequency tones, which are delivered through vibration to a single site on the wrist. This tactile vocoder strategy effectively transfers some phonemic information, but vowels and obstruent consonants are poorly portrayed. In 20 participants with normal touch perception, we tested (1) whether focusing the audio filter bands more densely around the first and second formant frequencies improved tactile vowel discrimination, and (2) whether focusing the bands at mid-to-high frequencies improved tactile consonant discrimination. The obstruent-focused approach was found to be ineffective. However, the formant-focused approach improved vowel discrimination by 8%, without changing the overall design of the strategy. The formant-focused tactile vocoder can readily be implemented in real time on a compact device and could improve speech perception for haptic aid users.
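As a rough illustration of the vocoder strategy summarized above (band filtering, envelope extraction, and remapping onto low-frequency vibration carriers at a single wrist site), the following sketch implements an eight-band audio-to-tactile vocoder. The band edges, carrier frequencies, filter orders, and smoothing cutoff are placeholder assumptions, not the parameters of the published strategy.

```python
# Minimal audio-to-tactile vocoder sketch: filter audio into bands, take each
# band's amplitude envelope, and use it to modulate a low-frequency vibration
# carrier delivered at a single site. Band edges and carrier frequencies are
# illustrative assumptions, not the published parameters.
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def tactile_vocode(audio, fs, band_edges, carrier_freqs, env_cutoff_hz=30.0):
    assert len(band_edges) == len(carrier_freqs) + 1
    t = np.arange(audio.size) / fs
    # Low-pass filter used to smooth the band envelopes
    sos_env = butter(4, env_cutoff_hz, btype="low", fs=fs, output="sos")
    out = np.zeros_like(audio)
    for (lo, hi), fc in zip(zip(band_edges[:-1], band_edges[1:]), carrier_freqs):
        sos_band = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        band = sosfiltfilt(sos_band, audio)
        env = np.abs(hilbert(band))              # amplitude envelope of the band
        env = sosfiltfilt(sos_env, env)          # smooth the envelope
        out += env * np.sin(2 * np.pi * fc * t)  # modulate a vibration tone
    return out / len(carrier_freqs)

# Example with placeholder parameters: eight bands and eight carriers
fs = 16000
audio = np.random.randn(fs)                # stand-in for a speech recording
edges = np.linspace(100, 7000, 9)          # eight evenly spaced band edges (Hz), assumed
carriers = np.linspace(50, 300, 8)         # low-frequency vibration tones (Hz), assumed
vibration = tactile_vocode(audio, fs, edges, carriers)
```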
European Journal of Neuroscience,
Journal Year: 2025, Volume and Issue: 61(9)
Published: April 28, 2025
ABSTRACT
Plasticity from auditory experience shapes the brain's encoding and perception of sound. Though stronger neural entrainment (i.e., brain-to-acoustic synchronization) aids speech perception, the underlying oscillatory activity may uniquely interact with long-term auditory experiences (e.g., music training) and short-term plasticity during concurrent speech perception. Here, we explored rapid perceptual learning of speech sounds in normal-hearing young adults who differed in their amount of self-reported music training (defined as "musicians" and "nonmusicians"). Participants learned to identify double-vowel mixtures during ~45 min training sessions with concurrent high-density EEG recordings. We analyzed alpha-band power (7–12 Hz) following a rhythmic speech-stimulus train (~9 Hz) preceding behavioral identification to determine whether increased alpha power (brain-to-speech entrainment) or decreased alpha power (alpha-band suppression) corresponded with task success. Source and directed functional connectivity analyses of the EEG data probed whether behavior was driven by group differences in auditory-motor coupling. Both groups improved with training. Listeners' alpha power prior to the target predicted performance; surprisingly, stronger alpha oscillations were observed for incorrect compared with correct trial responses. We also found stark hemispheric biases in auditory-motor coupling, with greater coupling in the right than the left hemisphere for musicians (R > L) but not for nonmusicians (R = L). The stronger alpha responses preceding incorrect responses support the notion that alpha (~10 Hz) suppression is an important modulator of trial-by-trial success in auditory perceptual processing. Our findings suggest that long-term music experience impacts the oscillatory dynamics and auditory-motor coupling that support rapid perceptual learning of speech.
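To make the alpha-band measure in this abstract concrete, the sketch below computes single-trial 7–12 Hz power from EEG epochs with a Welch spectrum. It is a generic illustration of the quantity described, not the authors' analysis pipeline; the sampling rate, epoch layout, and channel averaging are assumptions.

```python
# Generic sketch: estimate single-trial alpha-band (7-12 Hz) power from EEG
# epochs with a Welch spectrum. Epoch shape and sampling rate are assumptions.
import numpy as np
from scipy.signal import welch

def alpha_power(epochs, fs, band=(7.0, 12.0)):
    """epochs: array of shape (n_trials, n_channels, n_samples)."""
    freqs, psd = welch(epochs, fs=fs, nperseg=min(epochs.shape[-1], fs), axis=-1)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    # Integrate the PSD over the alpha band, then average across channels
    band_power = np.trapz(psd[..., mask], freqs[mask], axis=-1)
    return band_power.mean(axis=1)            # one value per trial

fs = 500                                      # assumed sampling rate (Hz)
epochs = np.random.randn(100, 64, fs)         # 100 trials x 64 channels x 1 s
trial_alpha = alpha_power(epochs, fs)
```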
bioRxiv (Cold Spring Harbor Laboratory),
Journal Year: 2024, Volume and Issue: unknown
Published: Aug. 19, 2024
Abstract
Since childhood, we experience speech as a combination of audio and visual signals, with visual cues being particularly beneficial in difficult auditory conditions. This study investigates an alternative multisensory context for speech, namely audio-tactile, which could prove beneficial for rehabilitation in the hearing-impaired population. We show improved understanding of distorted speech in background noise when it is combined with low-frequency, speech-extracted vibrotactile stimulation delivered on the fingertips. This quick effect might be related to the fact that both the auditory and the tactile signals contain the same type of information. Changes in functional connectivity due to the audio-tactile training are primarily observed in the visual system, including early visual regions, the lateral occipital cortex, the middle temporal motion area, and the extrastriate body area. These effects, despite the lack of visual input during the task, possibly reflect automatic involvement of areas supporting lip-reading and spatial aspects of language, such as gesture observation, in difficult acoustic conditions. For audio-tactile integration, we found increased connectivity of a sensorimotor hub representing the entire body, of the parietal system involved in motor planning based on multisensory inputs, along with several visual areas. After training, connectivity increases in high-order, language-related frontal regions. Overall, the results suggest that the new audio-tactile task activates regions that partially overlap with the established brain network for audio-visual speech processing. This further indicates that neuronal plasticity during perceptual learning is first built upon the existing structural blueprint of connectivity. Further effects were specific to the trained task and behaviour, i.e., perception of the distorted speech signal. Possibly, a longer training regime is required to strengthen the direct pathways between the auditory and somatosensory systems.
Scientific Reports,
Journal Year: 2023, Volume and Issue: 13(1)
Published: Dec. 19, 2023
Vibrotactile stimulation is believed to enhance auditory speech perception, offering potential benefits for cochlear implant (CI) users, who may utilize compensatory sensory strategies. Our study advances previous research by directly comparing tactile intelligibility enhancements in normal-hearing (NH) and CI participants using the same paradigm. Moreover, we assessed the enhancement considering stimulus-non-specific, excitatory effects through an incongruent audio-tactile control condition that did not contain any speech-relevant information. In addition to this control condition, we presented sentences in noise either as audio only or in a congruent audio-tactile condition, with the tactile stimulation providing low-frequency speech-envelope information via a vibrating probe on the index fingertip. The study involved 23 NH listeners and 14 CI users. In both groups, significant intelligibility enhancements were observed for the congruent audio-tactile stimuli (5.3% and 5.4% for the NH and CI participants, respectively), but not for the incongruent control stimulation. These findings replicate previously reported tactile enhancement effects. Juxtaposing previous studies with our research, the informational content of the tactile stimulus emerges as a modulator of speech intelligibility: generally, congruent stimulation enhanced, non-matching stimulation reduced, and neutral stimulation did not change test outcomes. We conclude that the temporal cues provided by the vibrotactile stimulation aid the parsing of continuous speech signals into syllables and words, consequently leading to improvements in speech intelligibility.
NeuroImage,
Journal Year: 2023, Volume and Issue: 274, P. 120140 - 120140
Published: April 28, 2023
Auditory perception can benefit from stimuli in non-auditory sensory modalities, as for example in lip-reading. Compared with such visual influences, tactile influences are still poorly understood. It has been shown that single tactile pulses can enhance the perception of auditory stimuli depending on their relative timing, but whether and how such brief enhancements can be stretched in time by more sustained, phase-specific periodic tactile stimulation is unclear. To address this question, we presented tactile stimulation that fluctuated coherently and continuously at 4 Hz with an auditory noise (either in-phase or anti-phase) and assessed its effect on the cortical processing of a signal embedded in the noise. Scalp-electroencephalography recordings revealed an enhancing effect of in-phase tactile stimulation on responses phase-locked to the noise fluctuations, and a suppressive effect of anti-phase tactile stimulation on responses evoked by the signal. Although these effects appeared to follow well-known principles of multisensory integration of discrete audio-tactile events, they were not accompanied by corresponding effects in behavioral measures of perception. Our results indicate that continuous tactile stimulation can modulate the cortical processing of acoustically-induced fluctuations in an ongoing masking noise. They further suggest that such sustained modulations are insufficient for inducing bottom-up perceptual benefits.
Scientific Reports,
Journal Year: 2023, Volume and Issue: 13(1)
Published: Oct. 3, 2023
Speech understanding, while effortless in quiet conditions, is challenging in noisy environments. Previous studies have revealed that a feasible approach to supplement speech-in-noise (SiN) perception consists of presenting speech-derived signals as haptic input. In the current study, we investigated whether the presentation of a vibrotactile signal derived from the speech temporal envelope can improve SiN intelligibility in a multi-talker background for untrained, normal-hearing listeners. We also determined whether tactile sensitivity, evaluated using vibrotactile detection thresholds, modulates the extent of the audio-tactile improvement. In practice, we measured participants' speech recognition in noise without (audio-only) and with (audio-tactile) concurrent vibrotactile stimulation delivered in three schemes: to the left palm, the right palm, or both. Averaged across the delivery schemes, the tactile stimulation led to a significant SiN improvement of 0.41 dB when compared with the audio-only condition. Notably, there were no differences observed between the improvements across these delivery schemes. In addition, the benefit was significantly predicted by tactile detection threshold levels and unimodal (audio-only) performance. The improvement afforded by the speech-envelope-derived vibrotactile signal is in line with previously uncovered tactile enhancements in untrained listeners without known hearing impairment. Overall, the results highlight the potential of tactile stimulation to improve speech recognition in noise, especially for individuals with poor speech-in-noise abilities, and tentatively more so with increasing tactile sensitivity. Moreover, they lend support to multimodal accounts of speech perception and to research on tactile hearing aid devices.
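As an illustration of how a vibrotactile drive signal can be derived from the speech temporal envelope described above, the sketch below rectifies and low-pass filters the speech waveform and uses the result to modulate a low-frequency vibration carrier. The cutoff and carrier frequencies are assumed values for illustration, not the exact parameters used in the study.

```python
# Illustrative derivation of a vibrotactile drive signal from the speech
# temporal envelope: rectify, low-pass filter, then modulate a vibration
# carrier. Cutoff and carrier frequencies are assumed values.
import numpy as np
from scipy.signal import butter, sosfiltfilt

def speech_envelope_to_vibration(speech, fs, env_cutoff_hz=10.0, carrier_hz=250.0):
    sos = butter(4, env_cutoff_hz, btype="low", fs=fs, output="sos")
    envelope = sosfiltfilt(sos, np.abs(speech))   # slow amplitude envelope
    envelope = np.clip(envelope, 0.0, None)       # keep it non-negative
    t = np.arange(speech.size) / fs
    return envelope * np.sin(2 * np.pi * carrier_hz * t)

fs = 16000
speech = np.random.randn(2 * fs)                  # stand-in for a sentence recording
vibration = speech_envelope_to_vibration(speech, fs)
```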
Ear and Hearing,
Journal Year: 2024, Volume and Issue: 46(1), P. 184 - 195
Published: July 24, 2024
Objectives: Identifying target sounds in challenging environments is crucial for daily experiences. It is important to note that this ability can be enhanced by nonauditory stimuli, for example, through lip-reading during an ongoing conversation. However, how tactile stimuli affect auditory processing is still relatively unclear. Recent studies have shown that brief tactile stimuli can reliably facilitate auditory perception, while studies using longer-lasting audio-tactile stimulation have yielded conflicting results. This study aimed to investigate the impact of ongoing pulsating tactile stimulation on basic auditory processing.

Design: In experiment 1, the electroencephalogram (EEG) was recorded while 24 participants performed a loudness-discrimination task on a 4-Hz modulated tone-in-noise and received either in-phase, anti-phase, or no electrotactile stimulation above the median nerve. In experiment 2, another group of participants was presented with the same stimuli as before, but performed a detection task while their selective attention was manipulated.

Results: We found that in-phase tactile stimulation enhanced EEG responses to the tone, whereas anti-phase stimulation suppressed these responses. No corresponding effects on task performance were observed in experiment 1. Using a yes/no paradigm in experiment 2, we found that tactile stimulation, whether in-phase or anti-phase, did not improve detection thresholds. Selective attention improved detection thresholds but did not modulate any benefit from the tactile stimulation.

Conclusions: Our study highlights that continuous tactile input can enhance cortical responses to auditory stimuli, as reflected in scalp EEG recordings, which might have implications for the development of hearing enhancement technologies and interventions.
Neurons, Behavior, Data Analysis, and Theory,
Journal Year: 2024, Volume and Issue: unknown
Published: Oct. 16, 2024
Neurophysiology research has demonstrated that it is possible and valuable to investigate sensory processing in scenarios involving continuous sensory streams, such as speech and music. Over the past 10 years or so, novel analytic frameworks combined with the growing participation in data sharing have led to a surge of publicly available datasets from such continuous-stimulus experiments. However, open science efforts in this domain remain scattered, lacking a cohesive set of guidelines. This paper presents an end-to-end framework for the storage, analysis, sharing, and re-analysis of neural data recorded during continuous sensory experiments. We propose a data structure that builds on existing custom structures (Continuous-event Neural Data, CND), providing precise naming conventions for the data types, as well as a workflow for storing and loading data in the general-purpose BIDS structure. The framework has been designed to interface with existing EEG/MEG analysis toolboxes, such as Eelbrain, NAPLib, MNE, and the mTRF-Toolbox. We present the guidelines by taking both the user view (rapidly re-analyse existing data) and the experimenter view (store, analyse, and share), making the process straightforward and accessible. Additionally, we introduce a web-based data browser that enables effortless replication of published results and data re-analysis.
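To give a sense of the intended "user view" workflow, the sketch below loads one subject's continuous EEG from a BIDS-formatted dataset with MNE-BIDS, ready to be passed to downstream toolboxes such as the mTRF-Toolbox or Eelbrain. The dataset root, subject label, and task name are hypothetical placeholders; the specific CND-to-BIDS conventions are defined in the paper itself rather than here.

```python
# Sketch of the "user view": load one subject's continuous EEG from a
# BIDS-formatted dataset with MNE-BIDS. Path, subject, and task names are
# hypothetical placeholders, not identifiers defined by the framework.
from mne_bids import BIDSPath, read_raw_bids

bids_path = BIDSPath(
    root="/data/natural-speech-bids",  # hypothetical dataset root
    subject="01",
    task="listening",                  # hypothetical task label
    datatype="eeg",
    suffix="eeg",
)
raw = read_raw_bids(bids_path)         # returns an mne.io.Raw object
raw.load_data()
print(raw.info)
```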
The Journal of the Acoustical Society of America,
Journal Year: 2023, Volume and Issue: 153(5), P. 3130 - 3130
Published: May 1, 2023
Seeing a speaker's face can help substantially with understanding their speech, particularly in challenging listening conditions. Research into the neurobiological mechanisms behind audiovisual integration has recently begun to employ continuous natural speech. However, these efforts are impeded by a lack of high-quality audiovisual recordings of a speaker narrating a longer text. Here, we seek to close this gap by developing AVbook, an audiovisual speech corpus designed for cognitive neuroscience studies and audiovisual speech recognition. The corpus consists of 3.6 h of recordings of two speakers, one male and one female, each reading 59 passages from a narrative English text. The videos were acquired at a high frame rate of 119.88 frames/s. The corpus includes phone-level alignment files and a set of multiple-choice questions to test participants' attention to the different passages. We verified the efficacy of the questions in a pilot study. A short written summary is also provided for each recording. To enable synchronization when presenting the stimuli, four videos of an electronic clapperboard were recorded with the corpus. The corpus is publicly available to support research into the neurobiology of audiovisual speech processing as well as the development of computer lip-reading algorithms.