Chaos: An Interdisciplinary Journal of Nonlinear Science, 2025, 35(3). Published: March 1, 2025
In networked systems, the interplay between the dynamics of individual subsystems and their network interactions has been found to generate multistability in various contexts. Despite its ubiquity, the specific mechanisms and ingredients that give rise to such multistability remain poorly understood. In a system of coupled excitable units, we demonstrate that this multistability arises through a competition between the units' transient dynamics and their coupling. Specifically, the diffusive coupling between the units reinjects them into the excitability region of their state space, effectively trapping them there. We show that this mechanism leads to the coexistence of multiple types of oscillations: periodic, quasi-periodic, and even chaotic, although the units separately do not oscillate. Interestingly, we find that these attractors emerge through different bifurcations: in particular, the periodic attractors arise via either saddle-node bifurcations of limit cycles or homoclinic bifurcations. In all cases, however, the reinjection mechanism is present.
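The kind of excitable unit the abstract refers to can be illustrated with a minimal sketch. The following is a generic toy, not the paper's model: two FitzHugh-Nagumo units in a standard excitable regime (parameter values are common textbook choices, assumed here), with the diffusive coupling term `d * (v_other - v_self)` that the abstract describes. With `d = 0`, a perturbed unit fires one transient spike and returns to rest; whether coupling sustains oscillations depends on parameters and is not claimed by this sketch.

```python
import numpy as np

def fhn_pair(d=0.0, a=0.7, b=0.8, eps=0.08, dt=0.05, T=300.0,
             init=((1.0, 0.0), (-1.2, -0.62))):
    """Euler-integrate two diffusively coupled FitzHugh-Nagumo units.

    Each uncoupled unit is excitable: it has a stable rest state near
    (v, w) = (-1.20, -0.62) and fires a single transient spike when
    perturbed past threshold. The term d * (v_other - v_self) is the
    diffusive coupling discussed in the abstract; parameters are a
    standard excitable regime, not those fitted in the paper.
    """
    n = int(round(T / dt))
    v = np.array([init[0][0], init[1][0]], float)
    w = np.array([init[0][1], init[1][1]], float)
    vs = np.empty((n, 2))
    for i in range(n):
        dv = v - v**3 / 3 - w + d * (v[::-1] - v)  # v[::-1] swaps units
        dw = eps * (v + a - b * w)
        v = v + dt * dv
        w = w + dt * dw
        vs[i] = v
    return vs

# Uncoupled case: unit 0 starts perturbed, spikes once, then relaxes
# back to the rest state; unit 1 starts at rest and stays there.
traj = fhn_pair(d=0.0)
```

Increasing `d` from zero is where, per the abstract, reinjection into the excitability region can trap the pair in sustained oscillation.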
Nature Communications, 2023, 14(1). Published: Feb. 23, 2023
Abstract
Humans and other animals demonstrate a remarkable ability to generalize knowledge across distinct contexts and objects during natural behavior. We posit that this ability arises from a specific representational geometry, which we call abstract and which is referred to as disentangled in machine learning. These abstract representations have been observed in recent neurophysiological studies. However, it is unknown how they emerge. Here, using feedforward neural networks, we demonstrate that the learning of multiple tasks causes abstract representations to emerge, in both supervised and reinforcement learning. We show that these representations enable few-sample learning and reliable generalization on novel tasks. We conclude that abstract representations of sensory and cognitive variables may emerge from the multiple behaviors that animals exhibit in the natural world and, as a consequence, could be pervasive in high-level brain regions. We also make several predictions about which variables will be represented abstractly.
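Why an abstract (disentangled) geometry supports generalization can be shown with a toy decoding test in the spirit of cross-condition generalization analyses. This is a sketch with a hand-built representation, not the paper's trained networks: two binary variables are encoded along independent directions, and a linear decoder for one variable, trained while the other is held fixed, transfers to conditions where the other variable changes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two binary latent variables (e.g., context and object identity).
# A disentangled representation encodes them along independent,
# orthogonal directions of state space (toy construction).
dim = 50
u = rng.normal(size=dim); u /= np.linalg.norm(u)   # axis for variable A
v = rng.normal(size=dim); v -= (v @ u) * u          # axis for variable B,
v /= np.linalg.norm(v)                              # orthogonalized to u

def population(a, b, n=200, noise=0.1):
    """n noisy population responses for condition (A=a, B=b)."""
    return a * u + b * v + noise * rng.normal(size=(n, dim))

# Train a least-squares linear decoder for A using only B = -1 conditions.
X_tr = np.vstack([population(+1, -1), population(-1, -1)])
y_tr = np.array([+1.0] * 200 + [-1.0] * 200)
w = np.linalg.lstsq(X_tr, y_tr, rcond=None)[0]

# Test on held-out B = +1 conditions: the decoder transfers because
# the A axis is the same regardless of B.
X_te = np.vstack([population(+1, +1), population(-1, +1)])
y_te = np.array([+1.0] * 200 + [-1.0] * 200)
acc = np.mean(np.sign(X_te @ w) == y_te)
```

An entangled geometry (e.g., a random nonlinear mixture of A and B) would break exactly this transfer, which is the property the abstract's "few-sample, reliable generalization" claim rests on.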
Nature Neuroscience, 2024, 27(5), P. 988-999. Published: March 18, 2024
Abstract
A fundamental human cognitive feat is to interpret linguistic instructions in order to perform novel tasks without explicit task experience. Yet, the neural computations that might be used to accomplish this remain poorly understood. We use advances in natural language processing to create a neural model of generalization based on linguistic instructions. Models are trained on a set of common psychophysical tasks and receive instructions embedded by a pretrained language model. Our best models can perform a previously unseen task with an average performance of 83% correct based solely on linguistic instructions (that is, zero-shot learning). We found that language scaffolds sensorimotor representations such that activity for interrelated tasks shares a common geometry with the semantic representations of instructions, allowing language to cue the proper composition of practiced skills in unseen settings. We also show how the model generates a linguistic description of a novel task it has identified using only motor feedback, which can subsequently guide a partner model to perform the task. Our models offer several experimentally testable predictions outlining how linguistic information must be represented to facilitate flexible and general cognition in the human brain.
Nature Neuroscience, 2025, Volume and Issue: unknown. Published: Jan. 17, 2025
The manner in which neural activity unfolds over time is thought to be central to sensory, motor and cognitive functions in the brain. Network models have long posited that the brain's computations involve time courses of activity that are shaped by the underlying network. A prediction from this view is that these activity time courses should be difficult to violate. We leveraged a brain-computer interface to challenge monkeys to violate the naturally occurring time courses of neural population activity that we observed in motor cortex. This included challenging animals to traverse the natural activity time course in a time-reversed manner. Animals were unable to do so when directly challenged. These results provide empirical support for the view that activity time courses observed in the brain indeed reflect the network-level computational mechanisms that they are believed to implement.
Nature Neuroscience, 2025, Volume and Issue: unknown. Published: Feb. 10, 2025
Higher cortical areas carry a wide range of sensory, cognitive and motor signals mixed in the heterogeneous responses of single neurons tuned to multiple task variables. Dimensionality reduction methods that rely on correlations between neural activity and task variables leave unknown how these heterogeneous responses arise from connectivity to drive behavior. We develop the latent circuit model, a dimensionality reduction approach in which task variables interact via low-dimensional recurrent connectivity to produce behavioral output. We apply latent circuit inference to recurrent neural networks trained to perform a context-dependent decision-making task and find a suppression mechanism in which contextual representations inhibit irrelevant sensory responses. We validate this mechanism by confirming the behavioral effects of patterned connectivity perturbations predicted by the model. We find a similar suppression mechanism in the prefrontal cortex of monkeys performing the same task. Our results show that incorporating causal interactions among task variables is critical for identifying behaviorally relevant computations from neural response data.
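The structure of a latent circuit can be sketched schematically. The sizes, random weights, and readout below are illustrative assumptions, not the authors' fitted parameters: a small recurrent circuit among task-variable nodes produces the behavioral output, while an orthonormal embedding `Q` maps the low-dimensional state into heterogeneous single-neuron responses (so each neuron mixes several variables).

```python
import numpy as np

rng = np.random.default_rng(1)

n_lat, n_neu, n_in, T = 6, 120, 4, 100   # illustrative sizes only
W_rec = 0.5 * rng.normal(size=(n_lat, n_lat)) / np.sqrt(n_lat)
W_in = rng.normal(size=(n_lat, n_in))
# Orthonormal embedding Q: latent circuit nodes -> heterogeneous
# neural responses (y = Q @ z mixes task variables in each neuron).
Q, _ = np.linalg.qr(rng.normal(size=(n_neu, n_lat)))
w_out = rng.normal(size=n_lat)           # behavioral readout (assumed)

def run(u, dt=0.1):
    """Integrate the low-dimensional circuit and project to neurons."""
    z = np.zeros(n_lat)
    ys, outs = [], []
    for t in range(len(u)):
        z = z + dt * (-z + np.tanh(W_rec @ z + W_in @ u[t]))
        ys.append(Q @ z)        # predicted high-dimensional responses
        outs.append(w_out @ z)  # behavioral output (choice variable)
    return np.array(ys), np.array(outs)

u = rng.normal(size=(T, n_in))  # stand-in for context + sensory inputs
Y, out = run(u)
```

Inference then amounts to fitting `Q`, `W_rec`, and the readout so that `Y` matches recorded responses and `out` matches behavior, which is what lets patterned perturbations of the inferred connectivity make testable behavioral predictions.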
PLoS Computational Biology, 2024, 20(2), P. e1011852. Published: Feb. 5, 2024
Neural oscillations are ubiquitously observed in many brain areas. One proposed functional role of these oscillations is that they serve as an internal clock, or 'frame of reference'. Information can be encoded by the timing of neural activity relative to the phase of such oscillations. In line with this hypothesis, there have been multiple empirical observations of such phase codes in the brain. Here we ask: What kind of neural dynamics can support phase coding of information with neural oscillations? We tackled this question by analyzing recurrent neural networks (RNNs) that were trained on a working memory task. The networks were given access to an external reference oscillation and tasked to produce an oscillation such that the phase difference between the reference and output oscillations maintains the identity of transient stimuli. We found that the networks converged to stable oscillatory dynamics. Reverse engineering these networks revealed that each phase-coded memory corresponds to a separate limit cycle attractor. We characterized how the stability of these attractors depends on both the amplitude and frequency of the reference oscillation, properties that can be experimentally observed. To understand the connectivity structures that underlie these dynamics, we showed that the trained networks can be described as two phase-coupled oscillators. Using this insight, we condensed our trained networks to a reduced model consisting of two functional modules: one that generates an oscillation and one that implements a coupling function with the external reference. In summary, by reverse engineering trained RNNs, we propose a mechanism by which neural networks can harness reference oscillations for working memory. Specifically, a phase-coding network generates autonomous oscillations, which it couples to the external reference in a multi-stable fashion.
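The two-oscillator reduction admits a very compact sketch. The sinusoidal coupling function below is a generic phase-reduction toy, assumed for illustration rather than extracted from the trained RNNs: with a coupling of period 2*pi/n_mem in the phase difference, the network phase locks to the reference at n_mem distinct values, each acting as a separate attractor that stores one memory.

```python
import numpy as np

def settle(phi0, K=1.0, n_mem=3, dt=0.01, steps=5000):
    """Relax the phase difference phi between the network's autonomous
    oscillation and the external reference oscillation.

    d(phi)/dt = -K * sin(n_mem * phi) is a toy coupling function with
    n_mem stable locked phase differences (at multiples of
    2*pi/n_mem), each playing the role of one limit-cycle attractor.
    """
    phi = phi0
    for _ in range(steps):
        phi += dt * (-K * np.sin(n_mem * phi))
    return phi

# Different initial phase differences fall into different locked
# states; map each settled phase to its attractor index 0..n_mem-1.
states = {int(round(settle(p) * 3 / (2 * np.pi))) % 3
          for p in np.linspace(0.1, 6.2, 25)}
```

The multistability of the locked states is what lets a transient stimulus, by kicking the phase into a different basin, be remembered indefinitely in the phase difference.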
bioRxiv (Cold Spring Harbor Laboratory), 2024, Volume and Issue: unknown. Published: March 6, 2024
Abstract
Humans and animals have an impressive ability to juggle multiple tasks in a constantly changing environment. This flexibility, however, leads to decreased performance under uncertain task conditions. Here, we combined monkey electrophysiology, human psychophysics, and artificial neural network modeling to investigate the neuronal mechanisms of this cost. We developed a behavioural paradigm to measure the influence of task uncertainty on participants' decision-making and perception in two distinct perceptual tasks. Our data revealed that both humans and monkeys, unlike networks trained for the same tasks, make less accurate decisions when the task is uncertain. We generated a mechanistic hypothesis by comparing networks trained to produce correct choices with another network trained to replicate participants' choices. We hypothesized, and confirmed with further behavioural, physiological, and causal experiments, that the cost of flexibility comes from what we term task interference. Under uncertain conditions, interference between different tasks causes errors because it results in a stronger representation of irrelevant task features and entangled representations of the features of both tasks. We suggest a tantalizing, general hypothesis: that cognitive capacity limitations, in both health and disease, stem from interference between tasks, stimuli, or memories.
Journal of Neuroscience, 2022, 42(45), P. 8514-8523. Published: Nov. 9, 2022
Biological neural networks adapt and learn in diverse behavioral contexts. Artificial neural networks (ANNs) have exploited biological properties to solve complex problems. However, despite their effectiveness for specific tasks, ANNs have yet to realize the flexibility and adaptability of biological cognition. This review highlights recent advances in computational and experimental research that advance our understanding of biological and artificial intelligence. In particular, we discuss critical mechanisms from the cellular, systems, and cognitive neuroscience fields that have contributed to refining the architecture and training algorithms of ANNs. Additionally, we discuss how recent work has used ANNs to understand the neuronal correlates of cognition and to process high-throughput data.