Scientific Reports, Journal Year: 2023, Volume and Issue: 13(1), Published: Aug. 9, 2023
Abstract
The statistical structure of the environment is often important when making decisions. There are multiple theories of how the brain represents statistical structure. One such theory states that neural activity spontaneously samples from probability distributions. In other words, the network spends more time in states which encode high-probability stimuli. Starting from the neural assembly, increasingly thought to be a building block for computation in the brain, we focus on how arbitrary prior knowledge about the external world can be both learned and recollected. We present a model based upon learning the inverse cumulative distribution function. Learning is entirely unsupervised, using biophysical neurons and biologically plausible learning rules. We show how this prior knowledge can then be accessed to compute expectations and signal surprise in downstream networks. Sensory history effects emerge as a consequence of ongoing learning.
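The two ideas in this abstract, sampling (states encoding high-probability stimuli are visited more often) and learning an inverse cumulative distribution function, can be illustrated with a minimal numerical sketch. This is a hypothetical illustration of inverse-transform sampling and a histogram-based surprise signal, not the authors' neural implementation; the stimulus distribution and all parameters are assumed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed stimulus distribution over a 1-D feature (Gaussian, mean 2.0).
stimuli = rng.normal(loc=2.0, scale=0.5, size=10_000)

# "Learn" the inverse CDF from observations: sorting the samples gives
# an empirical quantile function.
sorted_stimuli = np.sort(stimuli)

def inverse_cdf(u):
    """Map uniform samples u in [0, 1) to stimulus values (empirical quantiles)."""
    idx = (u * len(sorted_stimuli)).astype(int)
    return sorted_stimuli[idx]

# Spontaneous activity modeled as uniform noise pushed through the learned
# inverse CDF: high-probability stimuli are sampled (visited) more often.
samples = inverse_cdf(rng.uniform(size=10_000))

# Surprise for an observation: -log p(x), estimated from a histogram.
hist, edges = np.histogram(stimuli, bins=50, density=True)

def surprise(x):
    bin_idx = np.clip(np.searchsorted(edges, x) - 1, 0, len(hist) - 1)
    return -np.log(hist[bin_idx] + 1e-12)

print(np.mean(samples))            # tracks the stimulus mean
print(surprise(2.0), surprise(4.0))  # common stimuli are less surprising
```

Because uniform noise mapped through the inverse CDF reproduces the stimulus distribution, time spent per state is automatically proportional to stimulus probability, which is the core of the sampling hypothesis described above.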
Frontiers in Computational Neuroscience, Journal Year: 2023, Volume and Issue: 17, Published: June 28, 2023
Abstract
Although it may appear infeasible and impractical, building artificial intelligence (AI) using a bottom-up approach based on the understanding of neuroscience is straightforward. The lack of a generalized governing principle for biological neural networks (BNNs) forces us to address this problem by converting piecemeal information on the diverse features of neurons, synapses, and circuits into AI. In this review, we described recent attempts to build biologically plausible neural networks by following neuroscientifically similar strategies of network optimization or by implanting the outcome of optimization, such as the properties of single computational units and the characteristics of the network architecture. In addition, we proposed a formalism of the relationship between the set of objectives that networks attempt to achieve and classes categorized by how closely their architectural features resemble those of a BNN. This review is expected to define the potential roles of top-down approaches and to offer a map helping navigation of the gap between neuroscience and AI engineering.
bioRxiv (Cold Spring Harbor Laboratory), Journal Year: 2023, Volume and Issue: unknown, Published: May 26, 2023
Abstract
A tool to map changes in synaptic strength during a defined time window could provide powerful insights into the mechanisms governing learning and memory. We developed a technique, Extracellular Protein Surface Labeling in Neurons (EPSILON), to map α-amino-3-hydroxy-5-methyl-4-isoxazolepropionic acid receptor (AMPAR) insertion in vivo by pulse-chase labeling of surface AMPARs with membrane-impermeable dyes. This approach allows for single-synapse-resolution maps of plasticity in genetically targeted neurons during memory formation. We investigated the relationship between synapse-level and cell-level memory encodings by mapping cFos expression in hippocampal CA1 pyramidal cells upon contextual fear conditioning (CFC). We observed a strong correlation between synaptic potentiation and cFos expression, suggesting a synaptic mechanism for the association of engrams. The EPSILON technique is a useful tool and may be extended to investigate the trafficking of other transmembrane proteins.
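The reported synapse-level versus cell-level comparison reduces, analytically, to correlating a per-cell summary of synaptic labeling with per-cell cFos intensity. The sketch below shows that analysis shape on synthetic data; the data, effect size, and noise model are all assumptions for illustration, not the study's measurements.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic per-cell quantities (hypothetical stand-ins for imaging data):
# summed EPSILON labeling per cell, and cFos intensity modeled as a noisy
# increasing function of that synaptic potentiation signal.
n_cells = 200
potentiation = rng.gamma(shape=2.0, scale=1.0, size=n_cells)
cfos = 0.8 * potentiation + rng.normal(scale=0.5, size=n_cells)

# Pearson correlation between the synapse-level and cell-level measures.
r = np.corrcoef(potentiation, cfos)[0, 1]
print(f"Pearson r = {r:.2f}")  # strongly positive under this synthetic model
```

A real analysis would add the imaging-specific steps (segmentation, background subtraction, per-cell normalization) before this correlation step.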