American Journal of Neurodegenerative Disease, Journal Year: 2024, Volume and Issue: 13(5), P. 49 - 69. Published: Jan. 1, 2024
This study aims to explore the capabilities of dendritic learning within feedforward tree networks (FFTN) in comparison with traditional synaptic plasticity models, particularly in the context of digit recognition tasks using the MNIST dataset. We employed FFTNs with nonlinear segment amplification and Hebbian learning rules to enhance computational efficiency. The dataset, consisting of 70,000 images of handwritten digits, was used for training and testing. Key performance metrics, including accuracy, precision, recall, and F1-score, were analysed. The dendritic learning models significantly outperformed the synaptic plasticity-based models across all metrics. Specifically, the FFTN framework achieved a test accuracy of 91%, compared with 88% for the synaptic plasticity models, demonstrating superior classification. Dendritic learning offers a more powerful approach by closely mimicking biological neural processes, providing enhanced efficiency and scalability. These findings have important implications for advancing both artificial intelligence systems and neuroscience.
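The abstract gives no implementation details; as an illustration only, here is a minimal sketch of the outer-product Hebbian rule it refers to, in a 10-class, 784-input setting matching MNIST dimensions. The random stand-in data, learning rate, and seed are placeholders, not the study's:

```python
import numpy as np

def hebbian_update(W, x, y, lr=0.01):
    """One Hebbian step: strengthen weights between co-active
    input x and output y (outer-product rule, dW = lr * y x^T)."""
    return W + lr * np.outer(y, x)

rng = np.random.default_rng(0)
W = rng.normal(scale=0.01, size=(10, 784))  # 10 digit classes, 28x28 inputs

x = rng.random(784)        # a flattened stand-in for an MNIST image
target = np.eye(10)[3]     # supervised variant: clamp output to label 3
W = hebbian_update(W, x, target)

scores = W @ x             # inference: class with the largest response wins
```

A supervised variant is shown (the output is clamped to the label during the update); the study's actual FFTN architecture with nonlinear segment amplification is more elaborate than this sketch.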
Nature Communications, Journal Year: 2025, Volume and Issue: 16(1). Published: Jan. 22, 2025
Artificial neural networks (ANNs) are at the core of most Deep Learning (DL) algorithms that successfully tackle complex problems like image recognition, autonomous driving, and natural language processing. However, unlike biological brains, which solve similar problems in a very efficient manner, DL algorithms require a large number of trainable parameters, making them energy-intensive and prone to overfitting. Here, we show that a new ANN architecture that incorporates the structured connectivity and restricted sampling properties of dendrites counteracts these limitations. We find that dendritic ANNs are more robust to overfitting and match or outperform traditional ANNs on several classification tasks while using significantly fewer trainable parameters. These advantages likely result from a different learning strategy, whereby nodes respond to multiple classes, while classical ANNs strive for class-specificity. Our findings suggest that the incorporation of dendritic properties can make ANNs more precise, resilient, and parameter-efficient, and shed light on how biological features impact the learning strategies of ANNs.
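The paper attributes these gains to structured, restricted dendritic sampling; assuming a simple reading of that idea, a sparse "dendritic" layer in which each unit samples only a small fixed patch of the input can be sketched as follows. The layer sizes, mask construction, and nonlinearity are illustrative choices, not the paper's architecture:

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_dend, k = 784, 64, 16   # each "dendrite" samples only k of 784 inputs

# Fixed binary mask: dendrite i connects to k randomly chosen inputs.
mask = np.zeros((n_dend, n_in))
for i in range(n_dend):
    mask[i, rng.choice(n_in, size=k, replace=False)] = 1.0

W = rng.normal(scale=0.1, size=(n_dend, n_in))

def dendritic_layer(x):
    """Restricted sampling plus a local nonlinearity: the mask keeps
    only k inputs per dendrite; tanh stands in for dendritic
    amplification."""
    return np.tanh((W * mask) @ x)

x = rng.random(n_in)
out = dendritic_layer(x)
```

The mask leaves 64 × 16 = 1,024 effective weights where a dense layer would use 64 × 784 = 50,176, which is the kind of parameter saving the abstract describes.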
Nature Communications, Journal Year: 2025, Volume and Issue: 16(1). Published: Jan. 2, 2025
Biological neural circuits demonstrate exceptional adaptability to diverse tasks by dynamically adjusting connections to process information efficiently. However, current two-dimensional-materials-based neuromorphic hardware mainly focuses on specific devices that individually mimic an artificial synapse, heterosynapse, or soma and on encoding their inner states to realize the corresponding function of the mocked object. Recent advancements suggest that integrating multiple material-based brain-like functions, including through inter-mutual connecting assembly engineering, has become a new research trend. In this work, we present MoS2-based reconfigurable analog modules that emulate synaptic, heterosynaptic, and somatic functionalities. The inner states and inter-connections of all modules co-encode versatile functions such as analog-to-digital/digital-to-analog conversion, linear/nonlinear computation and integration, vector-matrix multiplication, and convolution, to name a few. By assembling the modules to fit different environment-interactive, demanding tasks, the system experimentally achieves image reconstruction and sharpening of medical images for diagnosis, as well as circuit-level imitation of attention-switching and visual-residual mechanisms for smart perception. This innovative approach promotes the development of future general-purpose neuromorphic computing machines with high flexibility across tasks.
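Vector-matrix multiplication is the primitive that crossbar-style analog hardware computes physically via Ohm's and Kirchhoff's laws: stored conductances multiply applied voltages, and each output line sums the resulting currents. A toy numerical illustration (the conductance and voltage values below are hypothetical, not measured device states):

```python
import numpy as np

# In a crossbar, stored conductances G (siemens) multiply applied
# voltages V in one physical step: I_i = sum_j G[i, j] * V[j].
G = np.array([[1.0e-6, 2.0e-6],
              [3.0e-6, 4.0e-6]])   # hypothetical device conductance states
V = np.array([0.1, 0.2])           # input voltages

I = G @ V                          # output currents = vector-matrix product
```

Convolution reduces to the same primitive once the kernel is unrolled into a matrix, which is why both operations appear in the list of co-encoded functions above.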
How external/internal ‘state’ is represented in the brain is crucial, since an appropriate representation enables goal-directed behavior. Recent studies suggest that state representation and value can be simultaneously learnt through reinforcement learning (RL) using reward-prediction-error-based updates of a recurrent neural network (RNN) and its downstream weights. However, how such learning is neurally implemented remains unclear, because training of the RNN by the ‘backpropagation’ method requires the downstream weights, which are biologically unavailable at the upstream RNN. Here we show that random feedback used instead of the downstream weights still works, through ‘feedback alignment’, which was originally demonstrated for supervised learning. We further show that if the feedback and weights are constrained to be non-negative, learning still occurs: even without strict feedback alignment, the non-negative constraint itself ensures loose alignment. These results suggest neural mechanisms for the RL of state representation and value, and the power of biological constraints.
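A minimal numerical sketch of feedback alignment: the backward pass routes the error through a fixed random matrix B rather than the transpose of the forward weights. The two-layer feedforward network, toy regression target, learning rate, and seed below are illustrative choices, not the study's RNN setting:

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 4, 16, 1
W1 = rng.normal(scale=0.5, size=(n_hid, n_in))
W2 = rng.normal(scale=0.5, size=(n_out, n_hid))
B = rng.normal(scale=0.5, size=(n_hid, n_out))  # fixed random feedback, replaces W2.T

X = rng.normal(size=(200, n_in))
y = (X @ np.array([1.0, -2.0, 0.5, 0.0]))[:, None]  # toy linear target

loss_init = float(np.mean((np.tanh(X @ W1.T) @ W2.T - y) ** 2))

lr = 0.02
for _ in range(500):
    h = np.tanh(X @ W1.T)            # hidden activity
    err = h @ W2.T - y               # output error
    dh = (err @ B.T) * (1 - h ** 2)  # error routed through fixed B, not W2.T
    W2 -= lr * err.T @ h / len(X)
    W1 -= lr * dh.T @ X / len(X)

loss = float(np.mean((np.tanh(X @ W1.T) @ W2.T - y) ** 2))
```

Despite B never being trained, the loss decreases because the forward weights drift into loose alignment with B, which is the effect the abstract builds on.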
bioRxiv (Cold Spring Harbor Laboratory), Journal Year: 2024, Volume and Issue: unknown. Published: Sept. 17, 2024
Abstract
The groundbreaking work of Hubel and Wiesel has been instrumental in shaping our understanding of V1, leading to the modeling of neural responses as cascades of linear and nonlinear processes, in what has come to be known as the “standard model” of vision. Under this formulation, however, some dendritic properties cannot be represented in a practical manner, while extensive evidence indicates that dendrites are an indispensable element of key behaviours. As a result, current V1 models fail to explain a number of scenarios. In this work, we propose an implicit model for V1 that considers the integration and backpropagation of action potentials from the soma to the dendrites. This is a parsimonious scheme that minimizes energy, allows a better conceptual understanding of neural processes, and explains several neurophysiological phenomena that have challenged classical approaches.
Journal of Neurophysiology, Journal Year: 2023, Volume and Issue: 130(4), P. 910 - 924. Published: Aug. 23, 2023
Rhythmic activity is ubiquitous in neural systems, with theta-resonant pyramidal neurons integrating rhythmic inputs in many cortical structures. Impedance analysis has been widely used to examine the frequency-dependent responses of neuronal membranes to such inputs, but it assumes that the membrane is a linear system, requiring the use of small signals to stay in the near-linear regime. However, postsynaptic potentials are often large and trigger nonlinear mechanisms (voltage-gated ion channels).
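The linear-system assumption can be made concrete with the impedance of a purely passive RC membrane, |Z(f)| = R / sqrt(1 + (2πfRC)²), a low-pass profile with no resonance. The component values below are illustrative, not from the study:

```python
import numpy as np

R = 100e6    # membrane resistance, ohms (100 MOhm)
C = 100e-12  # membrane capacitance, farads (100 pF) -> tau = RC = 10 ms

def impedance_magnitude(f_hz):
    """|Z(f)| of a passive (linear) RC membrane: a low-pass filter
    that decays monotonically with frequency and shows no resonance."""
    return R / np.sqrt(1 + (2 * np.pi * f_hz * R * C) ** 2)

freqs = np.array([1.0, 4.0, 8.0, 40.0])  # Hz, spanning the theta band
mags = impedance_magnitude(freqs)
```

Theta resonance requires additional voltage-gated conductances, i.e. exactly the nonlinear mechanisms that push real membranes outside the regime impedance analysis assumes.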
The goals of this work were