Inferring neural activity before plasticity as a foundation for learning beyond backpropagation
Yuhang Song, Beren Millidge, Tommaso Salvatori et al.

Nature Neuroscience, Journal Year: 2024, Volume and Issue: 27(2), P. 348 - 358

Published: Jan. 3, 2024

Abstract For both humans and machines, the essence of learning is to pinpoint which components in its information processing pipeline are responsible for an error in its output, a challenge that is known as ‘credit assignment’. It has long been assumed that credit assignment is best solved by backpropagation, which is also the foundation of modern machine learning. Here, we set out a fundamentally different principle of credit assignment called ‘prospective configuration’. In prospective configuration, the network first infers the pattern of neural activity that should result from learning, and then the synaptic weights are modified to consolidate the change in activity. We demonstrate that this distinct mechanism, in contrast to backpropagation, (1) underlies learning in a well-established family of models of cortical circuits, (2) enables learning that is more efficient and effective in many contexts faced by biological organisms and (3) reproduces surprising patterns of neural activity and behavior observed in diverse human and rat learning experiments.
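The two-phase procedure described in the abstract (first infer the activity pattern that should result from learning, then change weights to consolidate it) can be illustrated with a small toy. The sketch below is a minimal, assumption-laden NumPy version built around a predictive-coding-style energy network, in the spirit of the family of cortical-circuit models mentioned in point (1); the layer sizes, tanh nonlinearity and all hyperparameters are illustrative choices, not the authors' configuration.

```python
# Minimal sketch, not the authors' code: a two-layer energy-based network where hidden
# activity is first relaxed toward a configuration consistent with the target (inference),
# and only then are weights updated to consolidate that inferred activity (plasticity).
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 4, 8, 2
W1 = rng.normal(scale=0.1, size=(n_hid, n_in))   # input -> hidden weights
W2 = rng.normal(scale=0.1, size=(n_out, n_hid))  # hidden -> output weights
f, df = np.tanh, lambda u: 1.0 - np.tanh(u) ** 2

def train_step(x0, y_target, lr=0.05, gamma=0.1, n_infer=50):
    global W1, W2
    x1 = f(W1 @ x0)                          # start from the feedforward hidden activity
    for _ in range(n_infer):                 # phase 1: infer the prospective activity
        e1 = x1 - f(W1 @ x0)                 # hidden-layer prediction error
        u2 = W2 @ x1
        e2 = y_target - f(u2)                # output-layer prediction error
        x1 -= gamma * (e1 - W2.T @ (e2 * df(u2)))
    u1, u2 = W1 @ x0, W2 @ x1                # phase 2: consolidate the inferred activity
    e1, e2 = x1 - f(u1), y_target - f(u2)
    W1 += lr * np.outer(e1 * df(u1), x0)
    W2 += lr * np.outer(e2 * df(u2), x1)
    return float(np.sum(e2 ** 2))

x0, y = rng.normal(size=n_in), np.array([0.5, -0.5])
for _ in range(200):
    loss = train_step(x0, y)
print(f"output error after training: {loss:.4f}")
```

Because the hidden activity settles before any weight change, the subsequent Hebbian-style updates consolidate the inferred configuration rather than chasing a backpropagated gradient, which is the contrast the abstract draws.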

Language: English

How to Represent Part-Whole Hierarchies in a Neural Network

Geoffrey E. Hinton

Neural Computation, Journal Year: 2022, Volume and Issue: 35(3), P. 413 - 452

Published: Dec. 22, 2022

This article does not describe a working system. Instead, it presents a single idea about representation that allows advances made by several different groups to be combined into an imaginary system called GLOM. The advances include transformers, neural fields, contrastive representation learning, distillation, and capsules. GLOM answers the question: How can a neural network with a fixed architecture parse an image into a part-whole hierarchy that has a different structure for each image? The idea is simply to use islands of identical vectors to represent the nodes in the parse tree. If GLOM can be made to work, it should significantly improve the interpretability of the representations produced by transformer-like systems when applied to vision or language.
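As a rough illustration of the "islands of identical vectors" idea, the toy below (my own construction, not Hinton's specification) gives each image location an embedding vector and repeatedly replaces it with an attention-weighted average over similar locations, so that embeddings coalesce into near-identical islands that could stand for nodes of a parse tree; the sizes, similarity function and mixing rate are all assumptions.

```python
# Toy sketch of islands of agreement: attention-weighted averaging pulls similar
# location embeddings together until each group is a block of near-identical vectors.
import numpy as np

rng = np.random.default_rng(1)
n_locations, dim = 12, 16
emb = rng.normal(size=(n_locations, dim))        # one embedding vector per image location
emb[:6] += 2.0                                   # two loose groups that should become islands
emb[6:] -= 2.0

for _ in range(20):
    sim = emb @ emb.T / np.sqrt(dim)             # pairwise similarity between locations
    attn = np.exp(sim) / np.exp(sim).sum(axis=1, keepdims=True)
    emb = 0.5 * emb + 0.5 * (attn @ emb)         # move each vector toward its weighted neighbours

# within each island the vectors are now near-identical (correlation ~1);
# across islands they remain distinct
print(np.round(np.corrcoef(emb), 2))
```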

Language: English

Citations: 118

Biological constraints on neural network models of cognitive function
Friedemann Pulvermüller, Rosario Tomasello, Malte R. Henningsen‐Schomers et al.

Nature Reviews Neuroscience, Journal Year: 2021, Volume and Issue: 22(8), P. 488 - 502

Published: June 28, 2021

Language: English

Citations: 116

Bidirectional synaptic plasticity rapidly modifies hippocampal representations
Aaron D. Milstein, Yiding Li, Katie C. Bittner et al.

eLife, Journal Year: 2021, Volume and Issue: 10

Published: Dec. 9, 2021

Learning requires neural adaptations thought to be mediated by activity-dependent synaptic plasticity. A relatively non-standard form of plasticity driven by dendritic calcium spikes, or plateau potentials, has been reported to underlie place field formation in rodent hippocampal CA1 neurons. Here, we found that this behavioral timescale synaptic plasticity (BTSP) can also reshape existing place fields via bidirectional synaptic weight changes that depend on the temporal proximity of plateau potentials to pre-existing place fields. When evoked near an existing place field, plateau potentials induced less potentiation and more depression, suggesting that BTSP might depend inversely on postsynaptic activation. However, manipulations of place cell membrane potential and computational modeling indicated that this anti-correlation actually results from a dependence on current synaptic weight, such that weak inputs potentiate and strong inputs depress. A network model implementing this learning rule suggested that BTSP enables population activity, rather than pairwise neuronal correlations, to drive changes in the representation of experience.
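The weight dependence reported here (weak inputs potentiate, strong inputs depress, gated by the timing of the plateau potential) can be written as a simple rule. The code below is a simplified, hypothetical stand-in for the authors' model, not their implementation: an eligibility trace of presynaptic activity is read out when a plateau "instructive" signal arrives, and the change is bidirectional because potentiation scales with (w_max - w) while depression scales with w; all time constants and gains are invented for illustration.

```python
# Hedged sketch of a weight-dependent bidirectional (BTSP-like) plasticity rule.
import numpy as np

def btsp_update(w, pre_rate, plateau, dt=0.1, tau_elig=2.0,
                w_max=1.0, k_pot=0.8, k_dep=0.6):
    """One trial: pre_rate and plateau are arrays over time bins; returns the new weight."""
    elig = 0.0
    for r, p in zip(pre_rate, plateau):
        elig += dt * (-elig / tau_elig + r)      # presynaptic eligibility trace
        if p > 0:                                # plateau potential acts as instructive signal
            w += p * elig * (k_pot * (w_max - w) - k_dep * w)
            w = float(np.clip(w, 0.0, w_max))
    return w

t = np.arange(0, 10, 0.1)
pre = np.exp(-0.5 * ((t - 4.0) / 0.5) ** 2)      # input active around t = 4 s
plateau = (np.abs(t - 4.5) < 0.05).astype(float) # plateau potential shortly afterwards

print(btsp_update(0.1, pre, plateau))            # weak synapse -> net potentiation
print(btsp_update(0.9, pre, plateau))            # strong synapse -> net depression
```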

Language: English

Citations: 115

Entorhinal cortex directs learning-related changes in CA1 representations
Christine Grienberger, Jeffrey C. Magee

Nature, Journal Year: 2022, Volume and Issue: 611(7936), P. 554 - 562

Published: Nov. 2, 2022

Abstract Learning-related changes in brain activity are thought to underlie adaptive behaviours1,2. For instance, the learning of a reward site by rodents requires the development of an over-representation of that location in the hippocampus3–6. How this learning-related change occurs remains unknown. Here we recorded hippocampal CA1 population activity as mice learned a reward location on a linear treadmill. Physiological and pharmacological evidence suggests that the over-representation required behavioural timescale synaptic plasticity (BTSP)7. BTSP is known to be driven by dendritic voltage signals that we proposed were initiated by input from entorhinal cortex layer 3 (EC3). Accordingly, the over-representation was largely removed by optogenetic inhibition of EC3 activity. Recordings from EC3 neurons revealed an activity pattern that could provide an instructive signal directing BTSP to generate the over-representation. Consistent with this function, our observations show that exposure to a second environment possessing a prominent reward-predictive cue resulted in both EC3 activity and place field density that were more elevated at the cue than at the reward. These data indicate that the learning-related over-representation is produced by plasticity directed by an EC3 instructive signal that seems specifically adapted to the behaviourally relevant features of the environment.

Language: English

Citations: 104

Feedforward and feedback interactions between visual cortical areas use different population activity patterns
João D. Semedo, Anna I. Jasper, Amin Zandvakili et al.

Nature Communications, Journal Year: 2022, Volume and Issue: 13(1)

Published: March 1, 2022

Brain function relies on the coordination of activity across multiple, recurrently connected brain areas. For instance, sensory information encoded in early sensory areas is relayed to, and further processed by, higher cortical areas and then fed back. However, the way in which feedforward and feedback signaling interact with one another is incompletely understood. Here we investigate this question by leveraging simultaneous neuronal population recordings in early and midlevel visual areas (V1-V2 and V1-V4). Using a dimensionality reduction approach, we find that population interactions are feedforward-dominated shortly after stimulus onset and feedback-dominated during spontaneous activity. The activity patterns most correlated across areas were distinct during feedforward- and feedback-dominated periods. These results suggest that feedforward and feedback signaling rely on separate "channels", which allows feedback signals to not directly affect activity that is fed forward.
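As a hedged illustration of what such a dimensionality-reduction analysis can look like (not the authors' actual pipeline), the sketch below uses synthetic data to find the population pattern in one area that most strongly covaries with a second area, separately for an "evoked" and a "spontaneous" epoch, and then compares the two patterns; the data, sizes and comparison metric are all assumptions.

```python
# Toy sketch: top cross-covariance dimension between two simultaneously recorded
# populations, computed separately for two epochs and compared.
import numpy as np

def top_interaction_dims(X, Y):
    """X: trials x neurons in area A, Y: trials x neurons in area B."""
    Xc, Yc = X - X.mean(0), Y - Y.mean(0)
    U, _, Vt = np.linalg.svd(Xc.T @ Yc, full_matrices=False)
    return U[:, 0], Vt[0]                         # most strongly covarying pattern in each area

rng = np.random.default_rng(2)
n_trials, nA, nB = 500, 30, 20
wB_evoked, wB_spont = rng.normal(size=nB), rng.normal(size=nB)

# evoked epoch: area-B activity is driven through one particular area-A pattern
evoked_X = rng.normal(size=(n_trials, nA))
evoked_Y = np.outer(evoked_X[:, 0], wB_evoked) + 0.5 * rng.normal(size=(n_trials, nB))

# spontaneous epoch: the inter-area interaction runs through a different area-A pattern
spont_X = rng.normal(size=(n_trials, nA))
spont_Y = np.outer(spont_X[:, 1], wB_spont) + 0.5 * rng.normal(size=(n_trials, nB))

dimA_evoked, _ = top_interaction_dims(evoked_X, evoked_Y)
dimA_spont, _ = top_interaction_dims(spont_X, spont_Y)

# separate "channels" show up as near-orthogonal interaction patterns across epochs
print("overlap of evoked vs spontaneous area-A patterns:", abs(dimA_evoked @ dimA_spont))
```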

Language: English

Citations: 98

Metaplastic and energy-efficient biocompatible graphene artificial synaptic transistors for enhanced accuracy neuromorphic computing
Dmitry Kireev, Samuel Liu, Harrison Jin et al.

Nature Communications, Journal Year: 2022, Volume and Issue: 13(1)

Published: July 28, 2022

CMOS-based computing systems that employ the von Neumann architecture are relatively limited when it comes to parallel data storage and processing. In contrast, the human brain is a living computational signal processing unit that operates with extreme parallelism and energy efficiency. Although numerous neuromorphic electronic devices have emerged in the last decade, most of them are rigid or contain materials that are toxic to biological systems. In this work, we report on biocompatible bilayer graphene-based artificial synaptic transistors (BLAST) capable of mimicking synaptic behavior. The BLAST devices leverage a dry ion-selective membrane, enabling long-term potentiation, ~50 aJ/µm

Language: English

Citations: 77

The Computational Theory of Mind

Matteo Colombo, Gualtiero Piccinini

Published: Nov. 13, 2023

The Computational Theory of Mind says that the mind is a computing system. It has a long history going back to the idea that thought is a kind of computation. Its modern incarnation relies on analogies with contemporary technology and the use of computational models. It comes in many versions, some more plausible than others. This Element supports the theory primarily by its contribution to solving the mind-body problem, its ability to explain mental phenomena, and its success in computational modelling and artificial intelligence. To be turned into an adequate theory, it needs to be made compatible with the tractability of cognition, the situatedness and dynamical aspects of the mind, the way the brain works, intentionality, and consciousness.

Language: English

Citations: 76

Neurons learn by predicting future activity
Artur Luczak, Bruce L. McNaughton, Yoshimasa Kubo et al.

Nature Machine Intelligence, Journal Year: 2022, Volume and Issue: 4(1), P. 62 - 72

Published: Jan. 25, 2022

Abstract Understanding how the brain learns may lead to machines with human-like intellectual capacities. It was previously proposed that the brain may operate on the principle of predictive coding. However, it is still not well understood how such a system could be implemented in the brain. Here we demonstrate that the ability of a single neuron to predict its future activity may provide an effective learning mechanism. Interestingly, this predictive learning rule can be derived from a metabolic principle, whereby neurons need to minimize their own synaptic activity (cost) while maximizing their impact on local blood supply by recruiting other neurons. We show mathematically a theoretical connection between this learning rule and diverse types of brain-inspired algorithms, thus offering a step towards the development of a general theory of neuronal learning. We tested this learning rule in neural network simulations and in data recorded from awake animals. Our results also suggest that spontaneous brain activity provides ‘training data’ for neurons to learn to predict cortical dynamics. Thus, surprise (that is, the difference between actual and expected activity) could be an important missing element needed to understand computation in the brain.
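A toy version of the proposed mechanism fits in a few lines. In the sketch below (my assumptions, not the paper's exact formulation), the neuron's later activity, here driven by a fixed "delayed" input standing in for network feedback, acts as an implicit target, and the surprise (actual future activity minus the neuron's current prediction of it) gates a local, Hebbian-like weight update, so the prediction error shrinks as learning proceeds.

```python
# Hedged sketch: a neuron learns by predicting its own future activity; the surprise
# (future activity minus current prediction) is the only error signal used.
import numpy as np

rng = np.random.default_rng(3)
n_in = 10
w = rng.normal(scale=0.1, size=n_in)            # feedforward weights being learned
w_fb = rng.normal(size=n_in)                    # fixed weights of the delayed "feedback" drive

lr, avg_surprise = 0.02, 0.0
for step in range(3000):
    x = rng.normal(size=n_in)
    y_now = np.tanh(w @ x)                      # the neuron's early, predictive response
    y_future = np.tanh(w_fb @ x)                # the activity it will actually be driven to later
    surprise = y_future - y_now                 # actual minus predicted future activity
    w += lr * surprise * (1 - y_now ** 2) * x   # local update gated entirely by surprise
    avg_surprise = 0.99 * avg_surprise + 0.01 * abs(surprise)
    if step % 1000 == 999:
        print(f"step {step + 1:4d}  running |surprise| = {avg_surprise:.3f}")
```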

Language: English

Citations: 72

Spike-driven multi-scale learning with hybrid mechanisms of spiking dendrites
Shuangming Yang, Yanwei Pang, Haowen Wang et al.

Neurocomputing, Journal Year: 2023, Volume and Issue: 542, P. 126240 - 126240

Published: April 26, 2023

Language: English

Citations: 55

Artificial Neuronal Devices Based on Emerging Materials: Neuronal Dynamics and Applications
Hefei Liu, Yuan Qin, Hung‐Yu Chen et al.

Advanced Materials, Journal Year: 2023, Volume and Issue: 35(37)

Published: Jan. 7, 2023

Abstract Artificial neuronal devices are critical building blocks of neuromorphic computing systems and are currently the subject of intense research, motivated by application needs from new technology and more realistic brain emulation. Researchers have proposed a range of device concepts that can mimic neuronal dynamics and functions. Although the switching physics and device structures of these artificial neurons are largely different, their behaviors can be described by several neuron models in a unified manner. In this paper, reports of artificial neuronal devices based on emerging volatile materials are reviewed from the perspective of the demonstrated neuron models, with a focus on the neuronal functions implemented and their exploitation for computational and sensing applications. Furthermore, the neuroscience inspirations and engineering methods needed to enrich the dynamics that remain to be realized in artificial neuron devices and networks, toward realizing the full functionalities of biological neurons, are discussed.

Language: English

Citations: 53