Complexity-calibrated Benchmarks for Machine Learning Reveal When Next-Generation Reservoir Computer Predictions Succeed and Mislead

Sarah Marzen, Paul M. Riechers, James P. Crutchfield et al.

Research Square, Journal Year: 2023, Volume and Issue: unknown

Published: March 30, 2023

Abstract: Recurrent neural networks are used to forecast time series in finance, climate, language, and many other domains. Reservoir computers are a particularly easily trainable form of recurrent neural network. Recently, a "next-generation" reservoir computer was introduced in which the memory trace involves only a finite number of previous symbols. We explore the inherent limitations of such finite-past memory traces in this intriguing proposal. A lower bound from Fano's inequality shows that, on highly non-Markovian processes generated by large probabilistic state machines, next-generation reservoir computers with reasonably long memory traces have an error probability in predicting the next observation that is at least ~60% higher than the minimal attainable error probability. More generally, it appears that popular recurrent neural networks fall far short of optimally predicting such complex processes. These results highlight the need for a new generation of optimized recurrent network architectures. Alongside this finding, we present concentration-of-measure results for randomly generated but complex processes. One conclusion is that large probabilistic state machines, specifically ε-machines, are key to generating challenging and structurally unbiased stimuli for ground-truthing recurrent network architectures. PACS numbers: 02.50.-r, 05.45.Tp, 02.50.Ey, 02.50.Ga
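To illustrate the form of the bound invoked here, Fano's inequality relates the error probability P_e of any predictor of the next symbol from a length-k past to a conditional entropy (this is the generic textbook statement; the notation below is an assumption, not taken from the paper):

\[
H_b(P_e) + P_e \log_2\!\bigl(|\mathcal{A}| - 1\bigr) \;\ge\; H\!\left(X_t \mid X_{t-k:t-1}\right),
\]
and since the binary entropy satisfies \(H_b(P_e) \le 1\) (logs in bits),
\[
P_e \;\ge\; \frac{H\!\left(X_t \mid X_{t-k:t-1}\right) - 1}{\log_2\!\bigl(|\mathcal{A}| - 1\bigr)}.
\]

For a highly non-Markovian process, the conditional entropy given any finite past can remain well above the true entropy rate, so every finite-past predictor, next-generation reservoir computers included, inherits this excess error.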

Language: English

Neural heterogeneity controls computations in spiking neural networks
Richard Gast, Sara A. Solla, Ann Kennedy et al.

Proceedings of the National Academy of Sciences, Journal Year: 2024, Volume and Issue: 121(3)

Published: Jan. 10, 2024

The brain is composed of complex networks of interacting neurons that express considerable heterogeneity in their physiology and spiking characteristics. How does this neural heterogeneity influence macroscopic neural dynamics, and how might it contribute to computation? In this work, we use a mean-field model to investigate computation in heterogeneous neural networks, by studying how the heterogeneity of cell spiking thresholds affects three key computational functions of a neural population: the gating, encoding, and decoding of neural signals. Our results suggest that heterogeneity serves different computational functions in different cell types. In inhibitory interneurons, varying the degree of spike-threshold heterogeneity allows them to gate the propagation of signals in a reciprocally coupled excitatory population. Whereas homogeneous interneurons impose synchronized dynamics that narrow the dynamic repertoire of the excitatory neurons, heterogeneous interneurons act as an inhibitory offset while preserving excitatory neuron function. Spike-threshold heterogeneity also controls the entrainment properties of neural networks to periodic input, thus affecting the temporal gating of synaptic inputs. Among excitatory neurons, heterogeneity increases the dimensionality of neural dynamics, improving the network's capacity to perform decoding tasks. Conversely, more homogeneous populations suffer in their capacity for function generation, but excel at encoding signals via multistable dynamic regimes. Drawing from these findings, we propose intra-cell-type heterogeneity as a mechanism for sculpting the computational properties of local circuits, permitting the same canonical microcircuit to be tuned for diverse computational tasks.
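As a toy illustration of one effect described here (not the paper's mean-field model), the following sketch simulates uncoupled leaky integrate-and-fire neurons under a common periodic drive; homogeneous thresholds produce a sharply synchronized, strongly entrained population rate, while heterogeneous thresholds desynchronize it:

import numpy as np

def simulate_lif(threshold_spread, n=500, steps=5000, dt=1e-4, seed=0):
    # N uncoupled leaky integrate-and-fire neurons, common periodic drive,
    # spike thresholds drawn around 1.0 with the given spread.
    rng = np.random.default_rng(seed)
    thresholds = 1.0 + threshold_spread * rng.standard_normal(n)
    v = np.zeros(n)
    tau = 0.02
    rate = np.zeros(steps)
    for t in range(steps):
        drive = 1.5 + 0.5 * np.sin(2 * np.pi * 10.0 * t * dt)
        v += dt / tau * (-v + drive)
        spiking = v >= thresholds
        rate[t] = spiking.mean() / dt      # instantaneous population rate
        v[spiking] = 0.0                   # reset spiking neurons
    return rate

# Homogeneous thresholds: spiky, synchronized population rate.
# Heterogeneous thresholds: smoother, desynchronized rate.
print("rate std, homogeneous  :", simulate_lif(0.0).std())
print("rate std, heterogeneous:", simulate_lif(0.3).std())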

Language: English

Citations: 28

Lyapunov spectra of chaotic recurrent neural networks
Rainer Engelken, Fred Wolf, L. F. Abbott et al.

Physical Review Research, Journal Year: 2023, Volume and Issue: 5(4)

Published: Oct. 16, 2023

The Lyapunov spectrum of recurrent neural networks is calculated, and analytical approximations through random matrix theory are provided. The dependency of attractor dimensions and entropy rates on coupling strength and input fluctuations is identified, and a point symmetry of the Lyapunov spectrum is revealed. A link is shown between Lyapunov exponents and error propagation and stability in networks trained for machine-learning applications.
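A minimal numerical counterpart of such a calculation, assuming the discrete-time map x_{t+1} = tanh(g W x_t) as a toy stand-in for the paper's continuous-time networks, estimates the full Lyapunov spectrum by QR-reorthonormalizing a tangent-space basis:

import numpy as np

def lyapunov_spectrum(W, g=2.0, n_steps=2000, n_discard=200, seed=0):
    # Benettin-style QR algorithm: propagate an orthonormal tangent basis Q
    # through the Jacobian at each step and accumulate log |diag(R)|.
    n = W.shape[0]
    x = np.random.default_rng(seed).standard_normal(n) * 0.5
    Q = np.eye(n)
    log_r = np.zeros(n)
    for t in range(n_steps):
        x = np.tanh(g * W @ x)
        J = (1.0 - x**2)[:, None] * (g * W)    # Jacobian of the map at x
        Q, R = np.linalg.qr(J @ Q)
        if t >= n_discard:                     # skip the transient
            log_r += np.log(np.abs(np.diag(R)))
    return np.sort(log_r / (n_steps - n_discard))[::-1]

n = 100
W = np.random.default_rng(1).standard_normal((n, n)) / np.sqrt(n)
spec = lyapunov_spectrum(W)
print("largest exponent:", spec[0])   # > 0 indicates chaos at this gain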

Language: English

Citations: 37

Dimension of Activity in Random Neural Networks
David G. Clark, L. F. Abbott, Ashok Litwin-Kumar et al.

Physical Review Letters, Journal Year: 2023, Volume and Issue: 131(11)

Published: Sept. 11, 2023

Neural networks are high-dimensional nonlinear dynamical systems that process information through the coordinated activity of many connected units. Understanding how biological and machine-learning networks function and learn requires knowledge of the structure of this activity, information contained, for example, in the cross covariances between units. Self-consistent dynamical mean-field theory (DMFT) has elucidated several features of random neural networks, in particular that they can generate chaotic activity; however, a calculation of cross covariances using this approach has not been provided. Here, we calculate cross covariances self-consistently via a two-site cavity DMFT. We use this theory to probe spatiotemporal coordination of activity in a classic random-network model with independent and identically distributed (i.i.d.) couplings, showing an extensive but fractionally low effective dimension of activity and a long population-level timescale. Our formulas apply to a wide range of single-unit dynamics and generalize to non-i.i.d. couplings. As an example of the latter, we analyze the case of partially symmetric couplings.
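The effective dimension referred to here is commonly quantified by the participation ratio of the covariance eigenvalues; a sketch of its empirical counterpart on a simulated chaotic rate network (a toy stand-in for the analytical DMFT calculation) follows:

import numpy as np

def participation_ratio(X):
    # Effective dimension: (sum of covariance eigenvalues)^2 / (sum of squares).
    lam = np.linalg.eigvalsh(np.cov(X))
    return lam.sum() ** 2 / (lam ** 2).sum()

# Chaotic rate network dx/dt = -x + g W tanh(x), Euler-integrated.
n, g, dt, steps = 200, 2.0, 0.05, 4000
rng = np.random.default_rng(0)
W = rng.standard_normal((n, n)) / np.sqrt(n)
x = rng.standard_normal(n)
traj = []
for t in range(steps):
    x += dt * (-x + g * W @ np.tanh(x))
    if t > steps // 2:                 # discard the transient
        traj.append(x.copy())
X = np.array(traj).T                   # units x time
print("effective dimension:", participation_ratio(X), "out of", n, "units")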

Language: English

Citations: 25

Theory of Coupled Neuronal-Synaptic Dynamics
David G. Clark, L. F. Abbott

Physical Review X, Journal Year: 2024, Volume and Issue: 14(2)

Published: April 1, 2024

In neural circuits, synaptic strengths influence neuronal activity by shaping network dynamics, and neuronal activity influences synaptic strengths through activity-dependent plasticity. Motivated by this fact, we study a recurrent-network model in which neuronal units and synaptic couplings are interacting dynamic variables, with couplings subject to Hebbian modification with decay around quenched random strengths. Rather than assigning a specific role to the plasticity, we use dynamical mean-field theory and other techniques to systematically characterize the neuronal-synaptic dynamics, revealing a rich phase diagram. Adding Hebbian plasticity slows the dynamics in already chaotic networks and can induce chaos in otherwise quiescent networks. Anti-Hebbian plasticity quickens the dynamics and produces an oscillatory component. Analysis of the Jacobian shows that Hebbian and anti-Hebbian plasticity push locally unstable modes toward the real and imaginary axes, respectively, explaining these behaviors. Both random-matrix and Lyapunov analysis show that strong Hebbian plasticity segregates network timescales into two bands, with a slow, synapse-dominated band driving the dynamics, suggesting a flipped view of the network as synapses connected by neurons. For increasing strength, Hebbian plasticity initially raises the complexity of the dynamics, as measured by the maximum Lyapunov exponent and attractor dimension, but then decreases these metrics, likely due to a proliferation of stable fixed points. We compute the marginally stable spectra of such points as well as their number, showing exponential growth with network size. Finally, in chaotic states, a stable fixed point of the neuronal dynamics is destabilized by the plasticity, allowing any neuronal state to be stored as a stable fixed point by halting the plasticity. This freezable chaos offers a new mechanism for working memory. Published by the American Physical Society, 2024.
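A minimal Euler-integration sketch of this kind of coupled system, with equation forms and parameter values as assumptions rather than the paper's:

import numpy as np

# Euler simulation of
#   dx/dt = -x + J @ tanh(x)                                 (neurons)
#   tau_s dJ/dt = -(J - J0) + k * tanh(x) tanh(x)^T / n      (synapses)
# with J0 a quenched random coupling matrix. k > 0 gives Hebbian
# modification with decay; k < 0 gives the anti-Hebbian case.
n, g, k, tau_s, dt, steps = 100, 2.0, 0.5, 10.0, 0.05, 5000
rng = np.random.default_rng(0)
J0 = g * rng.standard_normal((n, n)) / np.sqrt(n)
J = J0.copy()
x = rng.standard_normal(n)
for t in range(steps):
    r = np.tanh(x)
    x += dt * (-x + J @ r)
    J += (dt / tau_s) * (-(J - J0) + k * np.outer(r, r) / n)
print("mean coupling drift |J - J0|:", np.abs(J - J0).mean())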

Language: English

Citations: 11

Chaotic neural dynamics facilitate probabilistic computations through sampling
Yu Terada, Taro Toyoizumi

Proceedings of the National Academy of Sciences, Journal Year: 2024, Volume and Issue: 121(18)

Published: April 22, 2024

Cortical neurons exhibit highly variable responses over trials and time. Theoretical works posit that this variability arises potentially from chaotic network dynamics of recurrently connected neurons. Here, we demonstrate that chaotic neural dynamics, formed through synaptic learning, allow networks to perform sensory cue integration in a sampling-based implementation. We show that the emergent chaotic dynamics provide neural substrates for generating samples not only of a static variable but also of a dynamical trajectory, where generic recurrent networks acquire these abilities with a biologically plausible learning rule through trial and error. Furthermore, the networks generalize their experience to stimulus-evoked inference without partial or all of the sensory information, which suggests a computational role of spontaneous activity as a representation of the priors, as well as a tractable biological computation of marginal distributions. These findings suggest that chaotic neural dynamics may serve the brain's function as a Bayesian generative model.
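As a minimal illustration of sampling-based cue integration itself, with an overdamped Langevin sampler standing in for the learned chaotic network dynamics (a deliberate substitution, not the paper's mechanism), two noisy cues can be combined by sampling the posterior over a latent stimulus:

import numpy as np

# Two noisy cues c1, c2 about a latent stimulus s; samples of the posterior
# p(s | c1, c2) are drawn with Langevin dynamics.
rng = np.random.default_rng(0)
s_true, sig1, sig2, sig0 = 1.0, 0.5, 1.0, 2.0
c1 = s_true + sig1 * rng.standard_normal()
c2 = s_true + sig2 * rng.standard_normal()

def grad_log_post(s):
    # Gaussian prior N(0, sig0^2) times two Gaussian likelihoods.
    return -s / sig0**2 + (c1 - s) / sig1**2 + (c2 - s) / sig2**2

s, eps, samples = 0.0, 1e-2, []
for _ in range(20000):
    s += eps * grad_log_post(s) + np.sqrt(2 * eps) * rng.standard_normal()
    samples.append(s)
prec = 1 / sig0**2 + 1 / sig1**2 + 1 / sig2**2
mean = (c1 / sig1**2 + c2 / sig2**2) / prec
print("sampled mean:", np.mean(samples[5000:]), " analytic:", mean)
print("sampled var :", np.var(samples[5000:]), " analytic:", 1 / prec)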

Language: English

Citations: 6

Introduction to dynamical mean-field theory of randomly connected neural networks with bidirectionally correlated couplings
Wenxuan Zou, Haiping Huang

SciPost Physics Lecture Notes, Journal Year: 2024, Volume and Issue: unknown

Published: Feb. 20, 2024

Dynamical mean-field theory is a powerful physics tool used to analyze the typical behavior of neural networks, where neurons can be recurrently connected, or where multiple layers are stacked. However, it is not easy for beginners to access the essence of this tool and its underlying physics. Here, we give a pedagogical introduction to this method in the particular example of random neural networks, where neurons are randomly and fully connected by correlated synapses and the network therefore exhibits rich emergent collective dynamics. We also review related past and recent important works applying this tool. In addition, a physically transparent alternative method, namely the dynamical cavity method, is introduced to derive exactly the same results. The numerical implementation of solving the integro-differential mean-field equations is detailed, with an illustration of exploring the fluctuation-dissipation theorem.
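One physically transparent way to solve such self-consistency equations numerically is a Monte Carlo iteration over single-site paths; the sketch below (parameters and scheme are assumptions; the lecture notes detail more careful integro-differential solvers) iterates dx/dt = -x + eta(t) with self-consistent noise covariance g^2 <phi(x(t)) phi(x(t'))>, phi = tanh:

import numpy as np

g, T, dt, n_paths, n_iter = 2.0, 20.0, 0.05, 2000, 10
steps = int(T / dt)
t = np.arange(steps) * dt
C_phi = np.exp(-np.abs(t[:, None] - t[None, :]))     # initial guess
rng = np.random.default_rng(0)
for _ in range(n_iter):
    # sample Gaussian noise paths with covariance g^2 * C_phi
    L = np.linalg.cholesky(g**2 * C_phi + 1e-8 * np.eye(steps))
    eta = L @ rng.standard_normal((steps, n_paths))
    x = np.zeros((steps, n_paths))
    for s in range(steps - 1):
        x[s + 1] = x[s] + dt * (-x[s] + eta[s])      # dx/dt = -x + eta
    phi = np.tanh(x)
    C_phi = phi @ phi.T / n_paths                    # self-consistent update
print("stationary C_phi(0) estimate:", np.diag(C_phi)[steps // 2:].mean())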

Language: English

Citations: 5

SERT-StructNet: Protein secondary structure prediction method based on multi-factor hybrid deep model
Benzhi Dong, Zheng Liu, Dali Xu et al.

Computational and Structural Biotechnology Journal, Journal Year: 2024, Volume and Issue: 23, P. 1364 - 1375

Published: March 22, 2024

Protein secondary structure prediction (PSSP) is a pivotal research endeavour that plays a crucial role in the comprehensive elucidation of protein functions and properties. Current methodologies are focused on deep-learning techniques, particularly those using multi-factor features. Diverging from existing approaches, in this study we placed special emphasis on the effects of amino acid properties and secondary structure propensity scores (SSPs) during a meticulous feature-selection process. This differential feature-selection strategy results in a distinctive and effective amalgamation of sequence and property features. To harness these features optimally, we introduced a hybrid deep feature extraction model. The model initially employs mechanisms such as dilated convolution (D-Conv) and a channel attention network (SENet) for local feature extraction and targeted enhancement. Subsequently, a combination of recurrent neural network variants (BiGRU and BiLSTM), along with a transformer module, was employed to achieve global bidirectional information consideration. This multi-level processing of the input enabled the exploration of intricate associations among residues in protein sequences, yielding improved prediction performance.
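A minimal PyTorch sketch of the pipeline as described (dilated convolution, SENet-style channel attention, BiGRU, BiLSTM, transformer encoder); all layer sizes and the input feature dimension are assumptions, not the paper's:

import torch
import torch.nn as nn

class SEBlock(nn.Module):
    # Squeeze-and-excitation channel attention over a (B, C, L) tensor.
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction), nn.ReLU(),
            nn.Linear(channels // reduction, channels), nn.Sigmoid())
    def forward(self, x):                      # x: (B, C, L)
        w = self.fc(x.mean(dim=-1))            # squeeze over the sequence
        return x * w.unsqueeze(-1)             # re-weight channels

class StructNetSketch(nn.Module):
    # Dilated conv -> SE attention -> BiGRU -> BiLSTM -> transformer ->
    # per-residue secondary-structure logits.
    def __init__(self, in_dim=40, hidden=64, n_classes=8):
        super().__init__()
        self.dconv = nn.Conv1d(in_dim, hidden, kernel_size=3, dilation=2, padding=2)
        self.se = SEBlock(hidden)
        self.bigru = nn.GRU(hidden, hidden, batch_first=True, bidirectional=True)
        self.bilstm = nn.LSTM(2 * hidden, hidden, batch_first=True, bidirectional=True)
        enc = nn.TransformerEncoderLayer(d_model=2 * hidden, nhead=4, batch_first=True)
        self.transformer = nn.TransformerEncoder(enc, num_layers=2)
        self.head = nn.Linear(2 * hidden, n_classes)
    def forward(self, feats):                  # feats: (B, L, in_dim)
        h = self.se(torch.relu(self.dconv(feats.transpose(1, 2)))).transpose(1, 2)
        h, _ = self.bigru(h)
        h, _ = self.bilstm(h)
        h = self.transformer(h)
        return self.head(h)                    # (B, L, n_classes)

logits = StructNetSketch()(torch.randn(2, 100, 40))
print(logits.shape)  # torch.Size([2, 100, 8])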

Language: English

Citations: 4

Brain mechanism of foraging: Reward-dependent synaptic plasticity versus neural integration of values
Ulises Pereira-Obilinovic, Han Hou, Karel Svoboda et al.

Proceedings of the National Academy of Sciences, Journal Year: 2024, Volume and Issue: 121(14)

Published: March 29, 2024

During foraging behavior, action values are persistently encoded in neural activity and updated depending on the history of choice outcomes. What is the mechanism for value maintenance and updating? Here, we explore two contrasting network models: synaptic learning of action values versus neural integration. We show that both models can reproduce extant experimental data, but they yield distinct predictions about the underlying biological circuits. In particular, the integrator model, but not the synaptic model, requires that reward signals are mediated by neural pools selective for the choice alternatives and that their projections are aligned with the linear attractor axes of the valuation system. We demonstrate experimentally observable dynamical signatures and feasible perturbations to differentiate the contrasting scenarios and identify the more robust candidate mechanism. Overall, this work provides a modeling framework to guide future experimental research on probabilistic foraging.
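A toy contrast of the two schemes, using generic textbook update rules rather than the paper's circuit models:

import numpy as np

rng = np.random.default_rng(0)
p_reward = np.array([0.7, 0.3])       # reward probabilities of two options
alpha, leak = 0.1, 0.05
v_syn = np.zeros(2)                   # values stored in synaptic weights
v_int = np.zeros(2)                   # values held by a leaky neural integrator
for trial in range(2000):
    choice = rng.integers(2)          # random exploration, for simplicity
    r = float(rng.random() < p_reward[choice])
    # synaptic learning: reward-dependent plasticity on the chosen option
    v_syn[choice] += alpha * (r - v_syn[choice])
    # integration: reward input accumulates along a line attractor; the
    # leak stands in for imperfect attractor dynamics (note different scale)
    v_int[choice] += alpha * r - leak * v_int[choice]
print("synaptic value estimates:", v_syn)   # approach p_reward
print("integrator states       :", v_int)   # leaky-integrated reward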

Language: English

Citations: 4

Path sampling of recurrent neural networks by incorporating known physics
Sun-Ting Tsai, Eric Fields, Yijia Xu et al.

Nature Communications, Journal Year: 2022, Volume and Issue: 13(1)

Published: Nov. 24, 2022

Recurrent neural networks have seen widespread use in modeling dynamical systems in varied domains such as weather prediction, text prediction, and several others. Often one wishes to supplement the experimentally observed dynamics with prior knowledge or intuition about the system. While the recurrent nature of these networks allows them to model arbitrarily long memories in the time series used in training, it makes it harder to impose prior knowledge or intuition through generic constraints. In this work, we present a path sampling approach based on the principle of Maximum Caliber that allows us to include thermodynamic and kinetic constraints in recurrent neural networks. We show the method here for a widely used type of recurrent network known as long short-term memory (LSTM), in the context of supplementing time series collected from different application domains. These include classical molecular dynamics of a protein and Monte Carlo simulations of an open quantum system continuously losing photons to the environment while displaying Rabi oscillations. Our method can be easily generalized to other generative artificial intelligence models and to areas of the physical and social sciences where limited data are supplemented with intuition or theory-based corrections.
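The core Maximum Caliber step can be sketched independently of the network: sampled paths are reweighted by w proportional to exp(lambda * s(path)), with the Lagrange multiplier lambda chosen so the reweighted average of an observable s matches a known constraint. Below, plain random-walk paths stand in for LSTM-generated samples (an assumption for illustration):

import numpy as np
from scipy.optimize import brentq

rng = np.random.default_rng(0)
paths = rng.standard_normal((5000, 100)).cumsum(axis=1)  # unbiased sample paths
s = paths[:, -1]                    # constrained observable: path endpoint
s_target = 2.0                      # known (e.g. experimental) average

def constraint_gap(lam):
    w = np.exp(lam * (s - s.max()))  # shifted for numerical stability
    return np.sum(w * s) / np.sum(w) - s_target

lam = brentq(constraint_gap, -1.0, 1.0)     # solve for the multiplier
w = np.exp(lam * (s - s.max()))
w /= w.sum()
print("lambda:", lam, " reweighted <s>:", np.sum(w * s))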

Language: English

Citations: 19

Spectrum of non-Hermitian deep-Hebbian neural networks
Zijian Jiang, Ziming Chen, TianQi Hou et al.

Physical Review Research, Journal Year: 2023, Volume and Issue: 5(1)

Published: Feb. 8, 2023

Neural networks with recurrent asymmetric couplings are important for understanding how episodic memories are encoded in the brain. Here, we integrate the experimental observation of wide synaptic integration windows into our model of sequence retrieval with continuous-time dynamics. The spectrum of the non-normal neuron-interaction matrix is theoretically studied by deriving a random matrix theory of the Jacobian. The neural-dynamics Jacobian spectra bear several distinct features, such as the breaking of rotational symmetry about the origin and the emergence of nested voids within the spectrum boundary. The spectral density is thus highly nonuniformly distributed in the complex plane. The theory also predicts a transition to chaos. In particular, the edge of chaos provides computational benefits for retrieving sequential memories. Our paper provides a systematic study of time-lagged correlations with arbitrary delays and can inspire future studies of a broad class of memory models, and even big data analysis of biological time series.
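A minimal numerical sketch of the non-normal spectra at issue, using a standard asymmetric Hebbian sequence-coupling matrix (a simpler construction than the paper's model with wide synaptic integration windows):

import numpy as np

N, P = 1000, 100
rng = np.random.default_rng(0)
xi = rng.choice([-1.0, 1.0], size=(P + 1, N))            # stored pattern sequence
J = sum(np.outer(xi[m + 1], xi[m]) for m in range(P)) / N
eigs = np.linalg.eigvals(J)
print("spectral radius:", np.abs(eigs).max())
# A scatter plot of eigs.real vs. eigs.imag shows a markedly nonuniform
# density in the complex plane, unlike the circular law for i.i.d. couplings.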

Language: English

Citations: 10