
Published: Jan. 1, 2024
Language: English
2021 IEEE/CVF International Conference on Computer Vision (ICCV), Journal Year: 2023, Issue: unknown, pp. 6532-6542
Published: Oct. 1, 2023
Spiking neural networks (SNNs) are brain-inspired, energy-efficient models that encode information in spatiotemporal dynamics. Recently, directly trained deep SNNs have shown great success in achieving high performance on classification tasks with very few time steps. However, how to design a directly trained SNN for the regression task of object detection still remains a challenging problem. To address this problem, we propose EMS-YOLO, a novel directly trained SNN framework for object detection, which is the first trial to train a deep SNN for object detection with surrogate gradients rather than ANN-SNN conversion strategies. Specifically, we design a full-spike residual block, EMS-ResNet, which can effectively extend the network depth with low power consumption. Furthermore, we theoretically analyze and prove that EMS-ResNet can avoid gradient vanishing or exploding. The results demonstrate that our approach outperforms the state-of-the-art ANN-SNN conversion methods (at least 500 time steps) with extremely fewer time steps (only 4 time steps). Our model achieves performance comparable to an ANN with the same architecture while consuming 5.83× less energy on the frame-based COCO Dataset and the event-based Gen1 Dataset. Our code is available at https://github.com/BICLab/EMS-YOLO.
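The surrogate-gradient idea behind direct SNN training can be sketched in a few lines: the non-differentiable Heaviside spike function is used in the forward pass, while a smooth stand-in (here a rectangular window, one common choice) supplies the derivative for backpropagation. This is a generic illustration, not the authors' EMS-ResNet code; the threshold, leak constant, and surrogate width are illustrative.

```python
def spike_forward(v, v_th=1.0):
    """Forward pass: non-differentiable Heaviside step on the membrane potential."""
    return 1.0 if v >= v_th else 0.0

def spike_backward(v, v_th=1.0, width=1.0):
    """Backward pass: rectangular surrogate derivative d(spike)/d(v),
    nonzero only in a window around the threshold."""
    return 1.0 / width if abs(v - v_th) < width / 2 else 0.0

def lif_step(v, i_in, tau=2.0, v_th=1.0):
    """One leaky integrate-and-fire step with hard reset after a spike."""
    v = v + (i_in - v) / tau          # leaky integration toward the input current
    s = spike_forward(v, v_th)        # emit a spike if the threshold is crossed
    v = v * (1.0 - s)                 # hard reset to zero on spiking
    return v, s

# Simulate 4 time steps of constant input current
v, spikes = 0.0, []
for _ in range(4):
    v, s = lif_step(v, 1.5)
    spikes.append(s)
# spikes == [0.0, 1.0, 0.0, 1.0]
```

During training, `spike_backward` would replace the zero-almost-everywhere true derivative of `spike_forward` wherever the chain rule needs it.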
Language: English
Cited: 24
APL Materials, Journal Year: 2024, Issue: 12(10)
Published: Oct. 1, 2024
Language: English
Cited: 10
ACS Applied Electronic Materials, Journal Year: 2025, Issue: unknown
Published: Feb. 14, 2025
A brain-inspired computing paradigm known as "neuromorphic computing" seeks to replicate the information-processing principles of biological neural systems in order to create systems that are effective, low-power, and adaptable. Spiking neural networks (SNNs) based on single devices are at the forefront of neuromorphic computing, which aims to mimic the powers of the human brain. Neuromorphic devices, which enable the hardware implementation of artificial neural networks (ANNs), are at the heart of neuromorphic computing. These devices emulate the dynamics and functions of neurons and synapses. This mini-review assesses the latest advancements, with an emphasis on small, energy-efficient synapses and neurons. Key functions like spike-timing-dependent plasticity, multistate storage, and dynamic filtering are demonstrated by a variety of single-device models, such as memristors, transistors, and magnetic and ferroelectric devices. The integrate-and-fire (IF) neuron is a key model in these systems because it allows for mathematical analysis while successfully capturing essential aspects of neural processing. The review examines the potential of SNNs for scalable, low-power applications, highlighting both the benefits and the constraints of implementing them in hardware architectures. It also highlights the increasing importance of the creation of flexible cognitive systems.
Language: English
Cited: 1
IEEE Transactions on Cognitive and Developmental Systems, Journal Year: 2023, Issue: 16(2), pp. 544-558
Published: May 26, 2023
Spiking neural networks (SNNs) have demonstrated excellent capabilities in various intelligent scenarios. Most existing methods for training SNNs are based on the concept of synaptic plasticity; however, learning in the realistic brain also utilizes intrinsic non-synaptic mechanisms of neurons. The spike threshold of biological neurons is a critical intrinsic neuronal feature that exhibits rich dynamics on a millisecond timescale and has been proposed as an underlying mechanism that facilitates neural information processing. In this study, we develop a novel synergistic learning approach that involves simultaneously training synaptic weights and spike thresholds in SNNs. SNNs trained with synapse-threshold synergistic learning (STL-SNNs) achieve significantly superior performance on both static and neuromorphic datasets than SNNs trained with the two degenerated single-learning models. During training, the synergistic learning optimizes neural thresholds, providing the network with stable signal transmission via appropriate firing rates. Further analysis indicates that STL-SNNs are robust to noisy data and exhibit low energy consumption in deep network structures. Additionally, the performance of STL-SNNs can be further improved by introducing a generalized joint decision framework. Overall, our findings indicate that biologically plausible synergies between synaptic and intrinsic non-synaptic mechanisms may provide a promising approach for developing highly efficient SNN learning methods.
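The synergistic scheme trains spike thresholds alongside synaptic weights. A minimal sketch of that idea for a single neuron, using a sigmoid as a smooth rate proxy so both parameters get gradients; the function names, learning rate, and sigmoid slope are illustrative and not the paper's exact formulation:

```python
import math

def firing_rate(x, w, v_th, beta=4.0):
    """Smooth proxy for spike probability: sigmoid of (synaptic drive - threshold)."""
    return 1.0 / (1.0 + math.exp(-beta * (w * x - v_th)))

def train_step(x, target, w, v_th, lr=0.5, beta=4.0):
    """One gradient step on squared error, updating weight AND threshold jointly."""
    r = firing_rate(x, w, v_th, beta)
    err = r - target
    dr = beta * r * (1.0 - r)          # derivative of the sigmoid rate
    w    -= lr * err * dr * x          # d r / d w    =  dr * x
    v_th -= lr * err * dr * (-1.0)     # d r / d v_th = -dr (threshold raises to cut rate)
    return w, v_th
```

Iterating `train_step` drives the neuron's rate toward the target by moving the weight and the threshold in concert, which is the single-neuron analogue of training synapses and thresholds simultaneously.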
Language: English
Cited: 12
2021 IEEE/CVF International Conference on Computer Vision (ICCV), Journal Year: 2023, Issue: unknown, pp. 16878-16888
Published: Oct. 1, 2023
Spiking Neural Networks (SNNs) are well known as a promising energy-efficient alternative to conventional artificial neural networks. Subject to the preconceived impression that SNNs fire sparsely, the analysis and optimization of inherent redundancy in SNNs have been largely overlooked, and thus the potential advantages of spike-based neuromorphic computing in accuracy and energy efficiency are compromised. In this work, we pose and focus on three key questions regarding the inherent redundancy in SNNs. We argue that the redundancy is induced by the spatio-temporal invariance of SNNs, which enhances parameter utilization but also invites lots of noise spikes. Further, we analyze the effect of spatio-temporal dynamics on spike firing in SNNs. Then, motivated by these analyses, we propose an Advance Spatial Attention (ASA) module to harness SNNs' redundancy, which can adaptively optimize their membrane potential distribution with a pair of individual spatial attention sub-modules. In this way, noise spike features are accurately regulated. Experimental results demonstrate that the proposed method can significantly reduce spike firing with better performance than state-of-the-art SNN baselines. Our code is available at https://github.com/BICLab/ASA-SNN.
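The idea of regulating membrane potentials with spatial attention can be sketched as masking a potential map with per-location sigmoid weights, so that weak, noise-like activations are suppressed before thresholding. This toy version derives the attention from the map's own mean; the actual ASA module uses a learned pair of attention sub-modules, and the function names here are hypothetical.

```python
import math

def spatial_attention(potentials):
    """Toy spatial attention: weight each location by a sigmoid of its
    deviation from the map mean, down-weighting weak activations."""
    flat = [v for row in potentials for v in row]
    mean = sum(flat) / len(flat)
    return [[1.0 / (1.0 + math.exp(-(v - mean))) for v in row]
            for row in potentials]

def apply_attention(potentials):
    """Scale membrane potentials elementwise by the attention map,
    so fewer locations cross the spiking threshold."""
    att = spatial_attention(potentials)
    return [[v * a for v, a in zip(prow, arow)]
            for prow, arow in zip(potentials, att)]
```

On a map with one strong and several weak potentials, the strong location stays above a unit threshold while the weak ones are pushed further below it, illustrating how attention can cut noise spikes without losing salient features.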
Language: English
Cited: 11
Frontiers in Neuroscience, Journal Year: 2025, Issue: 19
Published: March 12, 2025
Introduction: Spiking Neural Networks (SNNs) offer a biologically inspired alternative to conventional artificial neural networks, with potential advantages in power efficiency due to their event-driven computation. Despite their promise, SNNs have yet to achieve competitive performance on complex visual tasks, such as image classification. Methods: This study introduces a novel SNN architecture called SpikeAtConv, designed to enhance computational efficacy and task accuracy. The architecture features optimized spiking modules that facilitate the processing of spatio-temporal patterns in data, aiming to reconcile the demands of high-level vision tasks with energy-efficient SNNs. Results: Extensive experiments show that the proposed SpikeAtConv outperforms or is comparable to the state-of-the-art SNNs on the evaluated datasets. Notably, we achieved a top-1 accuracy of 81.23% on ImageNet-1K using the directly trained Large SpikeAtConv, which is a state-of-the-art result in the field of SNNs. Discussion: Our evaluations on standard image classification benchmarks indicate that SpikeAtConv narrows the performance gap with traditional neural networks, providing insights into the design of more efficient and capable neuromorphic computing systems.
Language: English
Cited: 0
Neural Networks, Journal Year: 2025, Issue: unknown, pp. 107395-107395
Published: March 1, 2025
Language: English
Cited: 0
Neural Networks, Journal Year: 2024, Issue: 176, pp. 106330-106330
Published: April 21, 2024
Language: English
Cited: 1
Neural Networks, Journal Year: 2024, Issue: 184, pp. 107043-107043
Published: Dec. 16, 2024
Language: English
Cited: 1
Neural Networks, Journal Year: 2024, Issue: 180, pp. 106630-106630
Published: Aug. 22, 2024
Language: English
Cited: 0