Neural Networks, Journal Year: 2024, Volume and Issue: 180, P. 106734 - 106734
Published: Sept. 25, 2024
Language: English
Sensors, Journal Year: 2025, Volume and Issue: 25(4), P. 1147 - 1147
Published: Feb. 13, 2025
Decoding motor imagery electroencephalography (MI-EEG) signals presents significant challenges due to the difficulty of capturing complex functional connectivity between channels and the temporal dependencies of EEG across different periods. These challenges are exacerbated by the low spatial resolution and high signal redundancy inherent in EEG signals, which traditional linear models struggle to address. To overcome these issues, we propose a novel dual-branch framework that integrates an adaptive graph convolutional network (Adaptive GCN) with bidirectional gated recurrent units (Bi-GRUs) to enhance MI-EEG decoding performance by effectively modeling both channel correlations and temporal dependencies. A Chebyshev Type II filter decomposes the signal into multiple sub-bands, giving the model frequency-domain insights. The Adaptive GCN, specifically designed for this context, captures channel correlations more effectively than conventional GCN models, enabling accurate spatial-spectral feature extraction. Furthermore, combining the Bi-GRU with Multi-Head Attention (MHA) across time segments extracts deep time-spectral features. Finally, feature fusion is performed to generate the final prediction results. Experimental results demonstrate that our method achieves an average classification accuracy of 80.38% on BCI-IV Dataset 2a and 87.49% on BCI-III Dataset 3a, outperforming other state-of-the-art approaches. This approach lays the foundation for future exploration of personalized brain-computer interface (BCI) systems.
Language: English
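The dual-branch design described in this abstract can be illustrated with a minimal sketch, assuming PyTorch and SciPy: a Chebyshev Type II filter bank splits the EEG into sub-bands, and a Bi-GRU with multi-head attention models the temporal branch. The band edges, layer sizes, and four-class output are illustrative assumptions, not the authors' configuration; the Adaptive GCN spatial branch and the fusion step are omitted.

import numpy as np
import torch
import torch.nn as nn
from scipy.signal import cheby2, sosfiltfilt

def chebyshev2_filter_bank(eeg, fs=250, bands=((4, 8), (8, 13), (13, 30))):
    # eeg: (channels, samples) -> (n_bands, channels, samples)
    sub_bands = []
    for lo, hi in bands:
        # 4th-order Chebyshev Type II band-pass with 40 dB stop-band attenuation
        sos = cheby2(4, 40, [lo, hi], btype="bandpass", fs=fs, output="sos")
        sub_bands.append(sosfiltfilt(sos, eeg, axis=-1))
    return np.stack(sub_bands)

class TemporalBranch(nn.Module):
    # Bi-GRU over time segments followed by multi-head self-attention.
    def __init__(self, n_features, hidden=64, heads=4, n_classes=4):
        super().__init__()
        self.gru = nn.GRU(n_features, hidden, batch_first=True, bidirectional=True)
        self.attn = nn.MultiheadAttention(2 * hidden, heads, batch_first=True)
        self.head = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):                # x: (batch, time, n_features)
        h, _ = self.gru(x)               # (batch, time, 2 * hidden)
        a, _ = self.attn(h, h, h)        # self-attention across time steps
        return self.head(a.mean(dim=1))  # average-pool over time, then classify

A full reproduction would add the Adaptive GCN spatial-spectral branch and fuse both feature streams before classification, as the abstract describes.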
Citations: 1
NeuroImage, Journal Year: 2025, Volume and Issue: unknown, P. 121123 - 121123
Published: March 1, 2025
Artifact removal in electroencephalography (EEG) is a longstanding challenge that significantly impacts neuroscientific analysis and brain-computer interface (BCI) performance. Tackling this problem demands advanced algorithms, extensive noisy-clean training data, and thorough evaluation strategies. This study presents the Artifact Removal Transformer (ART), an innovative EEG denoising model employing a transformer architecture to adeptly capture the transient, millisecond-scale dynamics characteristic of EEG signals. Our approach offers a holistic, end-to-end solution that simultaneously addresses multiple artifact types in multichannel data. We enhanced the generation of noisy-clean data pairs using independent component analysis, thus fortifying the training scenarios critical for effective supervised learning. We performed comprehensive validations on a wide range of open datasets from various BCI applications, using metrics like mean squared error and signal-to-noise ratio, as well as sophisticated techniques such as source localization and classification. These evaluations confirm that ART surpasses other deep-learning-based methods, setting a new benchmark in EEG signal processing. This advancement not only boosts accuracy and reliability but also promises to catalyze further innovations in the field, facilitating brain research in naturalistic environments.
Language: English
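As a small illustration of the evaluation metrics this abstract names (mean squared error and signal-to-noise ratio), the following NumPy sketch scores a denoised multichannel segment against its clean reference. ART itself is not reimplemented here; the array shapes and the 19-channel synthetic example are assumptions.

import numpy as np

def mse(clean, denoised):
    # Mean squared error over all channels and samples.
    return float(np.mean((clean - denoised) ** 2))

def snr_db(clean, denoised):
    # SNR in dB: power of the clean reference over power of the residual error.
    residual = clean - denoised
    return float(10.0 * np.log10(np.sum(clean ** 2) / np.sum(residual ** 2)))

# Synthetic example shaped (channels, samples); real use would pass
# clean and denoised EEG epochs from a benchmark dataset.
rng = np.random.default_rng(0)
clean = rng.standard_normal((19, 1000))
denoised = clean + 0.1 * rng.standard_normal((19, 1000))
print(f"MSE: {mse(clean, denoised):.4f}, SNR: {snr_db(clean, denoised):.1f} dB")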
Citations: 0
Neural Networks, Journal Year: 2025, Volume and Issue: unknown, P. 107511 - 107511
Published: April 1, 2025
Language: English
Citations: 0
Neural Networks, Journal Year: 2024, Volume and Issue: 180, P. 106734 - 106734
Published: Sept. 25, 2024
Language: English
Citations: 2