Open Access
Embedding decomposition for artifacts removal in EEG signals
Author(s) -
Junjie Yu,
Chenyi Li,
Kexin Lou,
Wei Chen,
Quanying Liu
Publication year - 2022
Publication title -
Journal of Neural Engineering
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.594
H-Index - 111
eISSN - 1741-2552
pISSN - 1741-2560
DOI - 10.1088/1741-2552/ac63eb
Subject(s) - computer science, artifact (error), electroencephalography, artificial intelligence, interpretability, pattern recognition (psychology), signal (programming language), embedding, autoencoder, deep learning, speech recognition, psychology, psychiatry, programming language
Objective. Electroencephalogram (EEG) recordings are often contaminated with artifacts. Various methods have been developed to eliminate or attenuate the influence of these artifacts, but most rely on prior experience for analysis. Approach. Here, we propose a deep learning framework, called DeepSeparator, that separates neural signals and artifacts in the embedding space and reconstructs the denoised signal. DeepSeparator employs an encoder to extract and amplify features in the raw EEG, a decomposer module to extract the trend and to detect and suppress artifacts, and a decoder to reconstruct the denoised signal. In addition, DeepSeparator can extract the artifact itself, which greatly improves the model's interpretability. Main results. The proposed method is tested on a semi-synthetic EEG dataset and a real task-related EEG dataset; the results suggest that DeepSeparator outperforms conventional models in both EOG and EMG artifact removal. Significance. DeepSeparator can be extended to multi-channel EEG and to data of arbitrary length. It may motivate future development and application of deep learning-based EEG denoising. The code for DeepSeparator is available at https://github.com/ncclabsustech/DeepSeparator.
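The encoder/decomposer/decoder pipeline described in the abstract can be sketched with plain linear maps. This is an illustrative toy only, not the paper's trained network: the random projection, the hand-made binary mask standing in for the learned decomposer, and all array shapes are assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D "EEG": a clean 10 Hz oscillation plus a slow EOG-like drift.
t = np.linspace(0, 1, 256)
clean = np.sin(2 * np.pi * 10 * t)
artifact = 0.8 * np.sin(2 * np.pi * 1 * t)
raw = clean + artifact

# Encoder: lift the raw signal into a higher-dimensional embedding.
# (A fixed random projection stands in for the learned encoder.)
W_enc = rng.standard_normal((512, 256)) / 16.0
embedding = W_enc @ raw

# Decomposer: split the embedding into a signal part and an artifact part.
# A hypothetical binary mask stands in for the trained module that extracts
# the trend and detects/suppresses artifact features.
mask = rng.uniform(size=512) > 0.5
signal_emb = embedding * mask
artifact_emb = embedding * ~mask

# Decoder: map each branch back to the time domain. The pseudo-inverse of
# the encoder plays the decoder's role in this linear toy.
W_dec = np.linalg.pinv(W_enc)
denoised = W_dec @ signal_emb
extracted_artifact = W_dec @ artifact_emb

# In the linear case the two branches sum back to the raw input, mirroring
# the claim that the model reconstructs the signal AND exposes the artifact.
assert np.allclose(denoised + extracted_artifact, raw, atol=1e-6)
```

The separate artifact branch is what gives the interpretability mentioned in the abstract: the model outputs not only the denoised EEG but also an explicit estimate of what it removed.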
