Open Access
A novel end‐to‐end deep separation network based on attention mechanism for single channel blind separation in wireless communication
Author(s) -
Ma Hao,
Zheng Xiang,
Yu Lu,
Zhou Xingyu,
Chen Yufan
Publication year - 2023
Publication title -
IET Signal Processing
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.384
H-Index - 42
eISSN - 1751-9683
pISSN - 1751-9675
DOI - 10.1049/sil2.12173
Subject(s) - computer science, weighting, end-to-end principle, blind signal separation, encoder, algorithm, artificial intelligence, channel (broadcasting), pattern recognition (psychology), telecommunications, medicine, radiology, operating system
Abstract - Traditional methods exhibit unstable performance and high complexity when separating co‐frequency modulated wireless communication signals under single‐channel conditions. In this study, an end‐to‐end deep separation network based on the attention mechanism is proposed, which employs an encoder‐separator‐decoder architecture to perform the separation. The encoder converts the signal into a high‐dimensional feature representation so that the separator can separate the signal in this high‐dimensional space. The core of the proposed deep separation network is the novel separator, composed mainly of attention‐based convolution units with residual connections. Each attention‐based convolution unit integrates a large‐kernel convolution and a modified global context (GC) block to capture the local and global information of the signal simultaneously. Furthermore, the GC block is improved to apply both channel weighting and location weighting to a given feature map, further enhancing the adaptability of the model. Experimental results show that the proposed method not only outperforms traditional methods in separating mixed signals with the same modulation, but also enables the separation of mixed signals with different modulations.
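The abstract's key idea of a GC block that applies both location weighting (attention over time positions) and channel weighting (a gate per feature channel) can be illustrated with a minimal sketch. This is not the authors' implementation: the function name `gc_block`, the learned weights `w_pos` and `w_chan`, and the multiplicative (squeeze-and-excitation-style) channel gate in place of whatever fusion the paper uses are all assumptions made for illustration.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gc_block(feature_map, w_pos, w_chan):
    """Sketch of a GC-style block with location and channel weighting.

    feature_map: C x T nested list (channels x time positions)
    w_pos:       length-C weights for the position-scoring projection
                 (hypothetical stand-in for a learned 1x1 convolution)
    w_chan:      length-C weights for the channel-gating transform
                 (hypothetical stand-in for a learned bottleneck)
    """
    C = len(feature_map)
    T = len(feature_map[0])

    # Location weighting: score each time position with a 1x1
    # projection across channels, then softmax over positions.
    scores = [sum(w_pos[c] * feature_map[c][t] for c in range(C))
              for t in range(T)]
    attn = softmax(scores)

    # Attention-pooled global context vector (one value per channel).
    context = [sum(attn[t] * feature_map[c][t] for t in range(T))
               for c in range(C)]

    # Channel weighting: sigmoid-gate each channel from its context
    # value, then rescale the feature map (assumed multiplicative fusion).
    gates = [sigmoid(w_chan[c] * context[c]) for c in range(C)]
    out = [[gates[c] * feature_map[c][t] for t in range(T)]
           for c in range(C)]
    return out, attn, gates
```

In this sketch the attention vector `attn` plays the role of location weighting (it sums to one over time positions) and `gates` plays the role of channel weighting; a real implementation would learn both projections end to end inside each attention-based convolution unit.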
