Open Access
A novel noise reduction technique for underwater acoustic signals based on dual‐path recurrent neural network
Author(s) -
Song Yongqiang,
Liu Feng,
Shen Tongsheng
Publication year - 2023
Publication title -
IET Communications
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.355
H-Index - 62
eISSN - 1751-8636
pISSN - 1751-8628
DOI - 10.1049/cmu2.12518
Subject(s) - computer science , path (computing) , noise reduction , artificial neural network , noise (video) , pattern recognition (psychology) , signal (programming language) , convolutional neural network , underwater , feature extraction , recurrent neural network , reduction (mathematics) , artificial intelligence , feature (linguistics) , speech recognition , mathematics , linguistics , oceanography , philosophy , geometry , image (mathematics) , programming language , geology
Abstract A dual‐path recurrent neural network model is proposed to achieve noise reduction of underwater acoustic signals; it consists of three steps: feature extraction, mask separation, and signal recovery. For feature extraction, a multi‐scale convolutional neural network is used to extract higher‐order non‐linear features of the input signal, and the obtained non‐linear features are chunked into fixed‐length subvectors along the temporal dimension. For mask separation, a dual‐path network is built from two recurrent neural networks: a bidirectional network for extracting intra‐chunk features and a unidirectional network for extracting inter‐chunk features. Finally, overlapping and permutation operations are used to recover the denoised acoustic signal. Comparison with different denoising methods shows that this method is effective for underwater acoustic signals. On the two tasks of the ShipsEar dataset, the model improves the signal‐to‐noise ratio by 12.02 dB and 9.48 dB, respectively.
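
The following is a minimal sketch, not the authors' code, of the dual‐path recurrent processing the abstract describes: chunking the extracted features into fixed‐length subvectors, applying an intra‐chunk bidirectional RNN and an inter‐chunk unidirectional RNN, and recovering the sequence by overlap‐add. The chunk length, hop, hidden sizes, use of LSTMs, residual connections, and the sigmoid mask are illustrative assumptions; the multi‐scale convolutional feature extractor is omitted.

# Sketch of a dual-path recurrent block (PyTorch); parameters are assumptions.
import torch
import torch.nn as nn

def chunk(features, chunk_len, hop):
    # Split (batch, channels, time) into overlapping fixed-length chunks
    # -> (batch, channels, chunk_len, num_chunks).
    return features.unfold(-1, chunk_len, hop).permute(0, 1, 3, 2)

def overlap_add(chunks, hop):
    # Recombine chunks into (batch, channels, time); overlapping regions are
    # summed (normalization omitted for brevity).
    b, c, k, s = chunks.shape
    out = torch.zeros(b, c, hop * (s - 1) + k, device=chunks.device)
    for i in range(s):
        out[..., i * hop:i * hop + k] += chunks[..., i]
    return out

class DualPathBlock(nn.Module):
    # Intra-chunk bidirectional RNN followed by inter-chunk unidirectional RNN,
    # each with a residual connection (assumed).
    def __init__(self, channels, hidden):
        super().__init__()
        self.intra_rnn = nn.LSTM(channels, hidden, batch_first=True, bidirectional=True)
        self.intra_proj = nn.Linear(2 * hidden, channels)
        self.inter_rnn = nn.LSTM(channels, hidden, batch_first=True)
        self.inter_proj = nn.Linear(hidden, channels)

    def forward(self, x):  # x: (batch, channels, chunk_len, num_chunks)
        b, c, k, s = x.shape
        # Intra-chunk path: model dependencies inside each chunk.
        intra = x.permute(0, 3, 2, 1).reshape(b * s, k, c)
        intra, _ = self.intra_rnn(intra)
        intra = self.intra_proj(intra).reshape(b, s, k, c).permute(0, 3, 2, 1)
        x = x + intra
        # Inter-chunk path: model dependencies across chunks.
        inter = x.permute(0, 2, 3, 1).reshape(b * k, s, c)
        inter, _ = self.inter_rnn(inter)
        inter = self.inter_proj(inter).reshape(b, k, s, c).permute(0, 3, 1, 2)
        return x + inter

if __name__ == "__main__":
    feats = torch.randn(2, 64, 1000)              # (batch, feature channels, time)
    chunks = chunk(feats, chunk_len=100, hop=50)  # (2, 64, 100, 19)
    block = DualPathBlock(channels=64, hidden=128)
    mask = torch.sigmoid(block(chunks))           # per-chunk mask estimate
    denoised = overlap_add(chunks * mask, hop=50) # recovered denoised sequence
    print(denoised.shape)                         # torch.Size([2, 64, 1000])

In this sketch the bidirectional LSTM captures short-range structure within each chunk, while the unidirectional LSTM links corresponding positions across chunks, which is what gives the dual-path design its long effective receptive field at modest cost.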
