Open Access
Iterative null space projection method with adaptive thresholding in sparse signal recovery
Author(s) - Esmaeili Ashkan, Asadi Kangarshahi Ehsan, Marvasti Farokh
Publication year - 2018
Publication title - IET Signal Processing
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.384
H-Index - 42
ISSN - 1751-9683
DOI - 10.1049/iet-spr.2016.0626
Subject(s) - thresholding , robustness (evolution) , compressed sensing , computer science , algorithm , iterative method , signal (programming language) , projection (relational algebra) , signal reconstruction , signal to noise ratio (imaging) , artificial intelligence , noise (video) , convergence (economics) , pattern recognition (psychology) , mathematics , signal processing , image (mathematics) , digital signal processing , telecommunications , biochemistry , chemistry , computer hardware , economics , gene , programming language , economic growth
Adaptive thresholding methods have proved to yield a high signal-to-noise ratio (SNR) and fast convergence in sparse signal recovery. A class of iterative sparse recovery algorithms, such as the iterative method with adaptive thresholding, has been found to outperform state-of-the-art methods in reconstruction quality, convergence speed, and robustness to noise. In this study, the authors introduce a new method for compressed sensing that uses only the sensing matrix and the measurements: the signal is iteratively thresholded, and the thresholded signal is then projected onto the translated null space of the sensing matrix, with the threshold level assigned adaptively. Simulation results reveal that the proposed method outperforms other methods in signal reconstruction SNR. This performance advantage is most noticeable when the number of available measurements approaches twice the sparsity number.
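The abstract does not give the authors' exact update rules, but the general scheme it describes can be sketched in a few lines of NumPy: hard-threshold the current estimate, then project the result onto the affine set {x : Ax = y} (the translated null space of the sensing matrix A). The geometric threshold schedule below (initial level `tau`, shrink factor `beta`) is a hypothetical stand-in for the paper's adaptive rule, so this is an illustrative sketch rather than the published algorithm.

```python
import numpy as np

def inp_adaptive_threshold(A, y, n_iter=200, beta=0.95):
    """Sketch of iterative thresholding with null-space projection.

    A      : (m, n) sensing matrix
    y      : (m,) measurement vector, y = A @ x_true
    n_iter : number of threshold/project iterations
    beta   : geometric shrink factor for the threshold (hypothetical
             schedule; the authors' adaptive rule may differ)
    """
    A_pinv = np.linalg.pinv(A)
    x = A_pinv @ y                      # least-squares start, satisfies Ax = y
    tau = np.max(np.abs(x))             # initial threshold level
    for _ in range(n_iter):
        # hard-threshold: zero out entries below the current level
        x_thr = np.where(np.abs(x) >= tau, x, 0.0)
        # project back onto {x : Ax = y}, the translated null space of A
        x = x_thr - A_pinv @ (A @ x_thr - y)
        tau *= beta                     # adapt (shrink) the threshold
    return x
```

Because the last step of every iteration is the projection, the returned estimate satisfies the measurement constraint `A @ x = y` (up to numerical precision) whenever `A` has full row rank, while the shrinking threshold progressively concentrates the estimate on its largest-magnitude entries.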
