Open Access
Robust knowledge‐aided sparse recovery STAP method for non‐homogeneity clutter suppression
Author(s) - Peng Hao, Sun Yuze, Yang Xiaopeng, Yang Jian
Publication year - 2019
Publication title - The Journal of Engineering
Language(s) - English
Resource type - Journals
ISSN - 2051-3305
DOI - 10.1049/joe.2019.0273
Subject(s) - clutter, subspace, constant false alarm rate, computer science, radar, outlier, covariance matrix, artificial intelligence, robustness, phased array, pattern recognition, computer vision, algorithm, telecommunications, antenna
Conventional space–time adaptive processing (STAP) methods suffer severe performance loss in the complex clutter environment of an airborne phased-array radar, especially when the estimated clutter covariance matrix (CCM) is corrupted by interference targets (outliers). To improve clutter suppression in practical complex clutter environments, a robust knowledge-aided sparse recovery STAP method for non-homogeneity clutter suppression is proposed. In the proposed method, the spectral profiles of the clutter and outliers are first estimated by sparse recovery processing. Then, based on prior knowledge of the system parameters, a clutter mask is constructed to select the space–time steering vectors corresponding to the clutter components. Finally, clutter suppression is achieved using the clutter subspace obtained from the selected space–time steering vectors. Since the clutter and outlier profiles are effectively estimated and distinguished by the knowledge-aided sparse recovery processing, a robust clutter subspace estimate can be obtained for clutter suppression. Results on simulated and measured airborne phased-array radar data verify that the proposed method effectively improves STAP performance in a non-homogeneous environment.
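The three-step pipeline the abstract describes (sparse recovery of the joint angle–Doppler profile, a knowledge-aided mask that keeps only components near the predicted clutter ridge, and projection onto the orthogonal complement of the resulting clutter subspace) can be sketched in code. The NumPy mock-up below is a minimal illustration, not the authors' implementation: the array size N, pulse count M, ridge slope beta, grid densities, mask width, and the use of orthogonal matching pursuit as the sparse recovery step are all assumptions made here for concreteness.

import numpy as np

def st_vec(N, M, fs, fd):
    # Space-time steering vector for spatial frequency fs and Doppler fd
    a = np.exp(2j * np.pi * fs * np.arange(N))      # spatial steering
    b = np.exp(2j * np.pi * fd * np.arange(M))      # temporal (Doppler) steering
    return np.kron(b, a)

def omp(Phi, x, k):
    # Orthogonal matching pursuit: k-sparse angle-Doppler profile estimate
    r, support = x.copy(), []
    coef = np.zeros(0, dtype=complex)
    for _ in range(k):
        support.append(int(np.argmax(np.abs(Phi.conj().T @ r))))
        sub = Phi[:, support]
        coef, *_ = np.linalg.lstsq(sub, x, rcond=None)
        r = x - sub @ coef
    alpha = np.zeros(Phi.shape[1], dtype=complex)
    alpha[support] = coef
    return alpha

N, M = 8, 10                                  # elements, pulses (assumed values)
beta = 1.0                                    # assumed clutter ridge slope from platform priors
fs_grid = np.linspace(-0.5, 0.5, 4 * N)
fd_grid = np.linspace(-0.5, 0.5, 4 * M)
Phi = np.column_stack([st_vec(N, M, fs, fd) for fs in fs_grid for fd in fd_grid])

# Synthetic training snapshot: clutter on the ridge fd = beta*fs, plus one
# off-ridge interference target (outlier) and receiver noise
rng = np.random.default_rng(0)
x = sum(rng.normal() * st_vec(N, M, fs, beta * fs)
        for fs in np.linspace(-0.45, 0.45, 40))
x += 5.0 * st_vec(N, M, 0.2, -0.3)            # outlier, off the clutter ridge
x += 0.1 * (rng.standard_normal(N * M) + 1j * rng.standard_normal(N * M))

# Step 1: sparse recovery of the joint clutter + outlier spectral profile
alpha = omp(Phi, x, k=30)

# Step 2: knowledge-aided clutter mask keeps only grid points near the
# predicted ridge, so outlier components are excluded from the subspace
FS, FD = np.meshgrid(fs_grid, fd_grid, indexing="ij")
mask = (np.abs(FD - beta * FS).ravel() < 0.05) & (np.abs(alpha) > 1e-6)

# Step 3: clutter subspace from the selected steering vectors, then suppress
# clutter by projecting onto the orthogonal complement of that subspace
Q, _ = np.linalg.qr(Phi[:, mask])
y = (np.eye(N * M) - Q @ Q.conj().T) @ x      # clutter-suppressed snapshot

In this sketch the off-ridge outlier may well appear in the recovered profile, but the ridge-based mask excludes it from the selected steering vectors, so the estimated clutter subspace (and hence the projection filter) is not corrupted by the interference target, which is the robustness property the abstract claims.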
