Open Access
Non-negative sparse Laplacian regularized latent multi-view subspace clustering
Author(s) -
Congzhe You,
Zhenqiu Shu,
Huiqing Fan
Publication year - 2021
Publication title -
Journal of Algorithms and Computational Technology
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.234
H-Index - 13
eISSN - 1748-3026
pISSN - 1748-3018
DOI - 10.1177/17483026211024904
Subject(s) - cluster analysis , subspace topology , computer science , artificial intelligence , pattern recognition (psychology) , representation , regularization , graph , sparse approximation , data point , data mining , mathematics , theoretical computer science
Recently, subspace clustering of multi-view data has become a research hotspot in artificial intelligence and machine learning. The goal is to divide data samples from different sources into their respective groups. In this paper we propose a new subspace clustering method for multi-view data, termed Non-negative Sparse Laplacian regularized Latent Multi-view Subspace Clustering (NSL2MSC). The proposed method learns a latent space representation of the multi-view data samples and performs data reconstruction in that latent space, so clustering is carried out in the latent representation space while exploiting the relationships among the different views. Traditional representation-based methods, however, do not consider the non-linear geometry inside the data and may lose local similarity information between data points during learning. By introducing graph regularization, the method captures not only the global low-dimensional structure of the data but also its non-linear geometric structure. Experimental results show that the proposed method is effective and outperforms most existing alternatives.
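The graph-regularization idea the abstract refers to can be sketched in a few lines. This is not the authors' NSL2MSC algorithm, only a minimal illustration of the standard Laplacian penalty it builds on: a k-nearest-neighbour affinity matrix W with heat-kernel weights is formed from the data, and the term tr(Z L Z^T), with L = D − W, penalizes representations Z whose columns differ for samples that are close in the original space. The function names, the choice of k, and the Gaussian bandwidth are illustrative assumptions.

```python
import numpy as np

def knn_affinity(X, k=3, sigma=1.0):
    # X: (n_samples, n_features). Build a symmetric k-NN affinity
    # matrix W with Gaussian (heat-kernel) weights.
    n = X.shape[0]
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=2)
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(d2[i])[1:k + 1]        # skip the point itself
        W[i, idx] = np.exp(-d2[i, idx] / (2.0 * sigma ** 2))
    return np.maximum(W, W.T)                   # symmetrize

def laplacian_penalty(Z, W):
    # Graph-regularization term tr(Z L Z^T) with L = D - W.
    # Equals 0.5 * sum_ij W_ij * ||z_i - z_j||^2, so it is small when
    # columns of Z for nearby samples are similar.
    L = np.diag(W.sum(axis=1)) - W
    return np.trace(Z @ L @ Z.T)
```

In a full latent multi-view objective this penalty would be added, with a trade-off weight, to the reconstruction and sparsity terms, encouraging the learned self-representation to respect the local manifold structure of each view.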
