Nonlinear Dimensionality Reduction Based on HSIC Maximization
Author(s) -
Zhengming Ma,
Zengrong Zhan,
Xiaoyuan Ouyang,
Xue Su
Publication year - 2018
Publication title -
IEEE Access
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.587
H-Index - 127
ISSN - 2169-3536
DOI - 10.1109/access.2018.2871825
Subject(s) - aerospace , bioengineering , communication, networking and broadcast technologies , components, circuits, devices and systems , computing and processing , engineered materials, dielectrics and plasmas , engineering profession , fields, waves and electromagnetics , general topics for engineers , geoscience , nuclear engineering , photonics and electrooptics , power, energy and industry applications , robotics and control systems , signal processing and analysis , transportation
The Hilbert-Schmidt independence criterion (HSIC) is typically used to measure the statistical dependence between two sets of data. HSIC first maps the two data sets into two reproducing kernel Hilbert spaces (RKHS), respectively, and then measures the statistical dependence between them using the Hilbert-Schmidt (HS) operator. This paper proposes a dimensionality reduction method, denoted HSIC-NDR, based on maximizing the HSIC between the high-dimensional data and the dimension-reduced data. In the proposed method, the linear kernel is chosen as the kernel function of the RKHS of the low-dimensional data after reduction, because it expresses the dimensionality-reduced data explicitly in the kernel matrix, which facilitates constructing the objective function of the dimensionality reduction algorithm. The kernel function of the RKHS of the original data set can be chosen to suit the specific application, so the proposed algorithm is widely applicable. Experiments are conducted on ten synthetic and real data sets commonly used in machine learning, and five representative dimensionality reduction algorithms with different properties (linear, nonlinear global, nonlinear local, and nonlinear global + local) are used for comparison. The experimental results show that the HSIC-NDR algorithm outperforms these representative algorithms without increasing computational complexity. The proposed HSIC-NDR algorithm and the representative algorithms all reduce to Rayleigh quotient computations.
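The abstract's central quantity, the empirical HSIC between two data sets, can be sketched with the standard biased estimator tr(KHLH)/(n-1)^2, where K and L are kernel matrices and H is the centering matrix. This is a minimal illustration of the criterion the paper maximizes, not the paper's algorithm itself; the RBF bandwidth and the toy data below are assumptions for demonstration.

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    # Gaussian (RBF) kernel matrix from pairwise squared distances.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma**2))

def hsic(K, L):
    # Biased empirical HSIC estimator: tr(K H L H) / (n - 1)^2,
    # where H = I - (1/n) 11^T centers the kernel matrices.
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))                         # "high-dimensional" data
Y_dep = X[:, :2] + 0.1 * rng.normal(size=(100, 2))    # dependent low-dim data
Y_ind = rng.normal(size=(100, 2))                     # independent data

K = rbf_kernel(X)
# Linear kernel on the low-dimensional side, as in the paper: L = Y Y^T.
print(hsic(K, Y_dep @ Y_dep.T))   # larger: Y_dep depends on X
print(hsic(K, Y_ind @ Y_ind.T))   # near zero: Y_ind is independent of X
```

Using the linear kernel Y Yᵀ on the reduced data is what lets the objective be written explicitly in terms of Y, which is the design choice the abstract highlights.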
