An Optimized Neural Network Classification Method Based on Kernel Holistic Learning and Division
Author(s) -
Hui Wen,
Tongbin Li,
Deli Chen,
Jianlu Yang,
Yan Che
Publication year - 2021
Publication title -
Mathematical Problems in Engineering
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.262
H-Index - 62
eISSN - 1026-7077
pISSN - 1024-123X
DOI - 10.1155/2021/8857818
Subject(s) - radial basis function kernel, artificial intelligence, pattern recognition (psychology), subspace topology, artificial neural network, kernel method, computer science, kernel (algebra), classifier (uml), radial basis function, division (mathematics), robustness (evolution), kernel embedding of distributions, machine learning, mathematics, support vector machine, biochemistry, chemistry, arithmetic, combinatorics, gene
An optimized neural network classification method based on kernel holistic learning and division (KHLD) is presented. The proposed method takes the learned radial basis function (RBF) kernel as its object of study. Each kernel can be regarded as a subspace region consisting of training samples of the same pattern category. By extending the sample space of the original instances into these subspace regions, relevant information between instances can be exploited, and the classifier's decision boundary can be kept far from the original instances; thus, the robustness and generalization performance of the classifier are enhanced. In the concrete implementation, a new pattern vector is generated within each RBF kernel according to an instance optimization and screening method to characterize KHLD. Experiments on artificial datasets and several UCI benchmark datasets demonstrate the effectiveness of the method.
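The core idea, that each learned RBF kernel delineates a subspace region from which new pattern vectors of the same category can be drawn, can be illustrated with a minimal sketch. This is not the authors' exact optimization and screening procedure; the function names, the isotropic Gaussian sampling, and the fixed kernel width are illustrative assumptions:

```python
import numpy as np

def rbf_activation(x, center, sigma):
    """Gaussian RBF response of a point x to a kernel with the given center
    and width; values near 1 mean x lies deep inside the kernel's region."""
    return np.exp(-np.sum((x - center) ** 2) / (2.0 * sigma ** 2))

def sample_within_kernel(center, sigma, n_samples, rng):
    """Draw synthetic pattern vectors inside one kernel's subspace region.
    Here we sample from an isotropic Gaussian around the kernel center,
    a simplifying assumption standing in for the paper's screening step."""
    dim = center.shape[0]
    return rng.normal(loc=center, scale=sigma, size=(n_samples, dim))

rng = np.random.default_rng(0)
center = np.array([1.0, 2.0])   # a learned kernel center (hypothetical values)
sigma = 0.5                     # kernel width (hypothetical value)

new_patterns = sample_within_kernel(center, sigma, 5, rng)
# Synthetic patterns cluster near the center, so their kernel activations
# stay high; adding them to training enlarges the class's occupied region.
activations = [rbf_activation(p, center, sigma) for p in new_patterns]
```

In this reading, the classifier is then trained on the union of the original instances and the synthetic pattern vectors, so its decision boundary must clear the whole kernel region rather than only the original points.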
Accelerating Research
Robert Robinson Avenue,
Oxford Science Park, Oxford
OX4 4GP, United Kingdom