Open Access
A Transfer Learning Method for Deep Networks with Small Sample Sizes
Author(s) -
Xin Zheng,
Luyue Lin,
Shouzhi Liang,
Bo Rao,
Ruidian Zhan
Publication year - 2020
Publication title -
Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/1631/1/012072
Subject(s) - computer science , artificial intelligence , transfer learning , regularization , machine learning , support vector machine , similarity , deep learning , hinge loss , sample , representation , pattern recognition , image classification
Transfer learning allows a machine learning model to acquire knowledge from more than one domain, and it is often applied in settings with small sample sizes. Some approaches concentrate on determining correlations among domains, while others focus on transferring knowledge across them. In this paper, building on the SVM with hinge loss, we propose a new regularized transfer learning deep network with a specific regularization, in which a deep network learns a high-level representation of the given samples. A subset of the SVM parameters is shared across domains so that the similarity of their data distributions can be captured. In addition, a modified regularized SVM is exploited so that gradient-based optimization is feasible, which in turn yields a parallel implementation of the proposed method. In the experimental part, comparison of our approach with state-of-the-art approaches demonstrates competitive performance and feasibility in classification.
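The abstract does not give the exact objective, so the following is only a minimal sketch of the general idea it describes: per-domain linear SVMs trained with the hinge loss by subgradient descent, coupled by a regularizer that pulls the domain weight vectors toward their mean as a stand-in for the paper's shared-parameter similarity term. The function names, the coupling term `lam * ||w_k - w_mean||^2`, and all hyperparameters are our assumptions, not the authors' formulation, and the deep representation network is omitted.

```python
import numpy as np

def hinge_loss_grad(w, X, y, C=1.0):
    """Subgradient of 0.5*||w||^2 + C * sum(max(0, 1 - y_i * (x_i . w)))."""
    margins = y * (X @ w)
    active = margins < 1  # samples that violate the margin
    return w - C * (X[active] * y[active, None]).sum(axis=0)

def train_transfer_svm(domains, lam=0.5, lr=0.01, epochs=200, seed=0):
    """Train one linear SVM per domain; the coupling term
    lam * ||w_k - w_mean||^2 (an assumption, standing in for the paper's
    shared parameters) keeps the domain classifiers similar."""
    rng = np.random.default_rng(seed)
    d = domains[0][0].shape[1]
    ws = [rng.normal(scale=0.01, size=d) for _ in domains]
    for _ in range(epochs):
        w_mean = np.mean(ws, axis=0)
        # Each domain's update depends on the others only through w_mean,
        # so the inner loop could run in parallel, as the abstract suggests.
        for k, (X, y) in enumerate(domains):
            g = hinge_loss_grad(ws[k], X, y) + 2 * lam * (ws[k] - w_mean)
            ws[k] -= lr * g
    return ws

# Hypothetical usage on two synthetic, related domains.
rng = np.random.default_rng(1)
Xa = rng.normal(size=(200, 2))
ya = np.where(Xa @ np.array([1.0, 1.0]) > 0, 1.0, -1.0)
Xb = rng.normal(size=(200, 2))
yb = np.where(Xb @ np.array([1.0, 0.8]) > 0, 1.0, -1.0)
ws = train_transfer_svm([(Xa, ya), (Xb, yb)])
acc_a = np.mean(np.where(Xa @ ws[0] > 0, 1.0, -1.0) == ya)
```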
