Open Access
Improvement of the Deep Forest Classifier by a Set of Neural Networks
Author(s) -
Lev V. Utkin,
Kirill D. Zhuk
Publication year - 2020
Publication title -
Informatica
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.172
H-Index - 34
eISSN - 1854-3871
pISSN - 0350-5596
DOI - 10.31449/inf.v44i1.2740
Subject(s) - random forest , artificial neural network , artificial intelligence , computer science , decision tree , classifier (uml) , machine learning , deep neural networks , set (abstract data type) , class (philosophy) , pattern recognition (psychology) , programming language
A Neural Random Forest (NeuRF) and a Neural Deep Forest (NeuDF), classification algorithms that combine an ensemble of decision trees with neural networks, are proposed in the paper. The main idea underlying NeuRF is to combine the class probability distributions produced by decision trees by means of a set of neural networks with shared parameters. The networks are trained with a loss function that measures the classification error. Every neural network can be viewed as a non-linear function of the probabilities of a class. NeuDF is a modification of the Deep Forest, or gcForest, proposed by Zhou and Feng, using NeuRFs. The numerical experiments illustrate the superior performance of NeuDF and show that NeuRF is comparable with the random forest.
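The abstract's core idea — learning how to combine the class probability vectors emitted by individual trees, rather than simply averaging them — can be sketched in a few lines. The sketch below is a heavy simplification and not the paper's method: it replaces the set of shared-parameter neural networks with a single vector of learnable softmax weights over trees, trained by gradient descent on cross-entropy, and all data is synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (all numbers illustrative, not from the paper):
# T decision trees each emit a C-class probability vector per sample.
T, C, N = 5, 3, 200
y = rng.integers(0, C, size=N)                      # true labels
# Simulated tree outputs: noisy near-one-hot distributions around the true class
tree_probs = np.full((N, T, C), 0.1)
for t in range(T):
    wrong = rng.random(N) < 0.3                     # ~30% of each tree's votes are random
    votes = np.where(wrong, rng.integers(0, C, N), y)
    tree_probs[np.arange(N), t, votes] += 0.7
tree_probs /= tree_probs.sum(axis=2, keepdims=True)

# Learnable combiner: softmax weights over trees, trained to minimise
# the cross-entropy of the weighted class distribution (a stand-in for
# the paper's shared neural networks).
w = np.zeros(T)
lr = 1.0
for _ in range(200):
    a = np.exp(w) / np.exp(w).sum()                 # tree weights, sum to 1
    p = np.einsum("t,ntc->nc", a, tree_probs)       # combined distribution
    grad_p = -1.0 / p[np.arange(N), y]              # d(CE)/dp at the true class
    grad_a = np.einsum("n,nt->t", grad_p, tree_probs[np.arange(N), :, y]) / N
    grad_w = a * (grad_a - (a * grad_a).sum())      # backprop through softmax
    w -= lr * grad_w

pred = p.argmax(axis=1)
accuracy = (pred == y).mean()
```

In the actual NeuRF, the combination is a non-linear function of the per-class probabilities computed by shared-parameter networks; the softmax-weighted average above only shows where such a trainable combiner sits in the pipeline.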
