An imprecise deep forest for classification
Published in: Expert Systems with Applications, 2020-03, Vol. 141, p. 112978, Article 112978
Format: Article
Language: English
Online access: Full text
Abstract highlights:
• A modification of the deep forest incorporating imprecision is proposed.
• The modification is based on applying the imprecise Dirichlet model.
• An algorithm implementing the modification is provided.
• The algorithm is compared with the deep forest using real data.
An imprecise deep forest classifier, which can be regarded as a modification of the deep forest proposed by Zhou and Feng, is presented in the paper. In the proposed classifier, the precise class probabilities at the leaf nodes of decision trees in the deep forest are replaced with interval-valued probabilities produced by Walley's imprecise Dirichlet model. The resulting sets of probabilities are averaged over all trees in a random forest as imprecise probability masses. In order to use the stacking algorithm in the deep forest, a number of class probability distributions are generated from the sets of random forest class probabilities, so that the decision trees at the next level of the deep forest cascade are trained on an extended training set augmented with the newly generated class distributions. Numerical examples illustrate how the incorporated imprecision can lead to better results, especially when the training set is rather small. The proposed classifier is the first modification of the deep forest that efficiently deals with small datasets and takes a lack of sufficient training data into account. The ideas underlying the classifier may serve as a basis for developing a set of new machine learning models for small datasets.
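The core of the interval-valued leaf probabilities described above is Walley's imprecise Dirichlet model (IDM): at a leaf holding N training samples with n_k samples of class k, the precise estimate n_k/N is replaced by the interval [n_k/(N+s), (n_k+s)/(N+s)], where s is the IDM hyperparameter. A minimal sketch, assuming the common choice s = 1 (the function name and defaults here are illustrative, not taken from the paper):

```python
import numpy as np

def idm_intervals(counts, s=1.0):
    """Interval-valued class probabilities from Walley's imprecise
    Dirichlet model.  `counts` holds the per-class sample counts at a
    leaf node; `s` is the IDM hyperparameter (s = 1 is a common choice).
    Returns (lower, upper) probability bounds, one pair per class."""
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    lower = counts / (n + s)        # lower bound: n_k / (N + s)
    upper = (counts + s) / (n + s)  # upper bound: (n_k + s) / (N + s)
    return lower, upper

# Example: a leaf that saw 3 samples of class 0 and 1 of class 1.
low, up = idm_intervals([3, 1], s=1.0)
# low = [0.6, 0.2], up = [0.8, 0.4]
```

The interval width s/(N+s) shrinks as the leaf accumulates more samples, which is exactly how the model encodes the lack of sufficient training data: small leaves produce wide, cautious intervals, while well-populated leaves approach the precise estimate n_k/N.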
ISSN: 0957-4174, 1873-6793
DOI: 10.1016/j.eswa.2019.112978