Pattern Generation through Feature Values Modification and Decision Tree Ensemble Construction

Bibliographic Details
Published in: International Journal of Machine Learning and Computing, 2013-08, Vol. 3 (4), pp. 322-331
Authors: Akhand, M. A. H., Rahman, M. M. Hafizur, Murase, K.
Format: Article
Language: English
Online access: Full text
Description
Abstract: An ensemble method produces diverse classifiers and combines their decisions to form the ensemble's decision. A number of methods have been investigated for constructing ensembles, some of which train the individual classifiers on generated patterns. This study investigates a new training-pattern generation technique that is simple and effective for ensemble construction. The method replaces the feature values of some patterns with the values of other patterns, so that each classifier is trained on a different set of patterns. The ensemble of decision trees based on the proposed technique was evaluated on a suite of 30 benchmark classification problems and was found to achieve performance better than or competitive with related conventional methods. Furthermore, two hybrid ensemble methods were investigated that incorporate the proposed pattern generation technique into two popular ensemble methods, bagging and the random subspace method (RSM). It is found that the performance of the bagging and RSM algorithms can be improved by incorporating feature values modification into their training processes. Experimental investigation of different modification schemes finds that modifying feature values with the values of patterns from the same class gives better generalization.
ISSN: 2010-3700
DOI: 10.7763/IJMLC.2013.V3.331
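
The abstract describes the technique only in prose; the following Python sketch illustrates how feature values modification could be used to build a decision tree ensemble, assuming scikit-learn decision trees and majority voting. The function names (fvm_ensemble, fvm_predict), the modification_rate parameter, and the exact rule for choosing which cells to overwrite are illustrative assumptions based on the abstract, not the authors' implementation.

# Minimal sketch of feature-values-modification (FVM) ensemble construction,
# based only on the abstract's description. Parameter names, the modification
# rate, and the donor-selection rule are assumptions.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def fvm_ensemble(X, y, n_trees=10, modification_rate=0.2, same_class=True, seed=None):
    """Train decision trees, each on a copy of the training set in which a
    fraction of feature values is overwritten with values taken from other
    patterns (by default, patterns of the same class)."""
    rng = np.random.default_rng(seed)
    n_samples, n_features = X.shape
    trees = []
    for _ in range(n_trees):
        X_mod = X.astype(float)                       # modified copy for this tree
        # choose which (pattern, feature) cells to overwrite in this copy
        mask = rng.random((n_samples, n_features)) < modification_rate
        for i in range(n_samples):
            # donor patterns: same class as pattern i, or the whole training set
            donors = np.flatnonzero(y == y[i]) if same_class else np.arange(n_samples)
            for j in np.flatnonzero(mask[i]):
                X_mod[i, j] = X[rng.choice(donors), j]
        trees.append(DecisionTreeClassifier(random_state=int(rng.integers(1 << 31))).fit(X_mod, y))
    return trees

def fvm_predict(trees, X):
    """Combine the individual trees' decisions by plain majority vote."""
    votes = np.stack([t.predict(X) for t in trees])   # shape: (n_trees, n_samples)
    preds = []
    for column in votes.T:                            # one column per test pattern
        values, counts = np.unique(column, return_counts=True)
        preds.append(values[np.argmax(counts)])
    return np.array(preds)

On a benchmark dataset such as those mentioned in the abstract, one would call trees = fvm_ensemble(X_train, y_train) and then fvm_predict(trees, X_test); the hybrid variants described in the abstract would apply the same modification step inside a bagging or RSM training loop.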