Mutual information based input feature selection for classification problems


Bibliographic details
Published in: Decision Support Systems, December 2012, Vol. 54 (1), pp. 691-698
Main authors: Cang, Shuang; Yu, Hongnian
Format: Article
Language: English
Online access: Full text
Description
Abstract: The elimination process aims to reduce the size of the input feature set while retaining the class discriminatory information for classification problems. This paper investigates feature selection approaches for classification problems and proposes a new feature selection algorithm based on the mutual information (MI) concept from information theory. The proposed algorithm calculates the MI between combinations of input features and the class, rather than between a single input feature and the class, for both continuous-valued and discrete-valued features. Three experimental tests are conducted to evaluate the proposed algorithm. Comparison studies with previously published classification algorithms indicate that the proposed algorithm is robust, stable and efficient.
► Proposes an algorithm using mutual information between combinations of input features and the class.
► The proposed algorithm can effectively identify the optimal feature space of input features.
► The proposed algorithm yields significant improvement on classification problems.
► Experimental results demonstrate that the proposed algorithm is robust, stable and efficient.
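
The abstract's central idea is to score a combination of features by its mutual information with the class, rather than scoring each feature individually. The Python sketch below is only a rough illustration of that idea under simplifying assumptions (discrete features, plug-in entropy estimates, greedy forward selection); the function names and selection strategy are illustrative and are not taken from the paper.

import numpy as np

def discrete_mi(x, y):
    # Plug-in estimate of I(X; Y) for discrete data:
    # I(X; Y) = H(X) + H(Y) - H(X, Y), with entropies from empirical frequencies.
    def entropy(a):
        _, counts = np.unique(a, axis=0, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))
    return entropy(x) + entropy(y) - entropy(np.column_stack([x, y]))

def greedy_mi_selection(X, y, n_select):
    # Greedy forward selection (illustrative): at each step add the feature whose
    # inclusion maximizes the MI between the selected feature combination and the class.
    selected, remaining = [], list(range(X.shape[1]))
    for _ in range(n_select):
        scores = {f: discrete_mi(X[:, selected + [f]], y) for f in remaining}
        best = max(scores, key=scores.get)
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy check: the class is fully determined by features 0 and 2.
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(300, 6))
y = X[:, 0] + X[:, 2]
print(greedy_mi_selection(X, y, 2))   # expected to return [0, 2] or [2, 0]

Scoring the joint combination lets the second step account for information already carried by the first selected feature, which is what distinguishes this kind of criterion from ranking features by single-feature MI.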
ISSN: 0167-9236
eISSN: 1873-5797
DOI: 10.1016/j.dss.2012.08.014