Incorporating Bagging into Boosting
| Main authors: | , |
| --- | --- |
| Format: | Conference paper |
| Language: | English |
| Online access: | Order full text |
| Summary: | In classification learning, the group (ensemble) learning approach yields better predictive accuracy than a single classifier. A group of classifiers can be formed by repeatedly applying a single base learning algorithm, and the members of the group make the final classification by voting. Boosting and Bagging are two popular methods of this kind, and both decrease the error rate of decision tree learning. Boosting is more accurate than Bagging, but the former is more variable than the latter. In this paper, we review the state of the art in group learning techniques in the framework of imbalanced data sets. We propose a new group learning algorithm called Incorporating Bagging into Boosting (IB), which creates a number of subgroups by incorporating Bagging into Boosting. Experimental results on natural domains show that, on average, IB is more stable than either Bagging or Boosting. These characteristics make IB a good choice among group learning techniques. |
| DOI: | 10.1109/HIS.2012.6421375 |
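
The record does not include the authors' implementation, so the following is only a minimal sketch of the idea as the summary describes it: an AdaBoost-style Boosting loop whose base learner is itself a Bagging ensemble, so each boosting round produces one bagged "subgroup". It is built by composing off-the-shelf scikit-learn estimators (parameter names assume scikit-learn >= 1.2); the data set, tree depth, and ensemble sizes are illustrative choices, not values from the paper.

```python
# Hypothetical IB-style composition: Boosting over bagged subgroups of trees.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Imbalanced toy data, echoing the paper's focus on imbalanced data sets.
X, y = make_classification(n_samples=500, weights=[0.9, 0.1], random_state=0)

# Each boosting round trains one "subgroup": a bagged ensemble of shallow trees.
subgroup = BaggingClassifier(
    estimator=DecisionTreeClassifier(max_depth=1),
    n_estimators=10,
    random_state=0,
)

# Boosting over the bagged subgroups; the final class is a weighted vote.
ib = AdaBoostClassifier(estimator=subgroup, n_estimators=20, random_state=0)

print("IB cross-val accuracy: %.3f" % cross_val_score(ib, X, y, cv=5).mean())
```

The intended effect of this composition matches the summary's claim: averaging within each bagged subgroup damps the variance that plain Boosting is prone to, while the outer boosting loop keeps the accuracy advantage over plain Bagging.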