Pruning population size in XCS for complex problems



Bibliographic Details
Main Authors: Rakitsch, Barbara; Bernauer, Andreas; Bringmann, Oliver; Rosenstiel, Wolfgang
Format: Conference paper
Language: English
Description
Summary: In this paper, we show how to prune the population size of the Learning Classifier System XCS for complex problems. We call a problem complex when the number of specified bits of the optimal start classifiers (the problem dimension) is not constant. First, we derive how to estimate an equivalent problem dimension for complex problems from the optimal start classifiers. With the equivalent problem dimension, we calculate the optimal maximum population size in the same way as has already been established for regular problems. We validate our results empirically. Furthermore, we introduce a subsumption method to reduce the number of classifiers. In contrast to existing methods, we subsume the classifiers after the learning process, so subsumption does not hinder the evolution of optimal classifiers, a problem that has been reported previously. After subsumption, the number of classifiers drops to roughly the order of magnitude of the number of optimal classifiers, while the correctness rate stays nearly constant.
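The post-learning subsumption described in the summary can be illustrated with a short sketch. This is not the authors' implementation; it assumes the usual ternary XCS conditions ('0'/'1'/'#'), a hypothetical Classifier record, and the standard XCS subsumption criterion (a strictly more general classifier with the same action that is accurate and sufficiently experienced). The threshold values are placeholders.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical XCS classifier with a ternary condition string,
# e.g. "1#0#" where '#' is a don't-care symbol.
@dataclass
class Classifier:
    condition: str
    action: int
    prediction: float
    error: float       # prediction error (epsilon)
    experience: int    # number of parameter updates

def is_strictly_more_general(general: str, specific: str) -> bool:
    """True if `general` matches every input that `specific` matches, and more."""
    return general != specific and all(g == '#' or g == s
                                       for g, s in zip(general, specific))

def may_subsume(cl: Classifier, error_threshold: float, exp_threshold: int) -> bool:
    """Only accurate, experienced classifiers are allowed to subsume others."""
    return cl.error < error_threshold and cl.experience > exp_threshold

def post_learning_subsumption(population: List[Classifier],
                              error_threshold: float = 10.0,
                              exp_threshold: int = 20) -> List[Classifier]:
    """Remove every classifier that is subsumed by a strictly more general,
    accurate classifier with the same action. Applied once after learning has
    finished, so it cannot interfere with the evolutionary search."""
    kept: List[Classifier] = []
    for cl in population:
        subsumed = any(
            other.action == cl.action
            and may_subsume(other, error_threshold, exp_threshold)
            and is_strictly_more_general(other.condition, cl.condition)
            for other in population
        )
        if not subsumed:
            kept.append(cl)
    return kept
```

Because the pass runs only once on the final population, the surviving classifiers are exactly those not covered by a more general accurate rule, which is consistent with the summary's observation that the population shrinks toward the set of optimal classifiers without affecting the correctness rate.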
ISSN: 2161-4393, 2161-4407
DOI: 10.1109/IJCNN.2010.5596377