Pruning population size in XCS for complex problems
In this paper, we show how to prune the population size of the Learning Classifier System XCS for complex problems. We say a problem is complex when the number of specified bits of the optimal start classifiers (the problem dimension) is not constant. First, we derive how to estimate an equivalent...
Saved in:
Main authors: | Rakitsch, Barbara; Bernauer, Andreas; Bringmann, Oliver; Rosenstiel, Wolfgang |
---|---|
Format: | Conference Proceeding |
Language: | eng |
Subjects: | Degradation; Estimation; Gallium; Genetic algorithms; Probability; Resource management; System-on-a-chip |
Online access: | Order full text |
container_end_page | 8 |
---|---|
container_issue | |
container_start_page | 1 |
container_title | |
container_volume | |
creator | Rakitsch, Barbara; Bernauer, Andreas; Bringmann, Oliver; Rosenstiel, Wolfgang |
description | In this paper, we show how to prune the population size of the Learning Classifier System XCS for complex problems. We say a problem is complex when the number of specified bits of the optimal start classifiers (the problem dimension) is not constant. First, we derive how to estimate an equivalent problem dimension for complex problems based on the optimal start classifiers. With the equivalent problem dimension, we calculate the optimal maximum population size just as has already been done for regular problems. We empirically validate our results. Furthermore, we introduce a subsumption method to reduce the number of classifiers. In contrast to existing methods, we subsume the classifiers after the learning process, so subsuming does not hinder the evolution of optimal classifiers, which has been reported previously. After subsumption, the number of classifiers drops to about the order of magnitude of the optimal classifiers while the correctness rate stays nearly constant. |
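The abstract describes subsuming classifiers after the learning process, so that more general classifiers absorb the specific ones they cover. A minimal sketch of what such post-learning subsumption can look like for XCS-style ternary conditions ('0', '1', '#' = don't care); the `Classifier` fields and the `is_more_general` helper are illustrative assumptions, not the paper's actual implementation:

```python
# Illustrative sketch of post-learning subsumption for XCS-style classifiers.
from dataclasses import dataclass

@dataclass
class Classifier:
    condition: str  # ternary string, e.g. "1#0#" ('#' matches both 0 and 1)
    action: int

def is_more_general(general: str, specific: str) -> bool:
    """True if `general` matches every input that `specific` matches."""
    return all(g == '#' or g == s for g, s in zip(general, specific))

def subsume(population: list[Classifier]) -> list[Classifier]:
    """Drop every classifier covered by a strictly more general one
    that advocates the same action."""
    kept: list[Classifier] = []
    for c in population:
        # Skip c if an already-kept classifier subsumes it.
        if any(k.action == c.action and k.condition != c.condition
               and is_more_general(k.condition, c.condition) for k in kept):
            continue
        # Remove previously kept classifiers that c subsumes.
        kept = [k for k in kept
                if not (k.action == c.action and k.condition != c.condition
                        and is_more_general(c.condition, k.condition))]
        kept.append(c)
    return kept
```

Because this runs only once, after training, it cannot interfere with the evolutionary search the way in-loop subsumption reportedly can; the surviving set shrinks toward the order of magnitude of the optimal classifiers.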
doi_str_mv | 10.1109/IJCNN.2010.5596377 |
format | Conference Proceeding |
isbn | 9781424469161 1424469163 |
eisbn | 1424469171 9781424469178 142446918X 9781424469185 |
publisher | IEEE |
fulltext | fulltext_linktorsrc |
identifier | ISSN: 2161-4393 |
ispartof | The 2010 International Joint Conference on Neural Networks (IJCNN), 2010, p.1-8 |
issn | 2161-4393 2161-4407 |
language | eng |
recordid | cdi_ieee_primary_5596377 |
source | IEEE Electronic Library (IEL) Conference Proceedings |
subjects | Degradation; Estimation; Gallium; Genetic algorithms; Probability; Resource management; System-on-a-chip |
title | Pruning population size in XCS for complex problems |