A wide interpretable Gaussian Takagi–Sugeno–Kang fuzzy classifier and its incremental learning
While wide fuzzy classifiers which combine several TSK fuzzy sub-classifiers with some aggregation strategy like weighting have been widely developed to achieve good classification performance and/or interpretability, the aggregation strategies inevitably deteriorate the interpretability of the whole ensemble and hinder them from developing the corresponding incremental algorithm easily yet efficiently for online learning.
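The abstract describes rules whose Gaussian antecedents are built on randomly chosen features, with a zero-order (constant-consequent) TSK output. As a rough illustration only, here is a minimal generic sketch of that mechanism; it is not the authors' WIG-TSK implementation, and the class name `ZeroOrderTSKRule`, the untrained random consequents, and the product t-norm choice are all assumptions for the sake of the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_membership(x, centers, widths):
    # Gaussian membership degree of each selected feature value
    return np.exp(-((x - centers) ** 2) / (2.0 * widths ** 2))

class ZeroOrderTSKRule:
    """One fuzzy rule: Gaussian antecedent on a random feature subset,
    constant (zero-order) consequent per class. Hypothetical sketch."""
    def __init__(self, n_features, n_selected, n_classes):
        # random feature selection keeps each antecedent short and readable
        self.idx = rng.choice(n_features, size=n_selected, replace=False)
        self.centers = rng.normal(size=n_selected)
        self.widths = np.full(n_selected, 1.0)
        # in a real classifier these constants would be learned, not random
        self.consequent = rng.normal(size=n_classes)

    def firing(self, x):
        # product t-norm over the memberships of the selected features
        return np.prod(gaussian_membership(x[self.idx], self.centers, self.widths))

def predict(rules, x):
    f = np.array([r.firing(x) for r in rules])
    w = f / (f.sum() + 1e-12)          # normalized firing strengths
    scores = sum(wi * r.consequent for wi, r in zip(w, rules))
    return int(np.argmax(scores))      # class with the largest weighted score
```

Because each rule only ever sees its own feature subset, the number of antecedent conditions per rule stays fixed as dimensionality grows, which is the intuition behind the "avoidance of rule-explosion" claim in the abstract.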
Published in: | Knowledge-based systems 2022-04, Vol.241, p.108203, Article 108203 |
---|---|
Main Authors: | Xie, Runshan; Wang, Shitong |
Format: | Article |
Language: | eng |
Subjects: | |
Online Access: | Full text |
container_end_page | |
---|---|
container_issue | |
container_start_page | 108203 |
container_title | Knowledge-based systems |
container_volume | 241 |
creator | Xie, Runshan; Wang, Shitong |
description | While wide fuzzy classifiers which combine several TSK fuzzy sub-classifiers with some aggregation strategy like weighting have been widely developed to achieve good classification performance and/or interpretability, the aggregation strategies inevitably deteriorate the interpretability of the whole ensemble and hinder them from developing the corresponding incremental algorithm easily yet efficiently for online learning. To avoid this weakness and ensure enhanced generalization capability and interpretability, this work attempts to develop a novel wide interpretable Gaussian Takagi–Sugeno–Kang (TSK) fuzzy classifier (WIG-TSK) by simultaneously training all the Gaussian TSK fuzzy sub-classifiers without any aggregation strategy, just like a single structure. Each fuzzy rule of WIG-TSK generates its Gaussian fuzzy antecedent on randomly chosen features by exerting the Gaussian random selection on the original features of data. WIG-TSK has four merits: (1) Gaussian random selection of features guarantees both its easy interpretability and the avoidance of the rule-explosion curse, and hence WIG-TSK is suitable for multi- and even high-dimensional data. (2) All the sub-classifiers of WIG-TSK can be simultaneously trained in parallel without any aggregation strategy, and hence the corresponding incremental learning can be developed to dynamically update the whole structure of WIG-TSK. (3) WIG-TSK has a solid theoretical guarantee of being structurally equivalent to a more complicated zero-order Gaussian TSK fuzzy classifier. (4) WIG-TSK theoretically has better generalization capability than the corresponding wide combination structure. The effectiveness of WIG-TSK is demonstrated by experimental results on eighteen benchmark datasets and a commonly used UCI dataset in terms of classification accuracy, running speed and interpretability.
•We design a wide interpretable TSK fuzzy classifier, WIG-TSK, which can simultaneously train all its sub-classifiers without any aggregation strategy. •We provide theoretical justifications for WIG-TSK regarding its enhanced generalization capability and structural equivalence. •We develop the incremental version of WIG-TSK for online learning. |
doi_str_mv | 10.1016/j.knosys.2022.108203 |
format | Article |
fulltext | fulltext |
identifier | ISSN: 0950-7051 |
ispartof | Knowledge-based systems, 2022-04, Vol.241, p.108203, Article 108203 |
issn | 0950-7051; 1872-7409 |
language | eng |
recordid | cdi_proquest_journals_2642938871 |
source | Elsevier ScienceDirect Journals |
subjects | Agglomeration; Algorithms; Classification; Classifiers; Datasets; Distance learning; Gaussian TSK fuzzy classifiers; Generalization capability; Incremental algorithm; Interpretability; Machine learning; Wide combination |
title | A wide interpretable Gaussian Takagi–Sugeno–Kang fuzzy classifier and its incremental learning |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-05T07%3A43%3A40IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=A%20wide%20interpretable%20Gaussian%20Takagi%E2%80%93Sugeno%E2%80%93Kang%20fuzzy%20classifier%20and%20its%20incremental%20learning&rft.jtitle=Knowledge-based%20systems&rft.au=Xie,%20Runshan&rft.date=2022-04-06&rft.volume=241&rft.spage=108203&rft.pages=108203-&rft.artnum=108203&rft.issn=0950-7051&rft.eissn=1872-7409&rft_id=info:doi/10.1016/j.knosys.2022.108203&rft_dat=%3Cproquest_cross%3E2642938871%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2642938871&rft_id=info:pmid/&rft_els_id=S0950705122000521&rfr_iscdi=true |