Sparsity-driven weighted ensemble classifier

In this study, a novel sparsity-driven weighted ensemble classifier (SDWEC) that improves classification accuracy and minimizes the number of classifiers is proposed. Using pre-trained classifiers, an ensemble is formed in which the base classifiers vote according to assigned weights; these weights directly affect ensemble accuracy. In the proposed method, the problem of finding the ensemble weights is modeled as a cost function with the following terms: (a) a data-fidelity term that aims to decrease the misclassification rate, (b) a sparsity term that aims to decrease the number of classifiers, and (c) a non-negativity constraint on the classifier weights. Since the proposed cost function is non-convex and therefore hard to solve, convex relaxation techniques and novel approximations are employed to obtain a numerically efficient solution. The sparsity term allows a trade-off between accuracy and testing time when needed. The efficiency of SDWEC was evaluated on 11 datasets and compared with state-of-the-art classifier ensemble methods. The results show that SDWEC achieves better or similar accuracy while using fewer classifiers and reducing the testing time of the ensemble.
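
To make the idea in the abstract concrete, below is a minimal, hypothetical Python sketch of a sparsity-driven weighted ensemble. It assumes binary labels in {-1, +1} and replaces the paper's actual cost function, convex relaxation, and approximations with a simple non-negative, L1-regularized least-squares surrogate solved by projected gradient descent; it is not the authors' algorithm or code. The helper names (fit_ensemble_weights, predict) and parameters (lam, lr, n_iter) are illustrative assumptions.

    import numpy as np

    def fit_ensemble_weights(H, y, lam=0.1, lr=0.01, n_iter=2000):
        # H: (n_samples, n_classifiers) matrix of base-classifier outputs in {-1, +1}
        # y: (n_samples,) true labels in {-1, +1}; lam controls the sparsity strength
        n_samples, n_classifiers = H.shape
        w = np.full(n_classifiers, 1.0 / n_classifiers)        # start from uniform voting
        for _ in range(n_iter):
            # gradient of the data-fidelity term plus the L1 term (equal to sum(w) when w >= 0)
            grad = 2.0 * H.T @ (H @ w - y) / n_samples + lam
            w = np.maximum(w - lr * grad, 0.0)                  # project onto the non-negative orthant
        return w

    def predict(H_new, w):
        # weighted vote; classifiers whose weight is exactly zero can be dropped entirely
        return np.sign(H_new @ w)

    # toy usage with synthetic base-classifier outputs (10 noisy copies of the labels)
    rng = np.random.default_rng(0)
    y = rng.choice([-1.0, 1.0], size=200)
    H = np.where(rng.random((200, 10)) < 0.8, y[:, None], -y[:, None])
    w = fit_ensemble_weights(H, y, lam=0.2)
    print("non-zero weights:", np.count_nonzero(w), "accuracy:", float(np.mean(predict(H, w) == y)))

Classifiers whose weights are driven to zero by the sparsity term can be removed from the ensemble, which is where the accuracy versus testing-time trade-off described in the abstract comes from.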

Bibliographic details
Published in: arXiv.org, 2019-06
Main authors: Ozgur, Atilla; Erdem, Hamit; Nar, Fatih
Format: Article
Language: English
Subjects: Accuracy; Classifiers; Computer Science - Learning; Cost function; Mathematical models; Sparsity; Statistics - Machine Learning; Weight
Online access: Full text
DOI: 10.48550/arXiv.1610.00270
EISSN: 2331-8422