Incorporating Bagging into Boosting

In classification learning, a group (ensemble) of classifiers often predicts more accurately than any single classifier. A group is formed by repeatedly applying a single base learning algorithm, and its members produce the final classification by voting. Boosting and Bagging are two popular methods of this kind, and both reduce the error rate of decision tree learning. Boosting is generally more accurate than Bagging, but the former is also more variable than the latter. In this paper we review the state of the art in group learning techniques within the framework of imbalanced data sets, and we propose a new group learning algorithm, Incorporating Bagging into Boosting (IB), which creates a number of subgroups by incorporating Bagging into Boosting. Experimental results on natural domains show that, on average, IB is more stable than either Bagging or Boosting alone. These characteristics make IB a good choice among group learning techniques.
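
The paper's IB pseudocode is not reproduced in this record, but the idea the abstract describes can be sketched: build bagged subgroups of trees and combine them with a boosting procedure, then compare stability against plain Bagging and Boosting. The sketch below uses scikit-learn's AdaBoostClassifier and BaggingClassifier; the particular nesting (a bagged committee as each boosting stage), the dataset, and all parameter values are assumptions for illustration, not the authors' exact algorithm.

```python
# A minimal, illustrative sketch of the abstract's idea: bagged subgroups
# combined inside a boosting-style committee. This is NOT the authors'
# exact IB algorithm; the composition and parameter values are assumptions.
# Assumes scikit-learn >= 1.2 (for the `estimator` keyword).
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

# Each boosting stage is itself a small bagged subgroup of decision trees,
# pairing boosting's accuracy with bagging's variance reduction.
bagged_subgroup = BaggingClassifier(
    estimator=DecisionTreeClassifier(max_depth=3),
    n_estimators=5,   # trees per subgroup (assumed value)
    random_state=0,
)
ib_like = AdaBoostClassifier(
    estimator=bagged_subgroup,
    n_estimators=10,  # number of boosted subgroups (assumed value)
    random_state=0,
)

# Stability here is read off the spread (std) of cross-validated accuracy.
for name, clf in [
    ("Bagging", BaggingClassifier(
        estimator=DecisionTreeClassifier(max_depth=3),
        n_estimators=50, random_state=0)),
    ("Boosting", AdaBoostClassifier(
        estimator=DecisionTreeClassifier(max_depth=3),
        n_estimators=50, random_state=0)),
    ("IB-like", ib_like),
]:
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name:9s} mean acc = {scores.mean():.3f}  std = {scores.std():.3f}")
```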


Bibliographic Details
Main authors: Jain, K.; Kulkarni, S.
Format: Conference proceeding
Language: English
Published in: 2012 12th International Conference on Hybrid Intelligent Systems (HIS), 2012, pp. 443-448
Publisher: IEEE
DOI: 10.1109/HIS.2012.6421375
ISBN: 9781467351140
Source: IEEE Electronic Library (IEL) Conference Proceedings
Subjects: Bagging; Barium; Boosting; Classification algorithms; classification group learning algorithm; Error analysis; Radio frequency; random forest; Training