Improving ensemble decision tree performance using Adaboost and Bagging

Ensemble classifier systems are considered among the most promising for medical data classification, and the performance of a decision tree classifier can be increased by ensemble methods, which are proven to outperform single classifiers. However, in an ensemble setting the performance depend...

Detailed description

Saved in:
Bibliographic details
Format: Conference Proceeding
Language: eng
Subjects:
Online access: Full text
container_issue 1
container_volume 1691
description Ensemble classifier systems are considered among the most promising for medical data classification, and the performance of a decision tree classifier can be increased by ensemble methods, which are proven to outperform single classifiers. However, in an ensemble setting the performance depends on the selection of a suitable base classifier. This research employed two prominent ensemble methods, namely AdaBoost and Bagging, with independently selected base classifiers such as Random Forest, Random Tree, J48, J48graft and Logistic Model Trees (LMT). The empirical study shows that performance varies when different base classifiers are selected, and in some cases overfitting issues were also noted. The evidence shows that ensemble decision tree classifiers using AdaBoost and Bagging improve the performance on selected medical data sets.
doi_str_mv 10.1063/1.4937027
format Conference Proceeding
identifier ISSN: 0094-243X
ispartof AIP conference proceedings, 2015, Vol.1691 (1)
issn 0094-243X
1551-7616
language eng
recordid cdi_proquest_journals_2123768914
source AIP Journals Complete
subjects Bagging
Classifiers
Decision trees
Machine learning
Performance enhancement
Regression models
title Improving ensemble decision tree performance using Adaboost and Bagging
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-07T21%3A34%3A44IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest&rft_val_fmt=info:ofi/fmt:kev:mtx:book&rft.genre=proceeding&rft.atitle=Improving%20ensemble%20decision%20tree%20performance%20using%20Adaboost%20and%20Bagging&rft.btitle=AIP%20conference%20proceedings&rft.date=2015-12-11&rft.volume=1691&rft.issue=1&rft.issn=0094-243X&rft.eissn=1551-7616&rft_id=info:doi/10.1063/1.4937027&rft_dat=%3Cproquest%3E2123768914%3C/proquest%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2123768914&rft_id=info:pmid/&rfr_iscdi=true