Multistep sequential exploration of growing Bayesian classification models

If the collection of training data is costly, one can gain by actively selecting particularly informative data points in a sequential way. In a Bayesian decision-theoretic framework we develop a query selection criterion for classification models which explicitly takes into account the utility of decisions. We determine the overall utility and its derivative with respect to changes of the queries. An optimal query may then be obtained by stochastic hill climbing. Simultaneously, the model structure can be adapted by reversible jump Markov chain Monte Carlo.

Detailed description

Saved in:
Bibliographic details
Main authors: Paass, C., Kindermann, J.
Format: Conference proceedings
Language: eng
Subjects:
Online access: Order full text
container_end_page 571
container_issue
container_start_page 566
container_title Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks. IJCNN 2000. Neural Computing: New Challenges and Perspectives for the New Millennium
container_volume 3
creator Paass, C.
Kindermann, J.
description If the collection of training data is costly, one can gain by actively selecting particularly informative data points in a sequential way. In a Bayesian decision-theoretic framework we develop a query selection criterion for classification models which explicitly takes into account the utility of decisions. We determine the overall utility and its derivative with respect to changes of the queries. An optimal query may then be obtained by stochastic hill climbing. Simultaneously, the model structure can be adapted by reversible jump Markov chain Monte Carlo.
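The abstract sketches a decision-theoretic active-learning loop: choose the query whose labelling maximizes the expected utility of the subsequent decision, and search for that query by stochastic hill climbing. The following is a minimal illustrative sketch of that idea, not the authors' implementation: it uses a toy 1-D threshold classifier, Monte Carlo posterior samples standing in for the paper's reversible jump MCMC output, and hypothetical names (expected_utility, hill_climb, utility matrix U).

```python
# Minimal sketch (assumptions, not the paper's method): decision-theoretic
# query selection with stochastic hill climbing over the query location.
import numpy as np

rng = np.random.default_rng(0)

# Utility of deciding class d (rows) when the true class is c (columns).
U = np.array([[1.0, -2.0],
              [-1.0, 1.0]])

def likelihood(y, x, theta):
    """P(y=1 | x, theta) for a noisy 1-D threshold classifier."""
    p1 = 0.9 if x >= theta else 0.1
    return p1 if y == 1 else 1.0 - p1

def expected_utility(x, theta_samples):
    """Expected utility of the optimal decision after querying the label at x."""
    total = 0.0
    for y in (0, 1):
        # Predictive probability of observing label y at x under the posterior.
        w = np.array([likelihood(y, x, t) for t in theta_samples])
        p_y = w.mean()
        if p_y < 1e-12:
            continue
        # Reweight posterior samples by the likelihood of the hypothetical label y,
        # then compute the class probabilities at x under the updated posterior.
        w /= w.sum()
        p1 = np.sum(w * np.array([likelihood(1, x, t) for t in theta_samples]))
        p_class = np.array([1.0 - p1, p1])
        total += p_y * np.max(U @ p_class)   # utility of the best decision
    return total

def hill_climb(theta_samples, x0=0.0, steps=200, scale=0.5):
    """Stochastic hill climbing: accept random perturbations that improve utility."""
    x, best = x0, expected_utility(x0, theta_samples)
    for _ in range(steps):
        cand = x + rng.normal(0.0, scale)
        val = expected_utility(cand, theta_samples)
        if val > best:
            x, best = cand, val
    return x, best

# Posterior samples of the unknown decision boundary (stand-in for MCMC output).
theta_samples = rng.normal(loc=1.0, scale=0.8, size=200)
x_star, u_star = hill_climb(theta_samples)
print(f"selected query x = {x_star:.3f}, expected utility = {u_star:.3f}")
```

The design choice illustrated here is the paper's core idea at a toy scale: the query is scored by the utility of the decision it enables, not by information gain alone, and the score is optimized by random local search rather than enumeration.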
doi_str_mv 10.1109/IJCNN.2000.861371
format Conference Proceeding
publisher IEEE
isbn 9780769506197
0769506194
link https://ieeexplore.ieee.org/document/861371
fulltext fulltext_linktorsrc
identifier ISSN: 1098-7576
ispartof Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks. IJCNN 2000. Neural Computing: New Challenges and Perspectives for the New Millennium, 2000, Vol.3, p.566-571 vol.3
issn 1098-7576
1558-3902
language eng
recordid cdi_ieee_primary_861371
source IEEE Electronic Library (IEL) Conference Proceedings
subjects Bayesian methods
Design for experiments
Laboratories
Monte Carlo methods
Neural networks
Sampling methods
Simulated annealing
Stochastic processes
Training data
Utility theory
title Multistep sequential exploration of growing Bayesian classification models