Ensembles of Nearest Neighbor Forecasts
Nearest neighbor forecasting models are attractive for their simplicity and their ability to predict complex nonlinear behavior. They rely on the assumption that observations similar to the target one are also likely to have similar outcomes. A common practice in nearest neighbor model selection is to compute the globally optimal number of neighbors on a validation set, which is then applied to all incoming queries. For certain queries, however, this number may be suboptimal, yielding forecasts that deviate substantially from the true realization.
Saved in:
Main authors: | Yankov, Dragomir; DeCoste, Dennis; Keogh, Eamonn |
---|---|
Format: | Book Chapter |
Language: | English |
Subjects: | Applied sciences; Artificial intelligence; Time Series Prediction |
Online access: | Full text |
container_end_page | 556 |
---|---|
container_issue | |
container_start_page | 545 |
container_title | Lecture notes in computer science |
container_volume | |
creator | Yankov, Dragomir; DeCoste, Dennis; Keogh, Eamonn |
description | Nearest neighbor forecasting models are attractive for their simplicity and their ability to predict complex nonlinear behavior. They rely on the assumption that observations similar to the target one are also likely to have similar outcomes. A common practice in nearest neighbor model selection is to compute the globally optimal number of neighbors on a validation set, which is then applied to all incoming queries. For certain queries, however, this number may be suboptimal, yielding forecasts that deviate substantially from the true realization.
To address this problem, we propose an alternative approach: training ensembles of nearest neighbor predictors that determine the best number of neighbors for individual queries. We demonstrate that the forecasts of the ensembles improve significantly on the globally optimal single predictors. |
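The approach described above can be sketched in a few lines. The sketch below is illustrative only: the per-query selection heuristic (scoring each candidate number of neighbors k on the query's nearest historical windows, whose true next values are known) is an assumption for the example, not the authors' exact ensemble scheme, and the function names are hypothetical.

```python
import numpy as np

def knn_forecast(series, query, k, window):
    """Forecast the value following `query` (length `window`) as the
    mean successor of its k nearest historical windows."""
    n = len(series) - window
    windows = np.array([series[i:i + window] for i in range(n)])
    successors = np.asarray(series[window:window + n])
    dists = np.linalg.norm(windows - query, axis=1)  # Euclidean distances
    nearest = np.argsort(dists)[:k]
    return successors[nearest].mean()

def best_k_for_query(series, query, ks, window, n_val=5):
    """Pick a per-query k: score each candidate k on the query's n_val
    nearest historical windows, whose true next values are known, and
    return the k with the smallest squared error.  (For simplicity a
    validation window is not excluded from its own neighbor search, so
    the scores are optimistic.)"""
    n = len(series) - window
    windows = np.array([series[i:i + window] for i in range(n)])
    successors = np.asarray(series[window:window + n])
    dists = np.linalg.norm(windows - query, axis=1)
    val_idx = np.argsort(dists)[:n_val]
    errors = [
        sum((knn_forecast(series, windows[i], k, window) - successors[i]) ** 2
            for i in val_idx)
        for k in ks
    ]
    return ks[int(np.argmin(errors))]
```

On a perfectly periodic series every exact-match neighbor shares the query's successor, so the forecast is exact; on real data the chosen k varies from query to query, which is the behavior the globally optimal single predictor cannot capture.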
doi_str_mv | 10.1007/11871842_51 |
format | Book Chapter |
fullrecord | Ensembles of Nearest Neighbor Forecasts. Yankov, Dragomir; DeCoste, Dennis; Keogh, Eamonn. Edited by Fürnkranz, Johannes; Spiliopoulou, Myra; Scheffer, Tobias. In: Lecture notes in computer science (Lecture Notes in Computer Science series), Springer Berlin Heidelberg, Berlin, Heidelberg, 2006, pp. 545-556 (12 pages). ISSN: 0302-9743; EISSN: 1611-3349; ISBN: 9783540453758, 354045375X; EISBN: 9783540460565, 354046056X. DOI: 10.1007/11871842_51. Rights: Springer-Verlag Berlin Heidelberg 2006; 2008 INIST-CNRS. Peer reviewed; free to read. Source: Pascal-Francis index (record cdi_pascalfrancis_primary_19910394). |
fulltext | fulltext |
identifier | ISSN: 0302-9743 |
ispartof | Lecture notes in computer science, 2006, p.545-556 |
issn | 0302-9743; 1611-3349 |
language | eng |
recordid | cdi_pascalfrancis_primary_19910394 |
source | Springer Books |
subjects | Applied sciences; Artificial intelligence; Chaotic Time Series; Computer science, control theory, systems; Exact sciences and technology; Good Single Predictor; Information systems. Data bases; Memory organisation. Data processing; Root Mean Square Error; Single Predictor; Software; Time Series Prediction |
title | Ensembles of Nearest Neighbor Forecasts |