Bagged Polynomial Regression and Neural Networks

Series and polynomial regression are able to approximate the same function classes as neural networks. However, these methods are rarely used in practice, although they offer more interpretability than neural networks. In this paper, we show that a potential reason for this is the slow convergence rate of polynomial regression estimators and propose the use of \textit{bagged} polynomial regression (BPR) as an attractive alternative to neural networks. Theoretically, we derive new finite sample and asymptotic $L^2$ convergence rates for series estimators. We show that the rates can be improved in smooth settings by splitting the feature space and generating polynomial features separately for each partition. Empirically, we show that our proposed estimator, the BPR, can perform as well as more complex models with more parameters. Our estimator also performs close to state-of-the-art prediction methods in the benchmark MNIST handwritten digit dataset. We demonstrate that BPR performs as well as neural networks in crop classification using satellite data, a setting where prediction accuracy is critical and interpretability is often required for addressing research questions.
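As a rough illustration of the bagging idea named in the title (this is not the authors' implementation; the function name, defaults, and one-dimensional setting are illustrative assumptions), a minimal sketch fits many polynomial regressions on bootstrap resamples and averages their predictions:

```python
import numpy as np

def bagged_poly_fit(x, y, degree=3, n_bags=25, seed=0):
    """Sketch of bagged polynomial regression in 1-D: fit a degree-`degree`
    polynomial on each of `n_bags` bootstrap resamples of (x, y) and return
    a predictor that averages the fitted polynomials."""
    rng = np.random.default_rng(seed)
    n = len(x)
    coefs = []
    for _ in range(n_bags):
        idx = rng.integers(0, n, size=n)          # bootstrap resample with replacement
        coefs.append(np.polyfit(x[idx], y[idx], degree))
    def predict(x_new):
        # average the predictions of all bagged polynomial fits
        return np.mean([np.polyval(c, x_new) for c in coefs], axis=0)
    return predict

# Toy usage: recover a smooth function from noisy samples.
rng = np.random.default_rng(1)
x = np.linspace(-1, 1, 200)
y = np.sin(3 * x) + 0.1 * rng.standard_normal(200)
predict = bagged_poly_fit(x, y, degree=5)
```

Averaging over bootstrap resamples reduces the variance of the individual polynomial fits; the paper's partitioned variant (generating polynomial features separately on each region of the feature space) is not shown here.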


Bibliographic Details
Main Authors: Klosin, Sylvia, Vives-i-Bastida, Jaume
Format: Article
Language: English
Online Access: Order full text
description Series and polynomial regression are able to approximate the same function classes as neural networks. However, these methods are rarely used in practice, although they offer more interpretability than neural networks. In this paper, we show that a potential reason for this is the slow convergence rate of polynomial regression estimators and propose the use of \textit{bagged} polynomial regression (BPR) as an attractive alternative to neural networks. Theoretically, we derive new finite sample and asymptotic $L^2$ convergence rates for series estimators. We show that the rates can be improved in smooth settings by splitting the feature space and generating polynomial features separately for each partition. Empirically, we show that our proposed estimator, the BPR, can perform as well as more complex models with more parameters. Our estimator also performs close to state-of-the-art prediction methods in the benchmark MNIST handwritten digit dataset. We demonstrate that BPR performs as well as neural networks in crop classification using satellite data, a setting where prediction accuracy is critical and interpretability is often required for addressing research questions.
date 2022-05-17
rights http://creativecommons.org/licenses/by/4.0
identifier DOI: 10.48550/arxiv.2205.08609
source arXiv.org
subjects Computer Science - Learning
Statistics - Machine Learning
Statistics - Methodology