Simultaneous approximation of a smooth function and its derivatives by deep neural networks with piecewise-polynomial activations

This paper investigates the approximation properties of deep neural networks with piecewise-polynomial activation functions. We derive the required depth, width, and sparsity of a deep neural network to approximate any Hölder smooth function up to a given approximation error in Hölder norms in such a way that all weights of this neural network are bounded by 1. The latter feature is essential to control generalization errors in many statistical and machine learning applications.

Highlights:
• Rates and complexity for smooth function approximation in Hölder norms by ReQU neural networks.
• Explicit and uniform bounds for the weights of the approximating neural network.
• Exponential convergence rates for analytic functions.
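The abstract and highlights refer to ReQU and, more generally, ReLU^k activations. As background, ReQU is the rectified quadratic unit, i.e. ReLU^k with k = 2, which is continuously differentiable and therefore natural for approximation in Hölder norms that also control derivatives. The following is only a minimal illustrative sketch of such piecewise-polynomial activations and of the weight-boundedness constraint mentioned in the abstract; the function names and the toy network are assumptions chosen for demonstration, not the construction analyzed in the paper.

```python
import numpy as np

def relu_k(x, k=1):
    """Piecewise-polynomial activation ReLU^k: max(0, x) raised to the power k."""
    return np.maximum(0.0, x) ** k

def requ(x):
    """ReQU (rectified quadratic unit): the k = 2 case, with a continuous
    first derivative at the origin."""
    return relu_k(x, k=2)

# Toy forward pass of a shallow ReQU network; weights are clipped to [-1, 1]
# to mimic the boundedness-by-1 constraint discussed in the abstract.
rng = np.random.default_rng(0)
W1 = np.clip(rng.normal(size=(16, 1)), -1.0, 1.0)
b1 = np.clip(rng.normal(size=(16,)), -1.0, 1.0)
w2 = np.clip(rng.normal(size=(16,)), -1.0, 1.0)

x = np.linspace(-1.0, 1.0, 5).reshape(-1, 1)   # inputs in [-1, 1]
hidden = requ(x @ W1.T + b1)                   # hidden layer with ReQU activation
output = hidden @ w2                           # linear readout
print(output.shape)                            # (5,)
```

Clipping the weights to [-1, 1] mirrors the requirement that all network weights be bounded by 1, which the abstract identifies as the key feature for controlling generalization errors in downstream statistical applications.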

Bibliographic details
Published in: Neural Networks, 2023-04, Vol. 161, pp. 242-253
Main authors: Belomestny, Denis; Naumov, Alexey; Puchkin, Nikita; Samsonov, Sergey
Format: Article
Language: English
Subjects: Algorithms; Approximation complexity; Deep neural networks; Hölder class; Machine Learning; Neural Networks, Computer; ReLU^k activations; ReQU activations
Online access: Full text
DOI: 10.1016/j.neunet.2023.01.035
ISSN: 0893-6080
EISSN: 1879-2782
PMID: 36774863
Publisher: Elsevier Ltd
Source: MEDLINE; Elsevier ScienceDirect Journals Complete