Theory of deep convolutional neural networks: Downsampling

Establishing a solid theoretical foundation for structured deep neural networks is greatly desired due to the successful applications of deep learning in various practical domains. This paper aims at an approximation theory of deep convolutional neural networks whose structures are induced by convolutions. To overcome the difficulty in theoretical analysis of the networks with linearly increasing widths arising from convolutions, we introduce a downsampling operator to reduce the widths. We prove that the downsampled deep convolutional neural networks can be used to approximate ridge functions nicely, which hints at some advantages of these structured networks in terms of approximation or modeling. We also prove that the output of any multi-layer fully-connected neural network can be realized by that of a downsampled deep convolutional neural network with free parameters of the same order, which shows that in general, the approximation ability of deep convolutional neural networks is at least as good as that of fully-connected networks. Finally, a theorem for approximating functions on Riemannian manifolds is presented, which demonstrates that deep convolutional neural networks can be used to learn manifold features of data.
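
To make the abstract's objects concrete, the following is a minimal NumPy sketch of one downsampled convolutional layer. It is an illustration under stated assumptions, not the paper's exact construction: the filter indexing (a full 1-D convolution), the ReLU-with-bias form, and the function names (conv_matrix, downsample, layer) are all illustrative. What it demonstrates is the width bookkeeping the abstract alludes to: a full 1-D convolution with a filter of length s maps R^d to R^(d+s-1), so widths grow linearly with depth unless a downsampling operator thins them out. (A ridge function, the approximation target mentioned in the abstract, is a function of the form f(x) = g(xi . x) for a fixed direction xi and a univariate function g.)

import numpy as np

def conv_matrix(w, d):
    # Toeplitz matrix of a full 1-D convolution with filter w of length s:
    # it maps R^d to R^(d+s-1), so each convolutional layer widens the vector.
    s = len(w)
    T = np.zeros((d + s - 1, d))
    for i in range(d + s - 1):
        for j in range(d):
            if 0 <= i - j < s:
                T[i, j] = w[i - j]
    return T

def downsample(x, m):
    # Downsampling operator: keep every m-th entry, shrinking the width.
    return x[m - 1::m]

def layer(x, w, b, m):
    # One downsampled convolutional layer: convolution, bias shift,
    # ReLU activation, then downsampling.
    h = np.maximum(conv_matrix(w, len(x)) @ x - b, 0.0)
    return downsample(h, m)

# Width bookkeeping: d = 8 and s = 3 give a raw output width of
# d + s - 1 = 10; downsampling with m = 2 cuts it back to 5.
rng = np.random.default_rng(0)
x = rng.standard_normal(8)
w = rng.standard_normal(3)
b = np.zeros(10)
print(layer(x, w, b, m=2).shape)  # -> (5,)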

Bibliographic details
Published in: Neural networks, 2020-04, Vol. 124, p. 319-327
Author: Zhou, Ding-Xuan
Format: Article
Language: English
Subjects: Approximation theory; Convolutional neural networks; Deep learning; Downsampling; Filter masks
Online access: Full text
DOI: 10.1016/j.neunet.2020.01.018
ISSN: 0893-6080
EISSN: 1879-2782
PMID: 32036229
Publisher: Elsevier Ltd