Smoothing Neural Network for Constrained Non-Lipschitz Optimization With Applications

In this paper, a smoothing neural network (SNN) is proposed for a class of constrained non-Lipschitz optimization problems, where the objective function is the sum of a nonsmooth nonconvex function and a non-Lipschitz function, and the feasible set is a closed convex subset of ℝⁿ. Using smoothing approximation techniques, the proposed neural network is modeled by a differential equation, which can be implemented easily. Under the level-bounded condition on the objective function over the feasible set, we prove the global existence and uniform boundedness of the solutions of the SNN for any initial point in the feasible set. The uniqueness of the solution of the SNN is proved under the Lipschitz property of the smoothing functions. We show that any accumulation point of the solutions of the SNN is a stationary point of the optimization problem. Numerical results on image restoration, blind source separation, variable selection, and condition-number minimization are presented to illustrate the theoretical results and show the efficiency of the SNN. Comparisons with some existing algorithms show the advantages of the SNN.

Detailed description

Saved in:
Bibliographic details
Published in: IEEE Transactions on Neural Networks and Learning Systems, 2012-03, Vol.23 (3), p.399-411
Main authors: Bian, Wei, Chen, Xiaojun
Format: Article
Language: eng
Subjects:
Online access: Order full text
container_end_page 411
container_issue 3
container_start_page 399
container_title IEEE Transactions on Neural Networks and Learning Systems
container_volume 23
creator Bian, Wei
Chen, Xiaojun
description In this paper, a smoothing neural network (SNN) is proposed for a class of constrained non-Lipschitz optimization problems, where the objective function is the sum of a nonsmooth nonconvex function and a non-Lipschitz function, and the feasible set is a closed convex subset of ℝⁿ. Using smoothing approximation techniques, the proposed neural network is modeled by a differential equation, which can be implemented easily. Under the level-bounded condition on the objective function over the feasible set, we prove the global existence and uniform boundedness of the solutions of the SNN for any initial point in the feasible set. The uniqueness of the solution of the SNN is proved under the Lipschitz property of the smoothing functions. We show that any accumulation point of the solutions of the SNN is a stationary point of the optimization problem. Numerical results on image restoration, blind source separation, variable selection, and condition-number minimization are presented to illustrate the theoretical results and show the efficiency of the SNN. Comparisons with some existing algorithms show the advantages of the SNN.
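The abstract describes the SNN as a differential equation built from a smoothing approximation of the non-Lipschitz term, evolved over a closed convex feasible set. As a rough illustration only — the concrete objective, the smoothing function (x² + μ²)^(1/4) for |x|^(1/2), the μ-schedule, and all function names below are my assumptions, not the authors' construction — a forward-Euler discretization of such a projected gradient flow might look like this:

```python
# Hedged sketch of a smoothing-based gradient flow (forward Euler).
# Illustrative problem: minimize f(x) = (x - 1)^2 + lam * |x|^(1/2)
# over the closed convex set X = [-2, 2].  The term |x|^(1/2) is
# non-Lipschitz at 0; it is smoothed as (x^2 + mu^2)^(1/4), which is
# smooth for mu > 0 and converges to |x|^(1/2) as mu -> 0.

def smoothed_obj(x, mu, lam=0.5):
    return (x - 1.0) ** 2 + lam * (x * x + mu * mu) ** 0.25

def smoothed_grad(x, mu, lam=0.5):
    # d/dx of the smoothed objective above.
    return 2.0 * (x - 1.0) + lam * 0.5 * x * (x * x + mu * mu) ** (-0.75)

def project(x, lo=-2.0, hi=2.0):
    # Euclidean projection onto the closed convex feasible set [lo, hi].
    return min(max(x, lo), hi)

def snn_flow(x0, mu0=1.0, lam=0.5, h=1e-3, steps=20000):
    # Euler discretization of the projected gradient flow, with the
    # smoothing parameter mu driven toward zero along the trajectory.
    x = project(x0)
    mu = mu0
    for k in range(steps):
        x = project(x - h * smoothed_grad(x, mu, lam))
        mu = mu0 / (1.0 + 1e-3 * k)
    return x

x_star = snn_flow(x0=1.8)
```

The trajectory stays feasible by construction (every Euler step is projected back onto X), mirroring the paper's requirement that solutions remain in the feasible set for any feasible initial point; the actual SNN and its convergence analysis are given in the article itself.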
doi_str_mv 10.1109/TNNLS.2011.2181867
format Article
publisher New York, NY: IEEE
pmid 24808547
coden ITNNAL
fulltext fulltext_linktorsrc
identifier ISSN: 2162-237X
ispartof IEEE Transactions on Neural Networks and Learning Systems, 2012-03, Vol.23 (3), p.399-411
issn 2162-237X
2162-2388
language eng
recordid cdi_crossref_primary_10_1109_TNNLS_2011_2181867
source IEEE Electronic Library (IEL)
subjects Applied sciences
Approximation methods
Artificial intelligence
Computer science; control theory; systems
Connectionism. Neural networks
Detection, estimation, filtering, equalization, prediction
Differential equations
Exact sciences and technology
Image and signal restoration
Information, signal and communications theory
Input variables
Mathematical model
Mathematical models
Models, Theoretical
Neural networks
Neural Networks (Computer)
non-Lipschitz optimization
Optimization
Pattern recognition. Digital image processing. Computational geometry
Signal and communications theory
Signal, noise
Smoothing methods
smoothing neural network
stationary point
Studies
Telecommunications and information theory
variable selection
title Smoothing Neural Network for Constrained Non-Lipschitz Optimization With Applications
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-26T18%3A09%3A49IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_RIE&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Smoothing%20Neural%20Network%20for%20Constrained%20Non-Lipschitz%20Optimization%20With%20Applications&rft.jtitle=IEEE%20transaction%20on%20neural%20networks%20and%20learning%20systems&rft.au=Bian,%20Wei&rft.date=2012-03-01&rft.volume=23&rft.issue=3&rft.spage=399&rft.epage=411&rft.pages=399-411&rft.issn=2162-237X&rft.eissn=2162-2388&rft.coden=ITNNAL&rft_id=info:doi/10.1109/TNNLS.2011.2181867&rft_dat=%3Cproquest_RIE%3E2597240521%3C/proquest_RIE%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=923926845&rft_id=info:pmid/24808547&rft_ieee_id=6123210&rfr_iscdi=true