Design of General Projection Neural Networks for Solving Monotone Linear Variational Inequalities and Linear and Quadratic Optimization Problems

Most existing neural networks for solving linear variational inequalities (LVIs) with the mapping Mx + p require positive definiteness (or positive semidefiniteness) of M. In this correspondence, it is shown that this condition is sufficient but not necessary for an LVI to be strictly monotone (or monotone) on its constrained set when equality constraints are present. Monotone LVIs with equality constraints are then reformulated into LVIs with inequality constraints only, which can be solved by some existing neural networks. General projection neural networks are designed in this correspondence for solving the transformed LVIs. Compared with existing neural networks, the designed networks have lower model complexity. Moreover, they are guaranteed to be globally convergent to solutions of the LVI provided that the linear mapping Mx + p is monotone on the constrained set. Because quadratic and linear programming problems are special cases of the LVI in terms of their solutions, the designed networks can solve them efficiently as well. In addition, in a specific case the designed neural network reduces to the primal-dual network for solving quadratic or linear programming problems. The effectiveness of the neural networks is illustrated by several numerical examples.
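
As a rough illustration of the idea summarized in the abstract (not the exact networks designed in the paper), the sketch below simulates one classical projection dynamics, dx/dt = lambda * (P_Omega(x - alpha*(Mx + p)) - x), on a small box-constrained quadratic program; an equilibrium of these dynamics solves the corresponding LVI. The problem data, function names, and parameter values are illustrative assumptions, and the paper's general projection neural networks, which target the reformulated LVIs with equality constraints removed, differ in their exact form.

```python
import numpy as np


def project_box(x, lo, hi):
    """Project x onto the box [lo, hi] componentwise."""
    return np.clip(x, lo, hi)


def projection_nn(M, p, lo, hi, x0, alpha=0.2, lam=1.0, dt=0.05, steps=20000):
    """Forward-Euler simulation of dx/dt = lam*(P_Omega(x - alpha*(M x + p)) - x)."""
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(steps):
        dx = lam * (project_box(x - alpha * (M @ x + p), lo, hi) - x)
        x += dt * dx
        if np.linalg.norm(dx) < 1e-10:  # equilibrium of the dynamics = LVI solution
            break
    return x


if __name__ == "__main__":
    # Toy QP: minimize 0.5*x'Qx + c'x over the box [0, 2]^2; its optimality
    # condition is an LVI with M = Q (positive definite here) and p = c.
    Q = np.array([[2.0, 0.5], [0.5, 1.0]])
    c = np.array([-1.0, -1.0])
    lo, hi = np.zeros(2), 2.0 * np.ones(2)
    x = projection_nn(Q, c, lo, hi, x0=np.array([2.0, 0.0]))
    residual = np.linalg.norm(x - project_box(x - (Q @ x + c), lo, hi))
    print("approximate solution:", x)  # expected near [0.286, 0.857]
    print("LVI residual:", residual)   # should be close to 0
```

Forward Euler is used here only to simulate the continuous-time dynamics; any standard ODE integrator would serve the same purpose.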

Bibliographic Details
Published in: IEEE Transactions on Cybernetics, 2007-10, Vol. 37 (5), pp. 1414-1421
Authors: Hu, Xiaolin; Wang, Jun
Format: Article
Language: English
DOI: 10.1109/TSMCB.2007.903706
PMID: 17926722
CODEN: ITSCFI
Publisher: IEEE (United States)
ISSN: 1083-4419; 2168-2267
EISSN: 1941-0492; 2168-2275
Source: IEEE Electronic Library (IEL)

Subjects:
Algorithms
Artificial Intelligence
Automation
Computer Simulation
Constraint optimization
Constraints
Convergence
Councils
Cybernetics
Design optimization
Global convergence
Inequalities
Linear Models
Linear programming
linear variational inequality (LVI)
Mapping
Mathematical models
Neural networks
Neural Networks (Computer)
Pattern Recognition, Automated - methods
Projection
Quadratic programming
recurrent neural network
Recurrent neural networks
Regression analysis
Studies