On the Convergence of Block Majorization-Minimization Algorithms on the Grassmann Manifold

Bibliographic Details
Published in: IEEE signal processing letters, 2024, Vol. 31, p. 1314-1318
Main authors: Lopez, Carlos Alejandro; Riba, Jaume
Format: Article
Language: eng
Subjects:
Online access: Order full text
description The Majorization-Minimization (MM) framework is widely used to derive efficient algorithms for specific problems that require the optimization of a cost function (which can be convex or not). It is based on a sequential optimization of a surrogate function over closed convex sets. A natural extension of this framework incorporates ideas of Block Coordinate Descent (BCD) algorithms into the MM framework, also known as block MM. The rationale behind the block extension is to partition the optimization variables into several independent blocks, to obtain a surrogate for each block, and to optimize the surrogate of each block cyclically. However, known convergence proofs of the block MM are only valid under the assumption that the constraint sets are closed and convex. Hence, the global convergence of the block MM is not ensured for non-convex sets by classical proofs, which is needed in iterative schemes that naturally emerge in a wide range of subspace-based signal processing applications. For this purpose, the aim of this letter is to review the convergence proof of the block MM and extend it for blocks constrained in the Grassmann manifold.
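The block-MM iteration described in the abstract (partition the variables into blocks, majorize each block's cost with a surrogate, and minimize the surrogates cyclically) can be sketched in a few lines. The cost function, block split, and curvature bound below are illustrative assumptions, not taken from the paper, and the sketch works over unconstrained Euclidean blocks rather than the Grassmann-constrained blocks the letter analyzes:

```python
# A minimal, self-contained sketch of a cyclic block-MM iteration.
# The two-block cost below is an illustrative assumption:
#   f(x, y) = (x - 1)^2 + (y + 2)^2 + 0.5*x*y
def f(x, y):
    return (x - 1) ** 2 + (y + 2) ** 2 + 0.5 * x * y

def grad_x(x, y):
    # partial derivative of f with respect to the first block
    return 2 * (x - 1) + 0.5 * y

def grad_y(x, y):
    # partial derivative of f with respect to the second block
    return 2 * (y + 2) + 0.5 * x

# Per-block curvature bound L: the quadratic
#   g(x | x_k) = f(x_k, y) + grad_x*(x - x_k) + (L/2)*(x - x_k)^2
# majorizes f in the x-block, and its minimizer is a gradient step of size 1/L.
L = 2.0

x, y = 0.0, 0.0
for _ in range(200):
    # Block 1: minimize the quadratic surrogate of the x-block at the current iterate.
    x = x - grad_x(x, y) / L
    # Block 2: minimize the surrogate of the y-block at the just-updated iterate.
    y = y - grad_y(x, y) / L

# For this convex cost the cyclic surrogate updates converge
# to the stationary point (1.6, -2.4).
```

Because each surrogate upper-bounds the block cost and touches it at the current iterate, every block update can only decrease f; the letter's contribution is showing when this descent argument still yields global convergence once the convex constraint sets are replaced by Grassmann-manifold constraints.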
doi 10.1109/LSP.2024.3396660
identifier ISSN: 1070-9908
eissn 1558-2361
source IEEE Electronic Library (IEL)
subjects Algorithms
Constraints
Convergence
Convexity
Cost function
geodesically convex optimization
Grassmann manifold
Independent variables
majorization-minimization
Manifolds
Manifolds (mathematics)
Minimization
Non-convex optimization
Optimization
Principal component analysis
Riemannian optimization
Signal processing
Signal processing algorithms