Hyperbolic Neural Network Based Preselection for Expensive Multi-Objective Optimization

A series of surrogate-assisted evolutionary algorithms (SAEAs) have been proposed for expensive multi-objective optimization problems (EMOPs), building cheap surrogate models to replace the expensive real function evaluations. However, the search efficiency of these SAEAs is not yet satisfactory. More efforts are needed to further exploit useful information from the real function evaluations in order to better guide the search process. Facing this challenge, this paper proposes a Hyperbolic Neural Network (HNN) based preselection operator to accelerate the optimization process based on limited evaluated solutions. First, the preselection task is modeled as a multi-label classification problem where solutions are classified into different layers (ordinal categories) through -relaxed objective aggregation. Second, in order to resemble the hierarchical structure of candidate solutions, a hyperbolic neural network is applied to tackle the multi-label classification problem. The reason for using HNN is that hyperbolic spaces more closely resemble hierarchical structures than Euclidean spaces. Moreover, to alleviate the data deficiency issue, a data augmentation strategy is employed for training the HNN. In order to evaluate its performance, the proposed HNN-based preselection operator is embedded into two surrogate-assisted evolutionary algorithms. Experimental results on two benchmark test suites and three real-world problems with up to 11 objectives and 150 decision variables involving seven state-of-the-art algorithms demonstrate the effectiveness of the proposed method.
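
The abstract's second step rests on hyperbolic geometry: hyperbolic neural networks build on the geodesic distance of the Poincaré ball, which grows quickly between points near the boundary and thus mirrors tree-like, layered structures. The following is a minimal NumPy sketch of that standard distance for intuition only; it is not the authors' implementation, and the example coordinates are made up.

    import numpy as np

    def poincare_distance(x: np.ndarray, y: np.ndarray, eps: float = 1e-9) -> float:
        """Geodesic distance between two points inside the unit Poincare ball."""
        sq_norm_x = np.dot(x, x)
        sq_norm_y = np.dot(y, y)
        sq_diff = np.dot(x - y, x - y)
        denom = max((1.0 - sq_norm_x) * (1.0 - sq_norm_y), eps)
        return np.arccosh(1.0 + 2.0 * sq_diff / denom)

    # Toy usage: points near the origin act like roots of a hierarchy, points near
    # the boundary like deeper nodes, which is why hyperbolic space suits the
    # layered (ordinal) structure of candidate solutions.
    root = np.array([0.05, 0.0])
    leaf_a = np.array([0.80, 0.10])
    leaf_b = np.array([0.78, 0.15])
    print(poincare_distance(root, leaf_a))    # large root-to-leaf distance (~2.1)
    print(poincare_distance(leaf_a, leaf_b))  # much smaller sibling distance (~0.3)
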


Bibliographic Details
Published in: IEEE Transactions on Evolutionary Computation, 2024-06, p. 1-1
Main authors: Li, Bingdong; Yang, Yanting; Hong, Wenjing; Yang, Peng; Zhou, Aimin
Format: Article
Language: English
container_end_page 1
container_issue
container_start_page 1
container_title IEEE transactions on evolutionary computation
container_volume
creator Li, Bingdong; Yang, Yanting; Hong, Wenjing; Yang, Peng; Zhou, Aimin
description A series of surrogate-assisted evolutionary algorithms (SAEAs) have been proposed for expensive multi-objective optimization problems (EMOPs), building cheap surrogate models to replace the expensive real function evaluations. However, the search efficiency of these SAEAs is not yet satisfactory. More efforts are needed to further exploit useful information from the real function evaluations in order to better guide the search process. Facing this challenge, this paper proposes a Hyperbolic Neural Network (HNN) based preselection operator to accelerate the optimization process based on limited evaluated solutions. First, the preselection task is modeled as a multi-label classification problem where solutions are classified into different layers (ordinal categories) through -relaxed objective aggregation. Second, in order to resemble the hierarchical structure of candidate solutions, a hyperbolic neural network is applied to tackle the multi-label classification problem. The reason for using HNN is that hyperbolic spaces more closely resemble hierarchical structures than Euclidean spaces. Moreover, to alleviate the data deficiency issue, a data augmentation strategy is employed for training the HNN. In order to evaluate its performance, the proposed HNN-based preselection operator is embedded into two surrogate-assisted evolutionary algorithms. Experimental results on two benchmark test suites and three real-world problems with up to 11 objectives and 150 decision variables involving seven state-of-the-art algorithms demonstrate the effectiveness of the proposed method.
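
As a rough illustration of the first step described above (grouping evaluated solutions into ordinal layers before classification), the sketch below bins solutions by a simple aggregated objective value with a small relaxation tolerance at the layer boundaries. The aggregation rule, the tolerance delta, and the helper name assign_layers are illustrative assumptions; the paper's exact relaxed objective-aggregation scheme is not reproduced here.

    import numpy as np

    def assign_layers(objectives: np.ndarray, n_layers: int = 3, delta: float = 0.05) -> np.ndarray:
        """Map each solution (row of objective values) to an ordinal layer label.

        objectives: array of shape (n_solutions, n_objectives), minimisation assumed.
        Returns integer labels in {0, ..., n_layers - 1}; 0 is the most promising layer.
        """
        agg = objectives.sum(axis=1)           # simple aggregation (assumption)
        order = np.argsort(agg)
        labels = np.empty(len(agg), dtype=int)
        # Split the sorted solutions into roughly equal layers.
        cuts = np.array_split(order, n_layers)
        for layer, idx in enumerate(cuts):
            labels[idx] = layer
        # Relax each boundary: solutions within delta of the previous layer's worst
        # aggregated value are promoted to that better layer.
        for layer in range(1, n_layers):
            boundary = agg[cuts[layer - 1]].max()
            relaxed = (labels == layer) & (agg <= boundary + delta)
            labels[relaxed] = layer - 1
        return labels

    # Toy usage: two objectives for six candidate solutions; the third solution is
    # promoted into the best layer by the relaxation, giving labels [0 0 0 1 2 2].
    F = np.array([[0.10, 0.20], [0.30, 0.10], [0.23, 0.20],
                  [0.50, 0.50], [0.90, 0.80], [1.00, 0.90]])
    print(assign_layers(F, n_layers=3, delta=0.05))
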
doi_str_mv 10.1109/TEVC.2024.3409431
format Article
identifier ISSN: 1089-778X
ispartof IEEE transactions on evolutionary computation, 2024-06, p.1-1
issn 1089-778X
1941-0026
language eng
recordid cdi_ieee_primary_10547541
source IEEE Electronic Library (IEL)
subjects Adaptation models; Computational modeling; Evolutionary computation; expensive optimization; Hyperbolic neural network; multi-objective optimization; Neural networks; Optimization; Predictive models; preselection operator; surrogate-assisted evolutionary algorithm; Vectors
title Hyperbolic Neural Network Based Preselection for Expensive Multi-Objective Optimization