Local function approximation in evolutionary algorithms for the optimization of costly functions

We develop an approach for the optimization of continuous costly functions that uses a space-filling experimental design and local function approximation to reduce the number of function evaluations in an evolutionary algorithm. Our approach is to estimate the objective function value of an offspring by fitting a function approximation model over the k nearest previously evaluated points, where k = (d+1)(d+2)/2 and d is the dimension of the problem. The estimated function values are used to screen offspring to identify the most promising ones for function evaluation.
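The space-filling design used for the initial evaluations is a symmetric Latin hypercube design (SLHD). As an illustration only (a minimal sketch of one standard SLHD construction; the function name and details are not the authors' code), such a design can be generated as follows:

```python
import numpy as np

def symmetric_latin_hypercube(n, d, rng=None):
    """Generate an n-point symmetric Latin hypercube design in [0, 1]^d.

    Levels 1..n come in mirrored pairs (l, n + 1 - l): whenever a point uses
    level l in some coordinate, its partner point uses level n + 1 - l, so the
    design is symmetric about the center of the cube while each column is
    still a permutation of 1..n (i.e., a valid Latin hypercube).
    """
    rng = np.random.default_rng(rng)
    half = n // 2
    levels = np.zeros((n, d), dtype=int)
    for j in range(d):
        pair = rng.permutation(half) + 1          # lower member of each pair, shuffled
        flip = rng.random(half) < 0.5             # which member goes in the top half
        first = np.where(flip, pair, n + 1 - pair)
        levels[:half, j] = first
        levels[n - half:, j] = (n + 1 - first)[::-1]   # mirrored partners
    if n % 2 == 1:
        levels[half, :] = (n + 1) // 2            # odd n: one point at the exact center
    # Map levels 1..n to bin midpoints in [0, 1]
    return (levels - 0.5) / n
```

Note the symmetry property: for every design point x there is a partner point 1 - x, which is what distinguishes an SLHD from an ordinary random Latin hypercube.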

Bibliographic details
Published in: IEEE Transactions on Evolutionary Computation, 2004-10, Vol. 8 (5), p. 490-505
Authors: Regis, R.G.; Shoemaker, C.A.
Format: Article
Language: English
description We develop an approach for the optimization of continuous costly functions that uses a space-filling experimental design and local function approximation to reduce the number of function evaluations in an evolutionary algorithm. Our approach is to estimate the objective function value of an offspring by fitting a function approximation model over the k nearest previously evaluated points, where k=(d+1)(d+2)/2 and d is the dimension of the problem. The estimated function values are used to screen offspring to identify the most promising ones for function evaluation. To fit function approximation models, a symmetric Latin hypercube design (SLHD) is used to determine initial points for function evaluation. We compared the performance of an evolution strategy (ES) with local quadratic approximation, an ES with local cubic radial basis function (RBF) interpolation, an ES whose initial parent population comes from an SLHD, and a conventional ES. These algorithms were applied to a twelve-dimensional (12-D) groundwater bioremediation problem involving a complex nonlinear finite-element simulation model. The performances of these algorithms were also compared on the Dixon-Szego test functions and on the ten-dimensional (10-D) Rastrigin and Ackley test functions. All comparisons involve analysis of variance (ANOVA) and the computation of simultaneous confidence intervals. The results indicate that ES algorithms with local approximation were significantly better than conventional ES algorithms and ES algorithms initialized by SLHDs on all Dixon-Szego test functions except for Goldstein-Price. However, for the more difficult 10-D and 12-D functions, only the cubic RBF approach was successful in improving the performance of an ES. 
Moreover, the results also suggest that the cubic RBF approach is superior to the quadratic approximation approach on all test functions, and the difference in performance is statistically significant for all test functions with dimension d ≥ 4.
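The screening step described above can be sketched in a few lines. This is a minimal illustration only: the helper names, the plain-NumPy cubic RBF solve (cubic kernel ||x - x_i||³ with a linear polynomial tail), and the minimization convention are assumptions, not the authors' implementation.

```python
import numpy as np

def nearest_points(X, y, x_new, d):
    """Select the k = (d+1)(d+2)/2 previously evaluated points closest to
    the candidate x_new (fewer if that many are not yet available)."""
    k = (d + 1) * (d + 2) // 2
    dist = np.linalg.norm(X - x_new, axis=1)
    idx = np.argsort(dist)[:k]
    return X[idx], y[idx]

def fit_cubic_rbf(X, y):
    """Fit a cubic RBF interpolant s(x) = sum_i w_i ||x - x_i||^3 + c0 + c.x
    by solving the standard augmented linear system."""
    n, d = X.shape
    Phi = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2) ** 3
    P = np.hstack([np.ones((n, 1)), X])                     # linear tail [1, x]
    A = np.block([[Phi, P], [P.T, np.zeros((d + 1, d + 1))]])
    coef = np.linalg.solve(A, np.concatenate([y, np.zeros(d + 1)]))
    w, c = coef[:n], coef[n:]
    return lambda x: (np.linalg.norm(X - x, axis=1) ** 3) @ w + c[0] + x @ c[1:]

def screen_offspring(X_eval, y_eval, offspring, n_keep):
    """Rank offspring by the local surrogate's prediction and keep the
    n_keep most promising for true (costly) evaluation (minimization)."""
    d = X_eval.shape[1]
    preds = []
    for x in offspring:
        Xk, yk = nearest_points(X_eval, y_eval, x, d)
        s = fit_cubic_rbf(Xk, yk)
        preds.append(s(x))
    order = np.argsort(preds)            # lowest predicted value first
    return offspring[order[:n_keep]]
```

Inside an ES generation, `X_eval`/`y_eval` would hold all points evaluated so far (seeded by the SLHD), and only the offspring returned by `screen_offspring` would be sent to the expensive simulation.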
doi_str_mv 10.1109/TEVC.2004.835247
format Article
fulltext fulltext_linktorsrc
identifier ISSN: 1089-778X
ispartof IEEE transactions on evolutionary computation, 2004-10, Vol.8 (5), p.490-505
issn 1089-778X (print); 1941-0026 (electronic)
language eng
recordid cdi_crossref_primary_10_1109_TEVC_2004_835247
source IEEE Electronic Library (IEL)
subjects Algorithms
Analysis of variance
Applied sciences
Approximation
Approximation algorithms
Artificial intelligence
Computer science; control theory; systems
Design engineering
Design for experiments
Design optimization
Evolutionary algorithms
Evolutionary computation
Exact sciences and technology
Finite element methods
Function approximation
Hypercubes
Interpolation
Mathematical analysis
Mathematical models
Mathematical programming
Operational research and scientific management
Operational research. Management science
Optimization
Problem solving, game playing
Testing
title Local function approximation in evolutionary algorithms for the optimization of costly functions
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-08T14%3A18%3A57IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_RIE&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Local%20function%20approximation%20in%20evolutionary%20algorithms%20for%20the%20optimization%20of%20costly%20functions&rft.jtitle=IEEE%20transactions%20on%20evolutionary%20computation&rft.au=Regis,%20R.G.&rft.date=2004-10-01&rft.volume=8&rft.issue=5&rft.spage=490&rft.epage=505&rft.pages=490-505&rft.issn=1089-778X&rft.eissn=1941-0026&rft.coden=ITEVF5&rft_id=info:doi/10.1109/TEVC.2004.835247&rft_dat=%3Cproquest_RIE%3E28314733%3C/proquest_RIE%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=889842686&rft_id=info:pmid/&rft_ieee_id=1347162&rfr_iscdi=true