Benchmarking optimization methods for parameter estimation in large kinetic models

Bibliographic Details
Published in: Bioinformatics 2019-03, Vol.35 (5), p.830-838
Main authors: Villaverde, Alejandro F; Fröhlich, Fabian; Weindl, Daniel; Hasenauer, Jan; Banga, Julio R
Format: Article
Language: English
Online access: Full text
Description (abstract):
Motivation: Kinetic models contain unknown parameters that are estimated by optimizing the fit to experimental data. This task can be computationally challenging due to the presence of local optima and ill-conditioning. While a variety of optimization methods have been suggested to surmount these issues, it is difficult to choose the best one for a given problem a priori. A systematic comparison of parameter estimation methods for problems with tens to hundreds of optimization variables is currently missing, and smaller studies provided contradictory findings.
Results: We use a collection of benchmarks to evaluate the performance of two families of optimization methods: (i) multi-starts of deterministic local searches and (ii) stochastic global optimization metaheuristics; the latter may be combined with deterministic local searches, leading to hybrid methods. A fair comparison is ensured through a collaborative evaluation and a consideration of multiple performance metrics. We discuss possible evaluation criteria to assess the trade-off between computational efficiency and robustness. Our results show that, thanks to recent advances in the calculation of parametric sensitivities, a multi-start of gradient-based local methods is often a successful strategy, but a better performance can be obtained with a hybrid metaheuristic. The best performer combines a global scatter search metaheuristic with an interior point local method, provided with gradients estimated with adjoint-based sensitivities. We provide an implementation of this method to render it available to the scientific community.
Availability and implementation: The code to reproduce the results is provided as Supplementary Material and is available at Zenodo https://doi.org/10.5281/zenodo.1304034.
Supplementary information: Supplementary data are available at Bioinformatics online.
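As an illustrative aside, the first family of methods named in the abstract, multi-start gradient-based local optimization, can be sketched in a few lines. The following Python example is an assumption chosen for illustration only: it is not one of the benchmark problems and not the authors' implementation (which uses exact forward or adjoint sensitivities and a scatter-search metaheuristic). It fits a toy two-step kinetic model A -> B -> C to synthetic data by launching SciPy's L-BFGS-B local search, with finite-difference gradients, from many random starting points and keeping the best local optimum.

```python
# Minimal illustrative sketch (hypothetical model and data, not the paper's
# benchmark or implementation): multi-start gradient-based parameter estimation
# for a toy two-step kinetic model A -> B -> C observed through species B.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def rhs(t, y, k1, k2):
    """ODE right-hand side of A -k1-> B -k2-> C."""
    a, b, c = y
    return [-k1 * a, k1 * a - k2 * b, k2 * b]

def simulate(log_k, t_obs):
    """Simulate the 'measured' species B at the observation times."""
    k1, k2 = np.exp(log_k)  # optimize in log-space, a common choice for rate constants
    sol = solve_ivp(rhs, (0.0, t_obs[-1]), [1.0, 0.0, 0.0],
                    t_eval=t_obs, args=(k1, k2), rtol=1e-8, atol=1e-10)
    return sol.y[1]

# Synthetic data from assumed "true" rate constants plus measurement noise.
t_obs = np.linspace(0.5, 10.0, 20)
y_obs = simulate(np.log([0.8, 0.3]), t_obs) + 0.01 * rng.standard_normal(t_obs.size)

def objective(log_k):
    """Least-squares misfit between model prediction and data."""
    r = simulate(log_k, t_obs) - y_obs
    return 0.5 * float(r @ r)

# Multi-start: launch a bounded gradient-based local search (L-BFGS-B with
# finite-difference gradients) from random log-uniform starting points and
# keep the best local optimum found.
lo, hi = np.log(1e-3), np.log(1e2)
starts = rng.uniform(lo, hi, size=(20, 2))
results = [minimize(objective, x0, method="L-BFGS-B", bounds=[(lo, hi)] * 2)
           for x0 in starts]
best = min(results, key=lambda r: r.fun)
print("estimated k1, k2:", np.exp(best.x), "objective:", best.fun)
```

In the setting studied in the paper, the local searches would instead be supplied with exact parametric sensitivities (forward or adjoint-based), and the hybrid variant would replace the blind random restarts with a scatter-search metaheuristic that proposes the starting points for the local method.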
DOI: 10.1093/bioinformatics/bty736
PMID: 30816929
ISSN: 1367-4803; 1460-2059
EISSN: 1460-2059; 1367-4811
Source: MEDLINE; Elektronische Zeitschriftenbibliothek - Frei zugängliche E-Journals; Oxford Journals Open Access Collection; PubMed Central; Alma/SFX Local Collection
Subjects: Algorithms; Benchmarking; bioinformatics; collaborative testing; hybrids; Kinetics; Models, Biological; Original Papers; Software