A Fortran-Keras Deep Learning Bridge for Scientific Computing

Implementing artificial neural networks is commonly achieved via high-level programming languages such as Python and easy-to-use deep learning libraries such as Keras. These software libraries come preloaded with a variety of network architectures, provide autodifferentiation, and support GPUs for fast and efficient computation. As a result, a deep learning practitioner will favor training a neural network model in Python, where these tools are readily available. However, many large-scale scientific computation projects are written in Fortran, making it difficult to integrate with modern deep learning methods. To alleviate this problem, we introduce a software library, the Fortran-Keras Bridge (FKB). This two-way bridge connects environments where deep learning resources are plentiful with those where they are scarce. The paper describes several unique features offered by FKB, such as customizable layers, loss functions, and network ensembles. The paper concludes with a case study that applies FKB to address open questions about the robustness of an experimental approach to global climate simulation, in which subgrid physics are outsourced to deep neural network emulators. In this context, FKB enables a hyperparameter search of more than one hundred candidate models of subgrid cloud and radiation physics, initially implemented in Keras, to be transferred and used in Fortran. Such a process allows the models' emergent behavior to be assessed, i.e., how fit imperfections behave when coupled to explicit planetary-scale fluid dynamics. The results reveal a previously unrecognized strong relationship between offline validation error and online performance, in which the choice of the optimizer proves unexpectedly critical. This in turn reveals many new neural network architectures that produce considerable improvements in climate model stability, including some with reduced error, for an especially challenging training dataset.
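The workflow described above begins on the Python side. The following is a minimal, hypothetical sketch of that step, assuming only standard Keras APIs: a small dense regression network stands in for a subgrid-physics emulator and is saved to HDF5, the serialized form FKB's tooling reads for use from Fortran. The layer sizes, data, and file name are illustrative placeholders, not the paper's climate configuration.

```python
# Minimal sketch of the Keras side of the FKB workflow (placeholder setup,
# not the paper's climate-emulation model).
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Toy stand-in for a subgrid-physics regression problem: 64 inputs, 32 outputs.
x = np.random.rand(1024, 64).astype("float32")
y = np.random.rand(1024, 32).astype("float32")

model = keras.Sequential([
    layers.Dense(128, activation="relu", input_shape=(64,)),
    layers.Dense(128, activation="relu"),
    layers.Dense(32),  # linear output for regression
])

# The abstract notes the optimizer choice proved unexpectedly critical;
# here it is just one more hyperparameter to vary.
model.compile(optimizer=keras.optimizers.Adam(learning_rate=1e-3), loss="mse")
model.fit(x, y, epochs=5, batch_size=64, verbose=0)

# FKB works from the model's saved HDF5 representation; see the FKB
# repository for the exact conversion step to its Fortran-readable format.
model.save("subgrid_emulator.h5")
```

On the Fortran side, FKB then loads the converted model so the host simulation can evaluate it at run time; the relevant module and routine names are documented in the FKB repository rather than reproduced here.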
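The case study's search over more than one hundred candidate models follows the same export pattern at scale. The loop below is an illustrative sketch, not the paper's actual search code: the widths and optimizers are assumed values, and each trained candidate is saved so it can later be evaluated online inside the Fortran host.

```python
# Hypothetical candidate sweep in the spirit of the paper's hyperparameter
# search; the search space and file names are invented for illustration.
import itertools
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

x = np.random.rand(1024, 64).astype("float32")
y = np.random.rand(1024, 32).astype("float32")

widths = [64, 128, 256]
optimizers = ["adam", "sgd", "rmsprop"]  # the choice that mattered most offline

for i, (width, opt) in enumerate(itertools.product(widths, optimizers)):
    model = keras.Sequential([
        layers.Dense(width, activation="relu", input_shape=(64,)),
        layers.Dense(width, activation="relu"),
        layers.Dense(32),
    ])
    model.compile(optimizer=opt, loss="mse")
    hist = model.fit(x, y, validation_split=0.2,
                     epochs=5, batch_size=64, verbose=0)
    val_mse = hist.history["val_loss"][-1]  # offline validation error
    # Export each candidate for transfer to the Fortran side via FKB.
    model.save(f"candidate_{i:03d}_w{width}_{opt}.h5")
    print(f"candidate {i:03d}: width={width} opt={opt} val_mse={val_mse:.4f}")
```

Ranking candidates by this offline validation error and then checking their online (coupled) behavior is the comparison the abstract reports as unexpectedly well correlated.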

Bibliographic Details
Published in: Scientific Programming, 2020, Vol. 2020 (2020), p. 1-13
Main authors: Curcic, Milan; Linstead, Erik; Best, Natalie; Pritchard, Mike; Ott, Jordan; Baldi, Pierre
Format: Article
Language: English
Online access: Full text
container_end_page 13
container_issue 2020
container_start_page 1
container_title Scientific programming
container_volume 2020
creator Curcic, Milan
Linstead, Erik
Best, Natalie
Pritchard, Mike
Ott, Jordan
Baldi, Pierre
description Implementing artificial neural networks is commonly achieved via high-level programming languages such as Python and easy-to-use deep learning libraries such as Keras. These software libraries come preloaded with a variety of network architectures, provide autodifferentiation, and support GPUs for fast and efficient computation. As a result, a deep learning practitioner will favor training a neural network model in Python, where these tools are readily available. However, many large-scale scientific computation projects are written in Fortran, making it difficult to integrate with modern deep learning methods. To alleviate this problem, we introduce a software library, the Fortran-Keras Bridge (FKB). This two-way bridge connects environments where deep learning resources are plentiful with those where they are scarce. The paper describes several unique features offered by FKB, such as customizable layers, loss functions, and network ensembles. The paper concludes with a case study that applies FKB to address open questions about the robustness of an experimental approach to global climate simulation, in which subgrid physics are outsourced to deep neural network emulators. In this context, FKB enables a hyperparameter search of more than one hundred candidate models of subgrid cloud and radiation physics, initially implemented in Keras, to be transferred and used in Fortran. Such a process allows the models' emergent behavior to be assessed, i.e., how fit imperfections behave when coupled to explicit planetary-scale fluid dynamics. The results reveal a previously unrecognized strong relationship between offline validation error and online performance, in which the choice of the optimizer proves unexpectedly critical. This in turn reveals many new neural network architectures that produce considerable improvements in climate model stability, including some with reduced error, for an especially challenging training dataset.
doi_str_mv 10.1155/2020/8888811
format Article
fulltext fulltext
identifier ISSN: 1058-9244
ispartof Scientific programming, 2020, Vol.2020 (2020), p.1-13
issn 1058-9244
1875-919X
language eng
recordid cdi_proquest_journals_2440434382
source Wiley Online Library Open Access; EZB-FREE-00999 freely available EZB journals; Alma/SFX Local Collection
subjects Application programming interface
Artificial neural networks
Chemistry
Climate models
Computational fluid dynamics
Computer architecture
Computer simulation
Deep learning
Dosimetry
Earthquakes
Emulators
Error reduction
Fluid dynamics
FORTRAN
High level languages
Laboratories
Libraries
Machine learning
Mechanics
Molecular physics
Neural networks
Partial differential equations
Physics
Popularity
Programming languages
Simulation
Software
Training
Weather forecasting
title A Fortran-Keras Deep Learning Bridge for Scientific Computing
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-06T11%3A26%3A45IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=A%20Fortran-Keras%20Deep%20Learning%20Bridge%20for%20Scientific%20Computing&rft.jtitle=Scientific%20programming&rft.au=Curcic,%20Milan&rft.date=2020&rft.volume=2020&rft.issue=2020&rft.spage=1&rft.epage=13&rft.pages=1-13&rft.issn=1058-9244&rft.eissn=1875-919X&rft_id=info:doi/10.1155/2020/8888811&rft_dat=%3Cproquest_cross%3E2440434382%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2440434382&rft_id=info:pmid/&rfr_iscdi=true