Symplectic learning for Hamiltonian neural networks
Machine learning methods are widely used in the natural sciences to model and predict physical systems from observation data. Yet, they are often used as poorly understood “black boxes,” disregarding existing mathematical structure and invariants of the problem. Recently, the proposal of Hamiltonian Neural Networks (HNNs) took a first step towards a unified “gray box” approach, using physical insight to improve performance for Hamiltonian systems. In this paper, we explore a significantly improved training method for HNNs, exploiting the symplectic structure of Hamiltonian systems with a different loss function. This frees the loss from an artificial lower bound. We mathematically guarantee the existence of an exact Hamiltonian function which the HNN can learn. This allows us to prove and numerically analyze the errors made by HNNs which, in turn, renders them fully explainable. Finally, we present a novel post-training correction to obtain the true Hamiltonian only from discretized observation data, up to an arbitrary order.
Published in: | Journal of computational physics 2023-12, Vol.494, p.112495, Article 112495 |
---|---|
Main authors: | David, Marco; Méhats, Florian |
Format: | Article |
Language: | English |
Subjects: | Geometric numerical integration; Hamiltonian neural network; Hamiltonian system; Mathematics; Ordinary differential equation; Symplectic numerical method |
Online access: | Full text |
creator | David, Marco; Méhats, Florian |
description | Machine learning methods are widely used in the natural sciences to model and predict physical systems from observation data. Yet, they are often used as poorly understood “black boxes,” disregarding existing mathematical structure and invariants of the problem. Recently, the proposal of Hamiltonian Neural Networks (HNNs) took a first step towards a unified “gray box” approach, using physical insight to improve performance for Hamiltonian systems. In this paper, we explore a significantly improved training method for HNNs, exploiting the symplectic structure of Hamiltonian systems with a different loss function. This frees the loss from an artificial lower bound. We mathematically guarantee the existence of an exact Hamiltonian function which the HNN can learn. This allows us to prove and numerically analyze the errors made by HNNs which, in turn, renders them fully explainable. Finally, we present a novel post-training correction to obtain the true Hamiltonian only from discretized observation data, up to an arbitrary order.
• The loss of Hamiltonian Neural Networks suffers from an artificial lower bound. • Using a symplectic integrator during training significantly improves performance. • Rigorous mathematical analysis yields an exact handle on the discretization error. • The learnt Hamiltonian can be corrected post-training, to any arbitrary order. |
doi_str_mv | 10.1016/j.jcp.2023.112495 |
format | Article |
identifier | ISSN: 0021-9991 |
issn | 0021-9991 (print); 1090-2716 (electronic) |
language | eng |
source | Elsevier ScienceDirect Journals |
subjects | Geometric numerical integration; Hamiltonian neural network; Hamiltonian system; Mathematics; Ordinary differential equation; Symplectic numerical method |
title | Symplectic learning for Hamiltonian neural networks |
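The training idea summarised in the abstract — replacing the usual HNN derivative-matching loss with a one-step prediction loss taken *through* a symplectic integrator — can be illustrated with a minimal, self-contained sketch. This is a hedged illustration, not the authors' implementation: the quadratic model Hamiltonian H(q, p) = a·p²/2 + b·q²/2, the learning rate, and the finite-difference gradients are assumptions made to keep the example dependency-free (a real HNN would use a neural network and automatic differentiation).

```python
# Hedged sketch of symplectic training for a Hamiltonian model (not the
# authors' code). Instead of regressing on estimated time derivatives, the
# loss compares one step of the *symplectic Euler* integrator under the
# learned Hamiltonian against the observed next state.

def symplectic_euler_step(a, b, q, p, h):
    """One symplectic Euler step for H(q, p) = a*p**2/2 + b*q**2/2."""
    p_new = p - h * b * q        # p update uses -dH/dq at (q_n, .)
    q_new = q + h * a * p_new    # q update uses  dH/dp at (., p_{n+1})
    return q_new, p_new

def loss(params, data, h):
    """Mean squared one-step prediction error over (state, next state) pairs."""
    a, b = params
    total = 0.0
    for (q, p), (q1, p1) in data:
        q_pred, p_pred = symplectic_euler_step(a, b, q, p, h)
        total += (q_pred - q1) ** 2 + (p_pred - p1) ** 2
    return total / len(data)

# Synthetic observations generated by the same symplectic map with the true
# parameters a = b = 1 (harmonic oscillator), so the loss has an exact
# minimiser -- the situation the paper's existence guarantee concerns.
h = 0.1
data = []
q, p = 1.0, 0.0
for _ in range(50):
    q1, p1 = symplectic_euler_step(1.0, 1.0, q, p, h)
    data.append(((q, p), (q1, p1)))
    q, p = q1, p1

# Plain gradient descent with central finite-difference gradients.
params = [0.5, 1.5]          # deliberately wrong initial guess
eps, lr = 1e-6, 20.0
for _ in range(500):
    grads = []
    for i in range(len(params)):
        up, dn = params[:], params[:]
        up[i] += eps
        dn[i] -= eps
        grads.append((loss(up, data, h) - loss(dn, data, h)) / (2 * eps))
    params = [x - lr * g for x, g in zip(params, grads)]

print(params)  # both entries should approach the true values a = b = 1.0
```

When the observations instead come from the true continuous flow, the minimiser of such a loss is a modified Hamiltonian; the post-training correction highlighted in the record is what removes this discretization bias, order by order.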