The principle of minimum pressure gradient: An alternative basis for physics-informed learning of incompressible fluid mechanics
Recent advances in the application of physics-informed learning in the field of fluid mechanics have been predominantly grounded in the Newtonian framework, primarily leveraging Navier–Stokes equations or one of their various derivatives to train a neural network. Here, we propose an alternative approach based on variational methods. The proposed approach uses the principle of minimum pressure gradient combined with the continuity constraint to train a neural network and predict the flow field in incompressible fluids. We describe the underlying principles of the proposed approach, then use a demonstrative example to illustrate its implementation, and show that it reduces the computational time per training epoch when compared to the conventional approach.
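To make the idea concrete, below is a minimal sketch, not the authors' implementation, of how a pressure-free training loss based on the principle of minimum pressure gradient (PMPG) might look for 2D incompressible flow. It assumes the PMPG cost can be written as the squared pressure gradient recovered from the momentum balance, with the continuity constraint enforced as a penalty; PyTorch is used for illustration, and all names and parameter values (`net`, `rho`, `mu`, `lam_div`, the collocation sampling) are illustrative assumptions rather than choices taken from the paper.

```python
# Illustrative sketch (not the paper's code) of a PMPG-based physics-informed
# loss for 2D incompressible flow. The implied pressure gradient is recovered
# from the momentum balance, rho*(u_t + u.grad(u)) - mu*Laplacian(u), so no
# pressure network is needed; continuity div(u) = 0 is a penalty term.
import torch

rho, mu, lam_div = 1.0, 0.01, 10.0  # assumed fluid and penalty parameters

net = torch.nn.Sequential(           # maps (x, y, t) -> (u, v)
    torch.nn.Linear(3, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 2),
)

def grad(f, x):
    """First derivatives of the scalar field f with respect to the inputs x."""
    return torch.autograd.grad(f, x, torch.ones_like(f), create_graph=True)[0]

def pmpg_loss(xyt):
    xyt = xyt.requires_grad_(True)
    uv = net(xyt)
    u, v = uv[:, 0:1], uv[:, 1:2]

    du, dv = grad(u, xyt), grad(v, xyt)            # columns: d/dx, d/dy, d/dt
    u_x, u_y, u_t = du[:, 0:1], du[:, 1:2], du[:, 2:3]
    v_x, v_y, v_t = dv[:, 0:1], dv[:, 1:2], dv[:, 2:3]

    # Second derivatives for the viscous (Laplacian) terms
    u_xx, u_yy = grad(u_x, xyt)[:, 0:1], grad(u_y, xyt)[:, 1:2]
    v_xx, v_yy = grad(v_x, xyt)[:, 0:1], grad(v_y, xyt)[:, 1:2]

    # Pressure gradient implied by the momentum balance (no pressure output)
    px = -(rho * (u_t + u * u_x + v * u_y) - mu * (u_xx + u_yy))
    py = -(rho * (v_t + u * v_x + v * v_y) - mu * (v_xx + v_yy))

    pmpg = (px**2 + py**2).mean()       # Monte Carlo estimate of the PMPG cost
    cont = ((u_x + v_y)**2).mean()      # continuity penalty
    return pmpg + lam_div * cont

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for epoch in range(1000):
    pts = torch.rand(512, 3)            # collocation points in a unit space-time box
    opt.zero_grad()
    loss = pmpg_loss(pts)               # boundary/initial-condition terms omitted
    loss.backward()
    opt.step()
```

Compared with a conventional physics-informed network that also outputs a pressure field and penalizes the full Navier–Stokes residual, this formulation needs no pressure output and no pressure derivatives, which is one plausible reason for the reported reduction in computational time per training epoch.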
Saved in:
Published in: | AIP Advances 2024-04, Vol.14 (4), p.045112-045112-8 |
---|---|
Main authors: | Alhussein, H.; Daqaq, M. |
Format: | Article |
Language: | eng |
Subjects: | Computing time; Fluid flow; Fluid mechanics; Incompressible flow; Incompressible fluids; Learning; Neural networks; Principles; Variational methods |
Online access: | Full text |
container_end_page | 045112-8 |
---|---|
container_issue | 4 |
container_start_page | 045112 |
container_title | AIP advances |
container_volume | 14 |
creator | Alhussein, H.; Daqaq, M. |
description | Recent advances in the application of physics-informed learning in the field of fluid mechanics have been predominantly grounded in the Newtonian framework, primarily leveraging Navier–Stokes equations or one of their various derivatives to train a neural network. Here, we propose an alternative approach based on variational methods. The proposed approach uses the principle of minimum pressure gradient combined with the continuity constraint to train a neural network and predict the flow field in incompressible fluids. We describe the underlying principles of the proposed approach, then use a demonstrative example to illustrate its implementation, and show that it reduces the computational time per training epoch when compared to the conventional approach. |
doi_str_mv | 10.1063/5.0197860 |
format | Article |
publisher | Melville: American Institute of Physics |
rights | 2024 Author(s). All article content, except where otherwise noted, is licensed under a Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/). |
coden | AAIDBI |
orcidid | 0000-0002-2004-6482; 0000-0003-2464-1059 |
identifier | ISSN: 2158-3226 |
ispartof | AIP advances, 2024-04, Vol.14 (4), p.045112-045112-8 |
issn | 2158-3226 (ISSN); 2158-3226 (eISSN) |
language | eng |
recordid | cdi_crossref_primary_10_1063_5_0197860 |
source | DOAJ Directory of Open Access Journals; EZB-FREE-00999 freely available EZB journals; Free Full-Text Journals in Chemistry |
subjects | Computing time; Fluid flow; Fluid mechanics; Incompressible flow; Incompressible fluids; Learning; Neural networks; Principles; Variational methods |
title | The principle of minimum pressure gradient: An alternative basis for physics-informed learning of incompressible fluid mechanics |