Continuous self‐adversarial training of recurrent neural network–based constitutive description

Data‐driven methods yield advantages in computational homogenization approaches due to the ability to capture complex material behaviour without the necessity to assume specific constitutive models. Neural network–based constitutive descriptions are one of the most widely used data‐driven approaches in the context of computational mechanics. The accuracy of this method strongly depends on the available data. Additionally, when considering inelastic materials, whose constitutive responses depend on the loading history, the accuracy and robustness of the approximation are influenced by the training algorithm. The applied recurrent neural networks exhibit reduced robustness in the presence of errors in the input. When capturing the history dependency using previously predicted material responses, prediction errors accumulate over several time steps. An approach for achieving enhanced robustness of the predictions is based on extending the initial training dataset by iteratively generating adversarial examples, subjected to perturbations, based on the current prediction errors. In this contribution, a continuous self‐adversarial training approach yielding robust recurrent neural network constitutive descriptions for inelastic materials is presented. Compared to the iterative method it is based on, it exhibits significantly improved training efficiency. In order to demonstrate the capabilities of the proposed methods, numerical examples with datasets obtained by numerical material tests on representative volume elements are carried out. Validation of the results is performed using both test load cases from the numerical dataset and application as a constitutive model in the finite element method.
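
Since the abstract describes the training scheme only in words, a brief illustrative sketch may help. The following is a minimal, hypothetical PyTorch example of the general idea of continuously perturbing the input strain paths with the network's current prediction error while training a recurrent surrogate; the GRU architecture, the sign-of-gradient perturbation rule, and all identifiers (RNNConstitutiveModel, self_adversarial_step, alpha) are assumptions made for illustration, not details taken from the paper.

# Minimal sketch (not the authors' implementation): a GRU surrogate mapping
# strain paths to stress paths, trained with a "continuous self-adversarial"
# step in which the current prediction error perturbs the inputs on the fly
# instead of regenerating an augmented dataset between training runs.
import torch
import torch.nn as nn

class RNNConstitutiveModel(nn.Module):
    """GRU surrogate: strain sequence (B, T, n_eps) -> stress sequence (B, T, n_sig)."""
    def __init__(self, n_eps=6, n_sig=6, hidden=64):
        super().__init__()
        self.rnn = nn.GRU(n_eps, hidden, batch_first=True)
        self.out = nn.Linear(hidden, n_sig)

    def forward(self, eps_seq):
        h, _ = self.rnn(eps_seq)
        return self.out(h)

def self_adversarial_step(model, opt, eps_seq, sig_seq, alpha=1e-3):
    """One optimizer step: loss on clean strain paths plus loss on paths
    perturbed along the gradient of the current prediction error."""
    loss_fn = nn.MSELoss()

    # Forward pass on the clean paths to obtain the input gradient of the error.
    eps_in = eps_seq.clone().requires_grad_(True)
    grad, = torch.autograd.grad(loss_fn(model(eps_in), sig_seq), eps_in)

    # Perturb the strain paths in the direction that increases the current error.
    eps_pert = (eps_seq + alpha * grad.sign()).detach()

    # Train on clean and perturbed paths together (continuous augmentation).
    opt.zero_grad()
    loss = loss_fn(model(eps_seq), sig_seq) + loss_fn(model(eps_pert), sig_seq)
    loss.backward()
    opt.step()
    return loss.item()

Read against the abstract, the point of such a sketch is that no separate dataset-regeneration passes are needed: the error-based perturbation is applied within every optimizer step, which is one plausible reading of why the continuous approach is reported to be more training-efficient than the iterative method it is based on.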

Bibliographic details
Published in: Proceedings in applied mathematics and mechanics, 2023-11, Vol. 23 (3), p. n/a
Authors: Khedkar, Abhinav Anil; Stöcker, Julien Philipp; Zschocke, Selina; Kaliske, Michael
Format: Article
Language: English
Online access: Full text
DOI: 10.1002/pamm.202300111
ISSN: 1617-7061
eISSN: 1617-7061
Source: Wiley Online Library Journals
Rights: 2023 The Authors. Published by Wiley‐VCH GmbH (open access).