NIM: Generative Neural Networks for Automated Modeling and Generation of Simulation Inputs

Fitting stochastic input-process models to data and then sampling from them are key steps in a simulation study but highly challenging to non-experts. We present Neural Input Modeling (NIM), a Generative Neural Network (GNN) framework that exploits modern data-rich environments to automatically capture simulation input processes and then generate samples from them. The basic GNN that we develop, called NIM-VL, comprises (i) a variational autoencoder architecture that learns the probability distribution of the input data while avoiding overfitting and (ii) long short-term memory components that concisely capture statistical dependencies across time. We show how the basic GNN architecture can be modified to exploit known distributional properties—such as independent and identically distributed structure, nonnegativity, and multimodality—to increase accuracy and speed, as well as to handle multivariate processes, categorical-valued processes, and extrapolation beyond the training data for certain nonstationary processes. We also introduce an extension to NIM called Conditional Neural Input Modeling (CNIM), which can learn from training data obtained under various realizations of a (possibly time series valued) stochastic “condition,” such as temperature or inflation rate, and then generate sample paths given a value of the condition not seen in the training data. This enables users to simulate a system under a specific working condition by customizing a pre-trained model; CNIM also facilitates what-if analysis. Extensive experiments show the efficacy of our approach. NIM can thus help overcome one of the key barriers to simulation for non-experts.
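
To make the abstract's architecture concrete, here is a minimal, hypothetical sketch of a sequence VAE with an LSTM encoder and decoder, in the spirit of NIM-VL. This is not the authors' implementation; the class name, layer sizes, and loss details are illustrative assumptions, written in PyTorch.

```python
# Illustrative sketch (not the paper's code): a sequence VAE whose LSTM
# encoder compresses a path into a latent z, and whose LSTM decoder
# reconstructs the path, so fresh z ~ N(0, I) draws yield new sample paths.
import torch
import torch.nn as nn

class SeqVAE(nn.Module):
    def __init__(self, in_dim=1, hid=64, z_dim=8):
        super().__init__()
        self.enc = nn.LSTM(in_dim, hid, batch_first=True)
        self.mu = nn.Linear(hid, z_dim)
        self.logvar = nn.Linear(hid, z_dim)
        self.z2h = nn.Linear(z_dim, hid)
        self.dec = nn.LSTM(in_dim, hid, batch_first=True)
        self.out = nn.Linear(hid, in_dim)

    def forward(self, x):                        # x: (batch, T, in_dim)
        _, (h, _) = self.enc(x)                  # last hidden state summarizes x
        mu, logvar = self.mu(h[-1]), self.logvar(h[-1])
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterize
        h0 = torch.tanh(self.z2h(z)).unsqueeze(0)
        # Decode with teacher forcing: feed the path shifted right by one step.
        dec_in = torch.cat([torch.zeros_like(x[:, :1]), x[:, :-1]], dim=1)
        y, _ = self.dec(dec_in, (h0, torch.zeros_like(h0)))
        return self.out(y), mu, logvar

def elbo_loss(recon, x, mu, logvar):
    rec = ((recon - x) ** 2).sum()               # Gaussian likelihood, up to constants
    kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum()
    return rec + kl                              # negative ELBO

x = torch.randn(32, 20, 1)                       # 32 toy paths of length 20
model = SeqVAE()
recon, mu, logvar = model(x)
elbo_loss(recon, x, mu, logvar).backward()       # one illustrative training step
```

The KL term is what keeps the learned latent distribution close to a standard normal, which is how a VAE resists simply memorizing (overfitting) the training paths, while the LSTM layers carry the statistical dependencies across time that the abstract mentions.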

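The conditional extension (CNIM) can be sketched similarly: a condition vector is fed to the decoder alongside the latent draw, so sample paths can be generated for condition values not seen in training. The sketch below shows only the sampling side under that assumption; the class name, start-token convention, and shapes are hypothetical, not taken from the paper.

```python
# Hypothetical sketch of CNIM-style conditional sampling: a latent draw z and
# a condition vector c jointly seed an LSTM decoder, so an unseen condition
# (e.g., a new temperature) yields new sample paths.
import torch
import torch.nn as nn

class CondDecoder(nn.Module):
    def __init__(self, out_dim=1, cond_dim=1, hid=64, z_dim=8):
        super().__init__()
        self.z_dim = z_dim
        self.z2h = nn.Linear(z_dim + cond_dim, hid)
        self.lstm = nn.LSTM(out_dim + cond_dim, hid, batch_first=True)
        self.out = nn.Linear(hid, out_dim)

    @torch.no_grad()
    def sample(self, cond, n_paths=100, horizon=20):
        z = torch.randn(n_paths, self.z_dim)     # latent draws ~ N(0, I)
        c = cond.expand(n_paths, -1)             # broadcast the condition
        h = torch.tanh(self.z2h(torch.cat([z, c], dim=-1))).unsqueeze(0)
        state = (h, torch.zeros_like(h))
        x = torch.zeros(n_paths, 1, 1)           # zero start token
        steps = []
        for _ in range(horizon):                 # autoregressive rollout
            y, state = self.lstm(torch.cat([x, c.unsqueeze(1)], dim=-1), state)
            x = self.out(y)
            steps.append(x)
        return torch.cat(steps, dim=1)           # (n_paths, horizon, out_dim)

# e.g., 100 paths at a (hypothetical) condition value of 35.0 not seen in training
paths = CondDecoder().sample(cond=torch.tensor([[35.0]]))
```

In actual use, the decoder would first be trained jointly with a condition-aware encoder; the sketch shows only sampling, which is what lets a pre-trained model be customized to a specific working condition for what-if analysis.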
Bibliographic Details
Published in: ACM Transactions on Modeling and Computer Simulation, 2023-08, Vol. 33 (3), p. 1-26, Article 10
Main authors: Cen, Wang; Haas, Peter J.
Format: Article
Language: English
Subjects: Computing methodologies; Modeling methodologies
Online access: Full text
DOI: 10.1145/3592790
ISSN: 1049-3301
EISSN: 1558-1195
Publisher: ACM, New York, NY, USA
Source: ACM Digital Library Complete
URL: https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-21T09%3A04%3A52IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-acm_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=NIM:%20Generative%20Neural%20Networks%20for%20Automated%20Modeling%20and%20Generation%20of%20Simulation%20Inputs&rft.jtitle=ACM%20transactions%20on%20modeling%20and%20computer%20simulation&rft.au=Cen,%20Wang&rft.date=2023-08-10&rft.volume=33&rft.issue=3&rft.spage=1&rft.epage=26&rft.pages=1-26&rft.artnum=10&rft.issn=1049-3301&rft.eissn=1558-1195&rft_id=info:doi/10.1145/3592790&rft_dat=%3Cacm_cross%3E3592790%3C/acm_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rfr_iscdi=true