Abstract
Fitting stochastic input-process models to data and then sampling from them are key steps in a simulation study but highly challenging to non-experts. We present Neural Input Modeling (NIM), a Generative Neural Network (GNN) framework that exploits modern data-rich environments to automatically capture simulation input processes and then generate samples from them. The basic GNN that we develop, called NIM-VL, comprises (i) a variational autoencoder architecture that learns the probability distribution of the input data while avoiding overfitting and (ii) long short-term memory components that concisely capture statistical dependencies across time. We show how the basic GNN architecture can be modified to exploit known distributional properties—such as independent and identically distributed structure, nonnegativity, and multimodality—to increase accuracy and speed, as well as to handle multivariate processes, categorical-valued processes, and extrapolation beyond the training data for certain nonstationary processes. We also introduce an extension to NIM called Conditional Neural Input Modeling (CNIM), which can learn from training data obtained under various realizations of a (possibly time series valued) stochastic “condition,” such as temperature or inflation rate, and then generate sample paths given a value of the condition not seen in the training data. This enables users to simulate a system under a specific working condition by customizing a pre-trained model; CNIM also facilitates what-if analysis. Extensive experiments show the efficacy of our approach. NIM can thus help overcome one of the key barriers to simulation for non-experts.
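To make the generation step concrete, here is a minimal pure-Python sketch of how a sequence-VAE generator of the kind the abstract describes produces a sample path: draw a latent vector from the standard normal prior, then unroll a decoder that maps the latent vector and the previous output to the next output. This is an illustrative toy, not the authors' NIM-VL implementation; the weights, the single-unit recurrence, and the `tanh` nonlinearity stand in for a trained LSTM decoder.

```python
import math
import random

random.seed(0)

def sample_latent(dim):
    """Draw z ~ N(0, I): generation in a VAE starts by sampling the prior."""
    return [random.gauss(0.0, 1.0) for _ in range(dim)]

def decoder_step(z, prev, w_z, w_prev, b):
    """Toy stand-in for one LSTM decoder step: combine the latent vector
    and the previous output into the next output (tanh bounds the value)."""
    s = b + w_prev * prev + sum(wi * zi for wi, zi in zip(w_z, z))
    return math.tanh(s)

def generate_path(length, latent_dim=4):
    """Generate one sample path from a single latent draw, mimicking the
    decode phase of a sequence VAE.  All weights here are arbitrary
    placeholders, not learned parameters."""
    z = sample_latent(latent_dim)
    w_z = [random.uniform(-1.0, 1.0) for _ in range(latent_dim)]
    w_prev, b = 0.5, 0.1
    path, prev = [], 0.0
    for _ in range(length):
        prev = decoder_step(z, prev, w_z, w_prev, b)
        path.append(prev)
    return path

path = generate_path(10)
```

Note the key property the sketch illustrates: the entire path is a deterministic function of one latent draw `z`, so temporal dependence across the path comes from the recurrent decoder rather than from independent per-step noise. A conditional variant in the spirit of CNIM would additionally feed a condition value (e.g., temperature) into each `decoder_step`.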
NIM: Generative Neural Networks for Automated Modeling and Generation of Simulation Inputs