Scaling Bayesian inference of mixed multinomial logit models to large datasets
Published in: Transportation Research Part B: Methodological, April 2022, Vol. 158, pp. 1-17
Format: Article
Language: English
Online access: Full text
Abstract: Variational inference methods have been shown to lead to significant improvements in the computational efficiency of approximate Bayesian inference in mixed multinomial logit models when compared to standard Markov chain Monte Carlo (MCMC) methods, without increasing estimation bias. However, despite their demonstrated efficiency gains, existing methods still suffer from important limitations that prevent them from scaling to large datasets while retaining the flexibility to allow for rich prior distributions and to capture complex posterior distributions. To effectively scale Bayesian inference in mixed multinomial logit models to large datasets, we propose an amortized variational inference approach that leverages stochastic backpropagation, automatic differentiation and GPU-accelerated computation. Moreover, we show how normalizing flows can be used to increase the flexibility of the variational posterior approximations. Through an extensive simulation study and real data for transport mode choice from London, we empirically show that the proposed approach is able to achieve computational speedups of multiple orders of magnitude over traditional maximum simulated likelihood estimation (MSLE) and MCMC approaches for large datasets without compromising estimation accuracy.
Highlights:
• We propose an Amortized Variational Inference approach for Mixed Multinomial Logit.
• It uses stochastic backpropagation, automatic differentiation and GPU acceleration.
• Normalizing flows can increase the flexibility of the variational approximations.
• Computational speedups of orders of magnitude over MSLE and MCMC approaches.
• Scales to very large datasets without compromising estimation accuracy.
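As a rough illustration of the technique named in the abstract and highlights, the sketch below estimates a Monte Carlo ELBO for a mixed multinomial logit using the reparameterization trick (the mechanism behind stochastic backpropagation), with an amortized variational posterior whose parameters are produced by an encoder from each individual's data. Everything here is a toy assumption for illustration: the problem sizes, the linear encoder (`W_m`, `W_s`), and the standard-normal prior are hypothetical; the paper's actual method uses neural-network encoders, normalizing flows, automatic differentiation and GPU computation, none of which are shown.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(u):
    """Numerically stable softmax over the last axis (alternatives)."""
    u = u - u.max(axis=-1, keepdims=True)
    e = np.exp(u)
    return e / e.sum(axis=-1, keepdims=True)

# Toy data (hypothetical sizes): N individuals, J alternatives, K attributes.
N, J, K = 5, 3, 2
X = rng.normal(size=(N, J, K))   # attributes of each alternative, per person
y = rng.integers(0, J, size=N)   # observed choices

# Amortized variational posterior: a toy *linear* "inference network" maps each
# individual's flattened data to the mean and log-std of q(beta_n) = N(m_n, s_n^2).
W_m = rng.normal(scale=0.1, size=(J * K, K))
W_s = rng.normal(scale=0.1, size=(J * K, K))

def encode(X):
    feats = X.reshape(X.shape[0], -1)
    return feats @ W_m, feats @ W_s            # per-individual (m_n, log s_n)

def elbo_estimate(X, y, n_draws=100, prior_sd=1.0):
    """Monte Carlo ELBO using reparameterized draws beta = m + s * eps."""
    m, log_s = encode(X)
    s = np.exp(log_s)
    total = 0.0
    for _ in range(n_draws):
        eps = rng.normal(size=m.shape)
        beta = m + s * eps                     # reparameterization trick
        utils = np.einsum('njk,nk->nj', X, beta)
        p = softmax(utils)                     # multinomial logit probabilities
        total += np.log(p[np.arange(len(y)), y]).sum()
    loglik = total / n_draws
    # Closed-form KL(q || N(0, prior_sd^2 I)) for diagonal Gaussians.
    kl = 0.5 * np.sum((s**2 + m**2) / prior_sd**2 - 1.0
                      - 2.0 * log_s + 2.0 * np.log(prior_sd))
    return loglik - kl
```

In a full implementation, the ELBO above would be maximized over the encoder weights with a gradient-based optimizer; because the noise `eps` is separated from the variational parameters, gradients flow through the draws, which is what makes stochastic backpropagation (and GPU-parallel evaluation over individuals and draws) possible.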
ISSN: 0191-2615, 1879-2367
DOI: 10.1016/j.trb.2022.01.005