Variational Bayesian Inference for Mixed Logit Models with Unobserved Inter- and Intra-Individual Heterogeneity
Variational Bayes (VB) methods have emerged as a fast and computationally efficient alternative to Markov chain Monte Carlo (MCMC) methods for Bayesian estimation of mixed logit models. In this paper, we derive a VB method for posterior inference in mixed multinomial logit models with unobserved inter- and intra-individual heterogeneity. The proposed VB method is benchmarked against MCMC in a simulation study. The results suggest that VB is substantially faster than MCMC but also noticeably less accurate, because the mean-field assumption of VB is too restrictive. Future research should therefore focus on enhancing the expressiveness and flexibility of the variational approximation.
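To make the mean-field restriction concrete, a generic sketch of the model structure and the factorized approximation follows; the notation (attributes X_{ntj}, inter-individual mean and covariance \zeta and \Sigma_B, intra-individual covariance \Sigma_W) is illustrative and may differ from the paper's exact specification. In a mixed multinomial logit with both layers of heterogeneity, choice-occasion-specific tastes \beta_{nt} vary around an individual-specific mean \mu_n, which in turn varies across individuals:

\[
P(y_{nt} = j \mid \beta_{nt}) = \frac{\exp\!\left(X_{ntj}^{\top} \beta_{nt}\right)}{\sum_{k} \exp\!\left(X_{ntk}^{\top} \beta_{nt}\right)},
\qquad
\beta_{nt} \sim N(\mu_n, \Sigma_W),
\qquad
\mu_n \sim N(\zeta, \Sigma_B).
\]

A mean-field VB approximation forces the joint posterior over all unknowns into a product of independent factors,

\[
q\!\left(\zeta, \Sigma_B, \Sigma_W, \{\mu_n\}, \{\beta_{nt}\}\right)
= q(\zeta)\, q(\Sigma_B)\, q(\Sigma_W) \prod_{n} q(\mu_n) \prod_{n,t} q(\beta_{nt}),
\]

which by construction ignores posterior dependence between the factors (e.g., between \mu_n and \beta_{nt}); this is the restrictiveness the abstract refers to.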