A combined entropy and utility based generative model for large scale multiple discrete-continuous travel behaviour data

01/18/2019
by Melvin Wong, et al.

Generative models, whether based on simple clustering algorithms or deep neural network architectures, have been developed as probabilistic estimation methods for dimension reduction or for modelling the underlying properties of data structures. Although their use has largely been limited to image recognition and classification, generative machine learning algorithms can be a powerful tool for travel behaviour research. In this paper, we examine the generative machine learning approach for analyzing multiple discrete-continuous (MDC) travel behaviour data to understand the underlying heterogeneity and correlation, increasing the representational power of such travel behaviour models. We show that generative models are conceptually similar to the choice selection behaviour process through information entropy and variational Bayesian inference. Specifically, we consider a restricted Boltzmann machine (RBM) based algorithm with a multiple discrete-continuous layer, formulated as a variational Bayesian inference optimization problem. We systematically describe the proposed machine learning algorithm and develop a process for analyzing travel behaviour data from a generative learning perspective. We show parameter stability through model analysis and simulation tests on an open dataset with multiple discrete-continuous dimensions comprising 293,330 observations. For interpretability, we derive analytical methods for conditional probabilities as well as elasticities. Our results indicate that latent variables in generative models can accurately represent the joint distribution consistently with respect to multiple discrete-continuous variables. Lastly, we show that our model can generate statistically similar data distributions for travel forecasting and prediction.
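To make the modelling idea concrete, below is a minimal, hypothetical sketch of an RBM with a mixed discrete-continuous visible layer: Bernoulli units for discrete choice indicators and Gaussian units for continuous quantities, with shared binary latent variables. The paper formulates estimation as a variational Bayesian inference problem; for brevity this sketch uses one-step contrastive divergence instead, so it illustrates the model structure rather than the authors' estimator. All class names, dimensions, and hyperparameters are illustrative assumptions, not taken from the paper.

```python
# Illustrative sketch only: RBM with Bernoulli (discrete) and Gaussian
# (continuous) visible units and a shared binary hidden layer. Trained
# here with one-step contrastive divergence; the paper's estimator is a
# variational Bayesian formulation, which this sketch does not reproduce.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class MixedRBM:
    def __init__(self, n_disc, n_cont, n_hidden, lr=0.01):
        # Separate weight blocks for discrete and continuous visible units
        self.W_d = 0.01 * rng.standard_normal((n_disc, n_hidden))
        self.W_c = 0.01 * rng.standard_normal((n_cont, n_hidden))
        self.b_d = np.zeros(n_disc)    # discrete visible biases
        self.b_c = np.zeros(n_cont)    # continuous visible biases (unit-variance Gaussian units)
        self.c = np.zeros(n_hidden)    # hidden (latent variable) biases
        self.lr = lr

    def hidden_prob(self, v_d, v_c):
        # p(h = 1 | v): both visible blocks drive the shared latent variables
        return sigmoid(v_d @ self.W_d + v_c @ self.W_c + self.c)

    def visible_reconstruct(self, h):
        # Discrete visibles: Bernoulli sample; continuous visibles: Gaussian mean
        p_d = sigmoid(h @ self.W_d.T + self.b_d)
        v_d = (rng.random(p_d.shape) < p_d).astype(float)
        v_c = h @ self.W_c.T + self.b_c
        return v_d, v_c

    def cd1_step(self, v_d, v_c):
        # One step of contrastive divergence on a mini-batch:
        # positive (data) phase minus negative (reconstruction) phase.
        h0 = self.hidden_prob(v_d, v_c)
        h_samp = (rng.random(h0.shape) < h0).astype(float)
        v_d1, v_c1 = self.visible_reconstruct(h_samp)
        h1 = self.hidden_prob(v_d1, v_c1)
        n = len(v_d)
        self.W_d += self.lr * (v_d.T @ h0 - v_d1.T @ h1) / n
        self.W_c += self.lr * (v_c.T @ h0 - v_c1.T @ h1) / n
        self.b_d += self.lr * (v_d - v_d1).mean(axis=0)
        self.b_c += self.lr * (v_c - v_c1).mean(axis=0)
        self.c += self.lr * (h0 - h1).mean(axis=0)

# Hypothetical usage: 5 discrete choice indicators, 2 continuous quantities
# (e.g. trip distance and duration), 8 latent variables, synthetic data.
X_d = (rng.random((256, 5)) < 0.3).astype(float)
X_c = rng.standard_normal((256, 2))
model = MixedRBM(n_disc=5, n_cont=2, n_hidden=8)
for _ in range(100):
    model.cd1_step(X_d, X_c)
```

After training, the hidden activations `model.hidden_prob(X_d, X_c)` act as latent variables summarizing the joint discrete-continuous structure, and repeated reconstruction from sampled hidden states can generate new observations with a statistically similar distribution, which is the role the latent layer plays in the paper's forecasting application.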
