Extreme Stochastic Variational Inference: Distributed and Asynchronous

05/31/2016
by Jiong Zhang, et al.

We propose extreme stochastic variational inference (ESVI), an asynchronous and lock-free algorithm to perform variational inference on massive real-world datasets. Stochastic variational inference (SVI), the state-of-the-art algorithm for scaling variational inference to large datasets, is inherently serial. Moreover, it requires the parameters to fit in the memory of a single processor; this is problematic when the number of parameters is in the billions. ESVI overcomes these limitations by requiring that each processor only access a subset of the data and a subset of the parameters, thus providing data and model parallelism simultaneously. We demonstrate the effectiveness of ESVI by running Latent Dirichlet Allocation (LDA) on UMBC-3B, a dataset with a vocabulary of 3 million words and 3 billion tokens. To the best of our knowledge, this is an order of magnitude larger than the largest dataset on which results using variational inference have been reported in the literature. In our experiments, we found that ESVI outperforms VI and SVI, and also achieves a better-quality solution. In addition, we propose a strategy to speed up computation and save memory when fitting a large number of topics.
