DinTucker: Scaling up Gaussian process models on multidimensional arrays with billions of elements

11/12/2013
by Shandian Zhe, et al.

Infinite Tucker Decomposition (InfTucker) and random function prior models, as nonparametric Bayesian models on infinite exchangeable arrays, are more powerful than widely used multilinear factorization methods such as Tucker and PARAFAC decomposition, partly because they can model nonlinear relationships between array elements. Despite their strong predictive performance and sound theoretical foundations, they cannot handle massive data because of their prohibitive training time. To overcome this limitation, we present Distributed Infinite Tucker (DinTucker), a large-scale nonlinear tensor decomposition algorithm on MapReduce. DinTucker scales to massive data while maintaining the predictive accuracy of InfTucker. It is based on a new hierarchical Bayesian model that enables local training of InfTucker on subarrays and the integration of information from all local training results. We use distributed stochastic gradient descent, coupled with variational inference, to train this model. We apply DinTucker to multidimensional arrays with billions of elements from applications in the "Read the Web" project (Carlson et al., 2010) and in information security, and compare it with the state-of-the-art large-scale tensor decomposition method, GigaTensor. On both datasets, DinTucker achieves significantly higher prediction accuracy in less computation time.
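To make the map/reduce structure concrete, here is a minimal sketch of distributed SGD over subarrays. It is not the authors' method: DinTucker trains nonlinear GP-based InfTucker factors with variational inference, whereas this toy uses a plain linear CP squared-error loss so the example stays short and runnable. The pattern is the same shape, though: each "worker" runs SGD on its own subarray of the tensor, and a reduce step integrates the local factor estimates (here by simple averaging, a crude stand-in for the paper's hierarchical Bayesian integration). All names and sizes below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy ground-truth rank-2 CP tensor. (DinTucker itself decomposes
# nonlinear GP-based InfTucker models; the linear CP loss here is
# only a stand-in to keep the sketch self-contained.)
I, J, K, R = 8, 8, 8, 2
A0, B0, C0 = (rng.standard_normal((d, R)) for d in (I, J, K))
X = np.einsum('ir,jr,kr->ijk', A0, B0, C0)

def sgd_on_subarray(A, B, C, idx, lr=0.01, steps=50):
    """One worker: run SGD on a local copy of the factors,
    touching only the entries of its assigned subarray."""
    A, B, C = A.copy(), B.copy(), C.copy()
    ii, jj, kk = idx
    for _ in range(steps):
        for i in ii:
            for j in jj:
                for k in kk:
                    err = A[i] @ (B[j] * C[k]) - X[i, j, k]
                    gA = err * B[j] * C[k]
                    gB = err * A[i] * C[k]
                    gC = err * A[i] * B[j]
                    A[i] -= lr * gA
                    B[j] -= lr * gB
                    C[k] -= lr * gC
    return A, B, C

# Global factors, broadcast to every worker each round (the "map" step).
A, B, C = (0.1 * rng.standard_normal((d, R)) for d in (I, J, K))
init_err = np.linalg.norm(np.einsum('ir,jr,kr->ijk', A, B, C) - X) \
           / np.linalg.norm(X)

n_rounds, n_workers = 30, 4
for _ in range(n_rounds):
    results = []
    for _w in range(n_workers):
        # Each worker samples a random subarray of the full tensor.
        idx = tuple(rng.choice(d, size=4, replace=False) for d in (I, J, K))
        results.append(sgd_on_subarray(A, B, C, idx))
    # "Reduce": integrate local results by averaging (a crude stand-in
    # for DinTucker's hierarchical-Bayesian information integration).
    A = np.mean([r[0] for r in results], axis=0)
    B = np.mean([r[1] for r in results], axis=0)
    C = np.mean([r[2] for r in results], axis=0)

recon = np.einsum('ir,jr,kr->ijk', A, B, C)
rel_err = np.linalg.norm(recon - X) / np.linalg.norm(X)
print(f"relative error: {init_err:.3f} -> {rel_err:.3f}")
```

Averaging local factors is safe here only because every worker restarts each round from the same global factors and takes a few small steps, so the local solutions stay close; the paper's hierarchical prior plays that coupling role in a principled way.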


