Block-Randomized Stochastic Proximal Gradient for Low-Rank Tensor Factorization

01/16/2019
by Xiao Fu, et al.

This work considers the problem of computing the canonical polyadic decomposition (CPD) of large tensors. Prior works mostly leverage data sparsity to handle this problem, an approach that is not suitable for dense tensors, which often arise in applications such as medical imaging, computer vision, and remote sensing. Stochastic optimization is known for its low memory cost and low per-iteration complexity when handling dense data. However, existing stochastic CPD algorithms cannot easily incorporate the variety of constraints and regularizations that are of interest in signal and data analytics, and the convergence properties of many such algorithms are unclear. In this work, we propose a stochastic optimization framework for large-scale CPD with constraints/regularizations. The framework operates in a doubly randomized fashion and can be regarded as a judicious combination of randomized block coordinate descent (BCD) and stochastic proximal gradient (SPG). The algorithm enjoys lightweight updates and a small memory footprint, and thus scales well. In addition, the framework entails considerable flexibility: many frequently used regularizers and constraints can be readily handled under the proposed scheme. The approach is also supported by convergence analysis. Numerical results on large-scale dense tensors are presented to showcase the effectiveness of the proposed approach.
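
As a rough illustration, the "doubly randomized" scheme can be read as follows: at each iteration, draw a mode (block) at random, sample a small batch of fibers of that mode, and update the corresponding factor with a stochastic proximal gradient step. The sketch below follows that reading for a 3-way nonnegative CPD with a plain least-squares fitting term, a constant step size, and a nonnegativity prox; these choices, the function name brspg_cpd, and all parameter values are illustrative assumptions, not the paper's exact algorithm.

```python
# Minimal sketch (an assumed reading of the abstract, not the paper's exact method):
# block-randomized, fiber-sampled stochastic proximal gradient for a 3-way CPD.
import numpy as np

def brspg_cpd(X, R, n_iters=2000, batch=64, step=0.05, seed=0):
    """Illustrative rank-R nonnegative CPD of a dense 3-way array X."""
    rng = np.random.default_rng(seed)
    dims = X.shape
    A = [rng.random((I, R)) for I in dims]           # random nonnegative factor init

    for _ in range(n_iters):
        n = int(rng.integers(3))                     # randomly pick a block (mode)
        others = [m for m in range(3) if m != n]

        # sample a batch of mode-n fibers by drawing indices of the other two modes
        idx = [rng.integers(dims[m], size=batch) for m in others]

        # sampled rows of the Khatri-Rao product of the other factors
        # (elementwise product of the selected rows), shape: batch x R
        H = A[others[0]][idx[0]] * A[others[1]][idx[1]]

        # gather the sampled fibers into an I_n x batch matrix
        if n == 0:
            Xs = X[:, idx[0], idx[1]]
        elif n == 1:
            Xs = X[idx[0], :, idx[1]].T
        else:
            Xs = X[idx[0], idx[1], :].T

        # stochastic gradient of the least-squares fitting term for block n
        G = (A[n] @ H.T - Xs) @ H / batch

        # proximal step; here the prox of the nonnegativity constraint
        A[n] = np.maximum(A[n] - step * G, 0.0)

    return A
```

For example, A1, A2, A3 = brspg_cpd(np.random.rand(50, 60, 70), R=10) returns nonnegative factor estimates; replacing the np.maximum line with another proximal operator (e.g., soft-thresholding for an l1 regularizer) is one way the flexibility mentioned in the abstract could be exercised.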

