Communication-efficient distributed eigenspace estimation

09/05/2020
by   Vasileios Charisopoulos, et al.

Distributed computing is a standard way to scale up machine learning and data science algorithms to process large amounts of data. In such settings, avoiding communication among machines is paramount for achieving high performance. Rather than distribute the computation of existing algorithms, a common practice for avoiding communication is to compute local solutions or parameter estimates on each machine and then combine the results; in many convex optimization problems, even simple averaging of local solutions can work well. However, these schemes break down when the local solutions are not unique. Spectral methods give rise to a family of such problems: their solutions are orthonormal bases of the leading invariant subspace of an associated data matrix, which are unique only up to rotations and reflections. Here, we develop a communication-efficient distributed algorithm for computing the leading invariant subspace of a data matrix. Our algorithm uses a novel alignment scheme that minimizes the Procrustean distance between local solutions and a reference solution, and requires only a single round of communication. For the important case of principal component analysis (PCA), we show that our algorithm achieves an error rate similar to that of a centralized estimator. We present numerical experiments demonstrating the efficacy of our proposed algorithm for distributed PCA, as well as other problems where solutions exhibit rotational symmetry, such as node embeddings for graph data and spectral initialization for quadratic sensing.
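The align-then-average idea in the abstract can be sketched in a few lines of NumPy. This is a hedged illustration, not the paper's exact algorithm: it assumes each machine has already computed a local orthonormal basis, aligns each basis to a reference via the classical orthogonal Procrustes solution (an SVD), and averages the aligned bases before re-orthonormalizing. The function names `procrustes_align` and `distributed_eigenspace` are illustrative, not from the paper.

```python
import numpy as np

def procrustes_align(V_local, V_ref):
    """Rotate/reflect V_local to best match V_ref in Frobenius norm.

    Solves min_{Z orthogonal} ||V_local @ Z - V_ref||_F; the classical
    orthogonal Procrustes solution is Z = U @ Wt, where U, Wt come from
    the SVD of V_local.T @ V_ref.
    """
    U, _, Wt = np.linalg.svd(V_local.T @ V_ref)
    return V_local @ (U @ Wt)

def distributed_eigenspace(local_bases, V_ref):
    """Combine local subspace estimates in one communication round.

    Each machine ships its aligned basis once; the coordinator averages
    and re-orthonormalizes (QR) to produce the final subspace estimate.
    """
    aligned = [procrustes_align(V, V_ref) for V in local_bases]
    avg = np.mean(aligned, axis=0)
    Q, _ = np.linalg.qr(avg)  # project the average back onto orthonormal bases
    return Q
```

Note that naive averaging without the alignment step can cancel out the signal entirely: if one machine returns a basis V and another returns -V (both valid, since the subspace is invariant under reflections), their average is zero. Aligning to a common reference first removes this ambiguity.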

Related research

05/06/2020  A Communication-Efficient Distributed Algorithm for Kernel Principal Component Analysis
    Principal Component Analysis (PCA) is a fundamental technology in machin...

04/05/2020  Distributed Estimation for Principal Component Analysis: a Gap-free Approach
    The growing size of modern data sets brings many challenges to the exist...

01/05/2021  A Linearly Convergent Algorithm for Distributed Principal Component Analysis
    Principal Component Analysis (PCA) is the workhorse tool for dimensional...

10/27/2021  Distributed Principal Component Analysis with Limited Communication
    We study efficient distributed algorithms for the fundamental problem of...

10/10/2022  Spectral Sparsification for Communication-Efficient Collaborative Rotation and Translation Estimation
    We propose fast and communication-efficient distributed algorithms for r...

07/18/2019  Federated PCA with Adaptive Rank Estimation
    In many online machine learning and data science tasks such as data summ...

09/02/2017  Communication-efficient Algorithm for Distributed Sparse Learning via Two-way Truncation
    We propose a communicationally and computationally efficient algorithm f...
