Low Rank Approximation of a Matrix at Sub-linear Cost

07/21/2019
by Victor Y. Pan, et al.

A matrix algorithm runs at sub-linear cost if it uses far fewer flops and memory cells than the input matrix has entries. Such algorithms are indispensable for Big Data mining and analysis, where the input matrices are so immense that one can only access a small fraction of their entries. Typically, however, these matrices admit a Low Rank Approximation (LRA), which one can store and process at sub-linear cost, that is, by using far fewer memory cells and arithmetic operations than the input matrix has entries.

Can we, however, compute an LRA at sub-linear cost? An adversary argument shows that no algorithm running at sub-linear cost can output an accurate LRA of worst-case input matrices, or even of the matrices of the small families of our Appendix; yet for more than a decade Cross-Approximation (C-A) iterations, running at sub-linear cost, have routinely been computing accurate LRA. We partly resolve this long-known contradiction by proving that already a single two-stage C-A loop computes a reasonably close LRA of any matrix that is close to a matrix of sufficiently low rank, provided that the loop begins at a submatrix that shares its numerical rank with the input matrix. We cannot obtain such an initial submatrix for a worst-case input matrix without accessing all or most of its entries, but we have this luck with high probability for a random input matrix, and our chances of success increase with every new C-A iteration. All this should explain the well-known empirical power of C-A iterations applied to real-world inputs.
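To illustrate the sub-linear-cost idea, here is a minimal sketch of a greedy cross-approximation (adaptive cross approximation, ACA) loop. It is not the authors' exact algorithm; the entry-access oracles `get_row` and `get_col` are placeholders for however one reads a single row or column of the hidden matrix. Each step touches only one row and one column plus O((m+n)k) updates, so for small target rank the cost stays far below the m·n entries of the input.

```python
import numpy as np

def cross_approximation(get_row, get_col, n_rows, n_cols, max_rank, seed=0):
    """Greedy cross-approximation sketch: returns factors C, R with A ~ C @ R,
    built from at most `max_rank` rows and columns of the hidden matrix."""
    rng = np.random.default_rng(seed)
    C = np.zeros((n_rows, max_rank))
    R = np.zeros((max_rank, n_cols))
    used_rows = []
    i = int(rng.integers(n_rows))          # random starting row index
    for k in range(max_rank):
        # residual of row i: subtract the contribution of the k terms so far
        r = get_row(i) - C[i, :k] @ R[:k]
        j = int(np.argmax(np.abs(r)))      # pivot column (largest residual entry)
        if abs(r[j]) < 1e-12:              # residual negligible: numerical rank reached
            return C[:, :k], R[:k]
        # residual of column j
        c = get_col(j) - C[:, :k] @ R[:k, j]
        C[:, k] = c / r[j]                 # scale so the cross term is c * r / r[j]
        R[k] = r
        used_rows.append(i)
        # next row pivot: largest residual entry of column c among unused rows
        mask = np.ones(n_rows, dtype=bool)
        mask[used_rows] = False
        i = int(np.argmax(np.abs(c) * mask))
    return C, R

# demo: a 60x40 matrix of exact rank 3 is recovered from 3 rows and 3 columns
rng = np.random.default_rng(1)
A = rng.standard_normal((60, 3)) @ rng.standard_normal((3, 40))
C, R = cross_approximation(lambda i: A[i].copy(), lambda j: A[:, j].copy(),
                           60, 40, 3)
err = np.linalg.norm(A - C @ R) / np.linalg.norm(A)
```

As the abstract notes, such a loop succeeds when its pivots land on a submatrix sharing the numerical rank of the input; a random exact-rank-3 matrix, as in the demo, is such a lucky case, while adversarial inputs can hide their mass from any sub-linear probe.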


Related research:

06/10/2019 — Refinement of Low Rank Approximation of a Matrix at Sub-linear Cost
Low rank approximation (LRA) of a matrix is a hot subject of modern comp...

06/11/2019 — Low Rank Approximation at Sublinear Cost by Means of Subspace Sampling
Low Rank Approximation (LRA) of a matrix is a hot research subject, fund...

06/10/2019 — Low Rank Approximation Directed by Leverage Scores and Computed at Sub-linear Cost
Low rank approximation (LRA) of a matrix is a major subject of matrix an...

06/02/2019 — Sample-Optimal Low-Rank Approximation of Distance Matrices
A distance matrix A ∈ R^n × m represents all pairwise distances, A_ij=d(...

10/30/2019 — Learning-Based Low-Rank Approximations
We introduce a "learning-based" algorithm for the low-rank decomposition...

11/04/2014 — A random algorithm for low-rank decomposition of large-scale matrices with missing entries
A Random SubMatrix method (RSM) is proposed to calculate the low-rank de...

07/16/2021 — Single Pass Entrywise-Transformed Low Rank Approximation
In applications such as natural language processing or computer vision, ...
