Low Rank Approximation Directed by Leverage Scores and Computed at Sub-linear Cost

06/10/2019
by Victor Y. Pan, et al.

Low-rank approximation (LRA) of a matrix is a major subject of matrix and tensor computations as well as data mining and analysis. It is desirable (and even imperative in applications to Big Data) to solve the problem at sub-linear cost, that is, using far fewer memory cells and arithmetic operations than the input matrix has entries; however, this is impossible even for the small matrix family of our Appendix. Nevertheless, we prove that it is possible with high probability (whp) for random matrices admitting LRA. Namely, we recall the known randomized algorithms that whp solve the LRA problem for any matrix admitting LRA by relying on the computation of so-called leverage scores. That computation has super-linear cost, but we simplify the solution algorithm and run it at sub-linear cost by trivializing the computation of leverage scores. We then prove that whp the resulting algorithms output accurate LRA of a random input matrix admitting LRA.
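The idea sketched in the abstract can be illustrated with a minimal NumPy experiment. This is not the paper's algorithm: it is a hedged toy sketch, assuming a small synthetic matrix that admits LRA, contrasting column sampling directed by exact leverage scores (whose computation via SVD has super-linear cost) with "trivialized" uniform sampling, which requires reading only the sampled entries and is therefore compatible with a sub-linear cost budget. All sizes and names here are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical test input: a random m x n matrix admitting a rank-k LRA
# (rank-k product plus tiny noise).
m, n, k = 200, 150, 5
A = rng.standard_normal((m, k)) @ rng.standard_normal((k, n))
A += 1e-6 * rng.standard_normal((m, n))

# Exact column leverage scores: squared column norms of the top-k right
# singular vectors. Computing them needs the full matrix (super-linear cost).
_, _, Vt = np.linalg.svd(A, full_matrices=False)
scores = np.sum(Vt[:k] ** 2, axis=0)   # the k leverage scores sum to k
probs = scores / scores.sum()

s = 4 * k  # number of sampled columns

# (a) Sampling directed by leverage scores.
cols_lev = rng.choice(n, size=s, replace=False, p=probs)
# (b) "Trivialized" scores: uniform sampling, no scores computed from A.
cols_uni = rng.choice(n, size=s, replace=False)

def column_lra(A, cols):
    """Project A onto the span of the sampled columns (CUR-style LRA)."""
    C = A[:, cols]
    return C @ np.linalg.pinv(C) @ A

err_lev = np.linalg.norm(A - column_lra(A, cols_lev)) / np.linalg.norm(A)
err_uni = np.linalg.norm(A - column_lra(A, cols_uni)) / np.linalg.norm(A)
```

For such a random input, both relative errors are tiny with high probability, in line with the abstract's claim that trivialized (uniform) scores suffice whp for random matrices admitting LRA, even though adversarial inputs can defeat any sub-linear cost method.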
