Spectral approximations in machine learning

07/21/2011
by Darren Homrighausen et al.

In many areas of machine learning, it becomes necessary to find the eigenvector decompositions of large matrices. We discuss two methods for reducing the computational burden of spectral decompositions: the more venerable Nyström extension and a newly introduced algorithm based on random projections. Previous work has centered on how well each method reconstructs the original matrix. We argue that a more interesting and relevant comparison is their relative performance in clustering and classification tasks that use the approximate eigenvectors as features. We demonstrate that performance is task-specific and depends on the rank of the approximation.
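To make the comparison concrete, here is a minimal NumPy sketch of the two approximation schemes the abstract refers to: a uniform-sampling Nyström extension and a Gaussian random-projection eigensolver in the style of Halko, Martinsson and Tropp. The function names, the uniform column sampling, the rank-deficiency guard, and the oversampling parameter are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def nystrom_eigs(K, m, rng=None):
    """Nystrom extension: approximate the top eigenpairs of a symmetric
    PSD matrix K from m uniformly sampled columns (an assumed sampling
    scheme; the paper may use a different one)."""
    rng = np.random.default_rng(rng)
    n = K.shape[0]
    idx = rng.choice(n, size=m, replace=False)
    C = K[:, idx]                            # n x m block of sampled columns
    W = C[idx, :]                            # m x m intersection block
    vals, vecs = np.linalg.eigh(W)           # eigh returns ascending order
    vals, vecs = vals[::-1], vecs[:, ::-1]   # sort descending
    keep = vals > 1e-12                      # guard against rank deficiency in W
    # Extend the sample eigenvectors to all n rows and rescale so the
    # columns are approximately unit norm.
    U = np.sqrt(m / n) * (C @ vecs[:, keep]) / vals[keep]
    return (n / m) * vals[keep], U

def randomized_eigs(K, k, oversample=10, rng=None):
    """Random-projection eigensolver: sketch the range of K with a
    Gaussian test matrix, then solve a small projected problem."""
    rng = np.random.default_rng(rng)
    n = K.shape[0]
    Omega = rng.standard_normal((n, k + oversample))
    Q, _ = np.linalg.qr(K @ Omega)   # orthonormal basis for range(K @ Omega)
    B = Q.T @ K @ Q                  # small symmetric projection of K
    vals, vecs = np.linalg.eigh(B)
    vals, vecs = vals[::-1], vecs[:, ::-1]
    return vals[:k], Q @ vecs[:, :k]
```

In a spectral clustering or classification pipeline, either routine would be used the same way: the rows of the returned eigenvector matrix stand in for the exact spectral features, for example as input to k-means.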


Related research

Bidiagonal Decompositions of Vandermonde-Type Matrices of Arbitrary Rank (10/18/2022)
We present a method to derive new explicit expressions for bidiagonal de...

Generalized Pseudoskeleton Decompositions (06/29/2022)
We characterize some variations of pseudoskeleton (also called CUR) deco...

Mode-wise Tensor Decompositions: Multi-dimensional Generalizations of CUR Decompositions (03/19/2021)
Low rank tensor approximation is a fundamental tool in modern machine le...

Stability of Sampling for CUR Decompositions (01/08/2020)
This article studies how to form CUR decompositions of low-rank matrices...

Invariant polynomials and machine learning (04/26/2021)
We present an application of invariant polynomials in machine learning. ...

Efficient GPU implementation of randomized SVD and its applications (10/05/2021)
Matrix decompositions are ubiquitous in machine learning, including appl...
