Statistical and Computational Trade-Offs in Kernel K-Means

08/27/2019
by   Daniele Calandriello, et al.

We investigate the efficiency of k-means in terms of both statistical and computational requirements. More precisely, we study a Nyström approach to kernel k-means. We analyze the statistical properties of the proposed method and show that it achieves the same accuracy as exact kernel k-means with only a fraction of the computation. Indeed, we prove under basic assumptions that sampling √(n) Nyström landmarks greatly reduces computational costs without incurring any loss of accuracy. To the best of our knowledge, this is the first result of this kind for unsupervised learning.
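The idea described above can be sketched in a few lines of NumPy: sample m ≈ √n landmark points, build the Nyström embedding Z = K_nm K_mm^{-1/2}, and run ordinary Lloyd's k-means on the m-dimensional embedded points instead of on the full n × n kernel matrix. This is a minimal illustration under our own assumptions (uniform landmark sampling, an RBF kernel, and helper names such as `nystrom_features` that we introduce here), not the authors' implementation.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel matrix: exp(-gamma * ||x - y||^2)
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def nystrom_features(X, m, gamma=1.0, seed=0):
    # Sample m landmarks uniformly without replacement and build
    # the Nystrom embedding Z = K_nm @ K_mm^{-1/2}, so Z Z^T ~ K.
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=m, replace=False)
    L = X[idx]
    K_mm = rbf_kernel(L, L, gamma)
    K_nm = rbf_kernel(X, L, gamma)
    w, V = np.linalg.eigh(K_mm)
    w = np.maximum(w, 1e-12)          # clamp for numerical stability
    return K_nm @ (V / np.sqrt(w))    # n x m feature map

def nystrom_kmeans(X, k, m, gamma=1.0, iters=50, seed=0):
    # Approximate kernel k-means: Lloyd's algorithm on the embedding.
    Z = nystrom_features(X, m, gamma, seed)
    rng = np.random.default_rng(seed)
    centers = Z[rng.choice(len(Z), size=k, replace=False)].copy()
    for _ in range(iters):
        labels = ((Z[:, None, :] - centers[None]) ** 2).sum(-1).argmin(1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = Z[labels == j].mean(0)
    return labels
```

With m = √n landmarks the per-iteration cost drops from O(n²) to O(n√n), which is the computational saving the paper quantifies; when m = n the embedding reproduces the exact kernel matrix.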

Related research

- Nearly Optimal Risk Bounds for Kernel K-Means (03/09/2020): In this paper, we study the statistical properties of the kernel k-means...
- Gain with no Pain: Efficient Kernel-PCA by Nyström Sampling (07/11/2019): In this paper, we propose and study a Nyström based approach to efficien...
- Kernel Ridge Regression Using Importance Sampling with Application to Seismic Response Prediction (09/19/2020): Scalable kernel methods, including kernel ridge regression, often rely o...
- Kernel k-Means, By All Means: Algorithms and Strong Consistency (11/12/2020): Kernel k-means clustering is a powerful tool for unsupervised learning o...
- On Kernel Derivative Approximation with Random Fourier Features (10/11/2018): Random Fourier features (RFF) represent one of the most popular and wide...
- Generalization Properties of Learning with Random Features (02/14/2016): We study the generalization properties of ridge regression with random f...
- LCS Graph Kernel Based on Wasserstein Distance in Longest Common Subsequence Metric Space (12/07/2020): For graph classification tasks, many methods use a common strategy to ag...