Sharper Bounds for ℓ_p Sensitivity Sampling

06/01/2023
by David P. Woodruff et al.

In large-scale machine learning, random sampling is a popular way to approximate a dataset by a small representative subset of examples. In particular, sensitivity sampling is an intensely studied technique that provides provable guarantees on the quality of the approximation while reducing the number of examples to the product of the VC dimension d and the total sensitivity 𝔖 in remarkably general settings. However, guarantees going beyond this general 𝔖d bound are known in perhaps only one setting, for ℓ_2 subspace embeddings, despite intense study of sensitivity sampling in prior work. In this work, we show the first bounds for sensitivity sampling for ℓ_p subspace embeddings with p ≠ 2 that improve over the general 𝔖d bound, achieving a bound of roughly 𝔖^{2/p} for 1 ≤ p < 2 and 𝔖^{2-2/p} for 2 < p < ∞. For 1 ≤ p < 2, we show that this bound is tight, in the sense that there exist matrices for which 𝔖^{2/p} samples are necessary. Furthermore, our techniques yield further new results in the study of sampling algorithms: we show that the root leverage score sampling algorithm achieves a bound of roughly d for 1 ≤ p < 2, and that a combination of leverage score and sensitivity sampling achieves an improved bound of roughly d^{2/p} 𝔖^{2-4/p} for 2 < p < ∞. Our sensitivity sampling results yield the best known sample complexity for a wide class of structured matrices that have small ℓ_p sensitivity.
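For context, the sketch below illustrates the general shape of ℓ_p sensitivity sampling for 1 ≤ p ≤ 2; it is not the algorithm analyzed in the paper. Each row is kept with probability proportional to an upper bound on its ℓ_p sensitivity, and kept rows are rescaled so that the sampled matrix preserves ||Ax||_p^p in expectation. The use of τ_i^{p/2} (a power of the ℓ_2 leverage score, the "root leverage score" at p = 1) as the sensitivity upper bound, and the oversampling constant, are illustrative assumptions rather than the paper's choices.

```python
import numpy as np

def lp_sensitivity_sample(A, p=1.5, oversample=20.0, seed=0):
    """Minimal sketch of l_p sensitivity sampling for 1 <= p <= 2.

    Uses tau_i^(p/2), a power of the l_2 leverage score, as an upper bound on
    the l_p sensitivity of row i, keeps each row independently with probability
    proportional to that bound, and rescales kept rows so that ||(SA)x||_p^p is
    an unbiased estimate of ||Ax||_p^p for every x.  The oversampling constant
    is a placeholder, not a bound from the paper.
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape

    # l_2 leverage scores: tau_i = ||U_i||_2^2 for the thin SVD A = U S V^T.
    U, _, _ = np.linalg.svd(A, full_matrices=False)
    tau = np.sum(U * U, axis=1)

    # Upper bound on the l_p sensitivity for p <= 2: s_i <= tau_i^(p/2).
    s = tau ** (p / 2.0)

    # Keep row i with probability q_i and reweight by q_i^(-1/p), so each
    # row's expected contribution to ||(SA)x||_p^p equals |a_i^T x|^p.
    q = np.minimum(1.0, oversample * s)
    keep = rng.random(n) < q
    SA = A[keep] / q[keep, None] ** (1.0 / p)
    return SA, keep

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((5000, 10))
    SA, keep = lp_sensitivity_sample(A, p=1.5)
    x = rng.standard_normal(10)
    full = np.sum(np.abs(A @ x) ** 1.5)
    approx = np.sum(np.abs(SA @ x) ** 1.5)
    print(keep.sum(), "rows kept; ratio", approx / full)
```

The sample size such a scheme needs to guarantee a subspace embedding (preserving ||Ax||_p for all x simultaneously, not just one fixed x) is exactly the quantity the paper's bounds control.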


