Efficient Dataset Distillation Using Random Feature Approximation

10/21/2022
by Noel Loo, et al.

Dataset distillation compresses large datasets into smaller synthetic coresets that retain performance, with the aim of reducing the storage and computational burden of processing the entire dataset. Today's best-performing algorithm, Kernel Inducing Points (KIP), which makes use of the correspondence between infinite-width neural networks and kernel ridge regression, is prohibitively slow because it computes the neural tangent kernel matrix exactly, scaling as O(|S|^2), where |S| is the coreset size. To improve this, we propose a novel algorithm that uses a random feature approximation (RFA) of the Neural Network Gaussian Process (NNGP) kernel, which reduces the kernel matrix computation to O(|S|). Our algorithm provides at least a 100-fold speedup over KIP and can run on a single GPU. Our new method, termed RFA Distillation (RFAD), performs competitively with KIP and other dataset condensation algorithms in accuracy over a range of large-scale datasets, both in kernel regression and when training finite-width networks. We demonstrate the effectiveness of our approach on tasks involving model interpretability and privacy preservation.
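To make the complexity claim concrete, here is a minimal sketch of the random-feature idea, not the paper's RFAD implementation: the feature map (a single random ReLU layer), the feature width, and all function names below are illustrative assumptions. Inputs are pushed through randomly initialized weights, and kernel ridge regression is solved in the resulting feature space, so the |S| x |S| kernel matrix is never formed and the cost grows linearly with the coreset size.

```python
import numpy as np

def sample_feature_weights(d_in, n_features, rng):
    # One random hidden layer, shared between coreset and query inputs.
    return rng.standard_normal((n_features, d_in)) / np.sqrt(d_in)

def relu_features(X, W):
    # phi(x) = sqrt(2/D) * relu(W x): Monte Carlo features whose inner
    # products approximate a single-hidden-layer ReLU NNGP (arc-cosine)
    # kernel as the number of features D grows.
    D = W.shape[0]
    return np.sqrt(2.0 / D) * np.maximum(X @ W.T, 0.0)

def ridge_fit_primal(Phi, Y, lam):
    # Solve (Phi^T Phi + lam*I) w = Phi^T Y in the D-dimensional feature
    # space; the |S| x |S| kernel matrix is never materialized, so the
    # cost is linear in the coreset size |S|.
    D = Phi.shape[1]
    return np.linalg.solve(Phi.T @ Phi + lam * np.eye(D), Phi.T @ Y)

# Toy usage with a hypothetical 50-point "coreset" (X_s, Y_s) and 200 query points.
rng = np.random.default_rng(0)
X_s, Y_s = rng.standard_normal((50, 32)), rng.standard_normal((50, 10))
X_t = rng.standard_normal((200, 32))
W = sample_feature_weights(d_in=32, n_features=2048, rng=rng)
w = ridge_fit_primal(relu_features(X_s, W), Y_s, lam=1e-3)
preds = relu_features(X_t, W) @ w  # predictions under the approximated kernel
```

Because the ridge solve works with the D x D matrix Phi^T Phi rather than the |S| x |S| kernel matrix, enlarging the coreset only adds rows to Phi, which is the source of the linear scaling the abstract refers to.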


Related research

Dataset Meta-Learning from Kernel Ridge-Regression (10/30/2020)
One of the most fundamental aspects of any machine learning algorithm is...

Dataset Distillation with Convexified Implicit Gradients (02/13/2023)
We propose a new dataset distillation algorithm using reparameterization...

On the Size and Approximation Error of Distilled Sets (05/23/2023)
Dataset Distillation is the task of synthesizing small datasets from lar...

Dataset Distillation Fixes Dataset Reconstruction Attacks (02/02/2023)
Modern deep learning requires large volumes of data, which could contain...

Kernel Distillation for Gaussian Processes (01/31/2018)
Gaussian processes (GPs) are flexible models that can capture complex st...

Kernel computations from large-scale random features obtained by Optical Processing Units (10/22/2019)
Approximating kernel functions with random features (RFs) has been a succ...

Kernel Regression with Infinite-Width Neural Networks on Millions of Examples (03/09/2023)
Neural kernels have drastically increased performance on diverse and non...
