Fast, Distribution-free Predictive Inference for Neural Networks with Coverage Guarantees

06/11/2023
by Yue Gao et al.

This paper introduces a novel, computationally efficient algorithm for predictive inference (PI) that requires no distributional assumptions on the data and can be computed faster than existing bootstrap-type methods for neural networks. Specifically, if there are n training samples, bootstrap-type methods require training a model on each of the n leave-one-out subsamples of size n-1; for large models like neural networks, this process can be computationally prohibitive. In contrast, our proposed method trains one neural network on the full dataset with (ϵ, δ)-differential privacy (DP) and then approximates each leave-one-out model efficiently using a linear approximation around the differentially private neural network estimate. With exchangeable data, we prove that our approach has a rigorous coverage guarantee that depends on the preset privacy parameters and the stability of the neural network, regardless of the data distribution. Simulations and experiments on real data demonstrate that our method satisfies the coverage guarantees with substantially reduced computation compared to bootstrap-type methods.
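To illustrate the core idea of replacing n retrainings with cheap leave-one-out approximations around a single full-data fit, here is a minimal sketch using an ordinary least-squares model as a stand-in for the (linearized) neural network. The DP training step is omitted; the rank-one leave-one-out update and the jackknife+-style interval construction are standard tools and are used here only as a hypothetical illustration, not as the paper's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 5
X = rng.normal(size=(n, d))
beta_true = rng.normal(size=d)
y = X @ beta_true + rng.normal(scale=0.5, size=n)

# Fit ONCE on the full data (stand-in for the single trained network;
# the paper additionally trains this with (eps, delta)-DP, omitted here).
G = X.T @ X
beta_hat = np.linalg.solve(G, X.T @ y)

# Approximate every leave-one-out fit with a closed-form rank-one update
# around the full-data estimate, instead of retraining n times.
# For least squares this update is exact:
#   beta_{-i} = beta_hat - G^{-1} x_i r_i / (1 - h_i),  r_i = y_i - x_i^T beta_hat
G_inv = np.linalg.inv(G)
h = np.einsum('ij,jk,ik->i', X, G_inv, X)        # leverages h_i
resid = y - X @ beta_hat
betas_loo = beta_hat[None, :] - (X @ G_inv) * (resid / (1 - h))[:, None]

# Jackknife+-style 90% prediction interval at a new point x0.
alpha = 0.1
x0 = rng.normal(size=d)
loo_resid = np.abs(y - np.einsum('ij,ij->i', X, betas_loo))  # R_i
preds = betas_loo @ x0                                        # mu_{-i}(x0)
lo = np.quantile(preds - loo_resid, alpha)       # lower endpoint
hi = np.quantile(preds + loo_resid, 1 - alpha)   # upper endpoint
print(f"interval: [{lo:.3f}, {hi:.3f}]")
```

The total cost is one model fit plus O(n) rank-one updates, rather than n full refits; for a neural network, the analogous step in the paper is a linear approximation around the differentially private full-data estimate.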

