Decidability of Sample Complexity of PAC Learning in finite setting

02/26/2020
by Alberto Gandolfi, et al.

In this short note we observe that the sample complexity of PAC machine learning of various concepts, including learning the maximum (EMX), can be exactly determined when the support of the probability measures considered as models satisfies an a priori bound. This result contrasts with the recently discovered undecidability of EMX within ZFC for finitely supported probabilities with no a priori bound on the support size. Unfortunately, the decision procedure is, at present, at least doubly exponential in the number of points times the uniform bound on the support size.
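
For orientation, here is a schematic form of the quantity in question, with notation chosen here for illustration rather than taken from the note itself: writing P_k for the family of probability measures whose support has at most k points, the sample complexity of EMX-learning a class F over P_k can be written as

\[
  m_{\mathcal{F}}(\epsilon,\delta;k)
  \;=\; \min\Bigl\{\, m \in \mathbb{N} \;:\; \exists\, A \;\; \forall P \in \mathcal{P}_k \;\;
  \Pr_{S \sim P^m}\!\Bigl[\, P\bigl(A(S)\bigr) \;\ge\; \sup_{F \in \mathcal{F}} P(F) - \epsilon \,\Bigr] \ge 1 - \delta \,\Bigr\},
\]

where A ranges over learning rules mapping samples to sets in F. The note's claim is that, once the bound k is fixed in advance, this value is exactly computable, whereas without such a bound the corresponding EMX question is independent of ZFC.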

Related research

07/02/2015  The Optimal Sample Complexity of PAC Learning
This work establishes a new upper bound on the number of samples suffici...

03/01/2019  From PAC to Instance-Optimal Sample Complexity in the Plackett-Luce Model
We consider PAC learning for identifying a good item from subset-wise sa...

02/25/2020  A short note on learning discrete distributions
The goal of this short note is to provide simple proofs for the "folklor...

05/12/2022  Sample Complexity Bounds for Robustly Learning Decision Lists against Evasion Attacks
A fundamental problem in adversarial machine learning is to quantify how...

10/05/2018  Sample Complexity of Sinkhorn divergences
Optimal transport (OT) and maximum mean discrepancies (MMD) are now rout...

08/22/2023  Consistency-Checking Problems: A Gateway to Parameterized Sample Complexity
Recently, Brand, Ganian and Simonov introduced a parameterized refinemen...

09/10/2021  PAC Mode Estimation using PPR Martingale Confidence Sequences
We consider the problem of correctly identifying the mode of a discrete ...
