Efficiently Learning Adversarially Robust Halfspaces with Noise

05/15/2020
by Omar Montasser et al.

We study the problem of learning adversarially robust halfspaces in the distribution-independent setting. In the realizable setting, we provide necessary and sufficient conditions on the adversarial perturbation sets under which halfspaces are efficiently robustly learnable. In the presence of random label noise, we give a simple computationally efficient algorithm for this problem with respect to any ℓ_p-perturbation.
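For intuition only (this is not the paper's algorithm), recall the standard characterization that underlies robust halfspace learning: a halfspace x ↦ sign(⟨w,x⟩ + b) is correct at (x, y) under every ℓ_p perturbation of radius γ exactly when y(⟨w,x⟩ + b) ≥ γ‖w‖_q, where q is the Hölder conjugate of p. The sketch below, with hypothetical helper names, evaluates this robust zero-one error.

```python
import numpy as np

def dual_exponent(p):
    """Return q with 1/p + 1/q = 1 (Hoelder conjugate of p)."""
    if p == 1:
        return np.inf
    if np.isinf(p):
        return 1.0
    return p / (p - 1.0)

def robust_err(w, b, X, y, gamma, p):
    """Fraction of points that some l_p perturbation of radius gamma can flip.

    By Hoelder's inequality, the worst-case perturbation lowers the margin
    y * (<w, x> + b) by exactly gamma * ||w||_q, so (x, y) is robustly
    correct iff y * (<w, x> + b) >= gamma * ||w||_q.
    """
    q = dual_exponent(p)
    slack = gamma * np.linalg.norm(w, ord=q)
    margins = y * (X @ w + b)
    return float(np.mean(margins < slack))

# Example: three labeled points in R^2, robustness to l_inf radius 0.1
X = np.array([[1.0, 0.5], [-0.8, -0.2], [0.3, -1.0]])
y = np.array([1, -1, -1])
print(robust_err(np.array([1.0, 0.0]), 0.0, X, y, gamma=0.1, p=np.inf))
```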
