Low-degree learning and the metric entropy of polynomials

03/17/2022
by Alexandros Eskenazis et al.

Let ℱ_{n,d} be the class of all functions f:{-1,1}^n→[-1,1] on the n-dimensional discrete hypercube of degree at most d. In the first part of this paper, we prove that any (deterministic or randomized) algorithm which learns ℱ_{n,d} with L_2-accuracy ε requires at least Ω((1-√ε) 2^d log n) queries for large enough n, thus establishing the sharpness as n→∞ of a recent upper bound of Eskenazis and Ivanisvili (2021). To do this, we show that the L_2-packing numbers 𝖬(ℱ_{n,d}, ‖·‖_{L_2}, ε) of the concept class ℱ_{n,d} satisfy the two-sided estimate

    c(1-ε) 2^d log n ≤ log 𝖬(ℱ_{n,d}, ‖·‖_{L_2}, ε) ≤ 2^{Cd} log n / ε^4

for large enough n, where c, C > 0 are universal constants. In the second part of the paper, we present a logarithmic upper bound for the randomized query complexity of classes of bounded approximate polynomials whose Fourier spectra are concentrated on few subsets. As an application, we prove new estimates for the number of random queries required to learn approximate juntas of a given degree, functions with rapidly decaying Fourier tails, and constant-depth circuits of given size. Finally, we obtain bounds for the number of queries required to learn the polynomial class ℱ_{n,d} without error in the query and random example models.
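To make the learning model concrete, below is a minimal Python sketch of the classical brute-force approach to low-degree learning from uniform random queries, in the spirit of Linial-Mansour-Nisan: estimate every Fourier coefficient f̂(S) with |S| ≤ d by empirical averaging and output the truncated polynomial. All names (chi, learn_low_degree, evaluate) are illustrative, and this is not the algorithm of Eskenazis and Ivanisvili; it is only a baseline under the assumption of query access to f on uniformly random points.

    # Illustrative sketch only: brute-force low-degree learning on {-1,1}^n.
    # Not the Eskenazis-Ivanisvili algorithm; names and parameters are hypothetical.
    import itertools
    import random

    def chi(S, x):
        """Fourier character chi_S(x): product of the coordinates of x indexed by S."""
        p = 1
        for i in S:
            p *= x[i]
        return p

    def learn_low_degree(f, n, d, m):
        """Estimate the Fourier coefficient of f at every S with |S| <= d
        from m uniform random queries; return a dict S -> estimate."""
        samples = [tuple(random.choice((-1, 1)) for _ in range(n)) for _ in range(m)]
        values = [f(x) for x in samples]
        coeffs = {}
        for k in range(d + 1):
            for S in itertools.combinations(range(n), k):
                # Empirical average of f(x) * chi_S(x) approximates the coefficient.
                coeffs[S] = sum(v * chi(S, x) for x, v in zip(samples, values)) / m
        return coeffs

    def evaluate(coeffs, x):
        """Evaluate the learned degree-<=d polynomial at a point x in {-1,1}^n."""
        return sum(c * chi(S, x) for S, c in coeffs.items())

    if __name__ == "__main__":
        # Toy example: learn the degree-2 function f(x) = x_0 * x_1 on {-1,1}^5.
        f = lambda x: x[0] * x[1]
        coeffs = learn_low_degree(f, n=5, d=2, m=2000)
        x = (1, -1, 1, 1, -1)
        print(evaluate(coeffs, x), f(x))  # the two values should be close

Note that this naive estimator spends queries on each of the roughly n^d coefficients of degree at most d, whereas the bounds in the abstract show that on the order of 2^d log n random queries already suffice (Eskenazis and Ivanisvili, 2021) and, by the lower bound of this paper, that many are necessary up to the dependence on ε.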
