Estimating a regression function in exponential families by model selection
Let X_1=(W_1,Y_1),…,X_n=(W_n,Y_n) be n independent pairs of random variables. We assume that, for each i∈{1,…,n}, the conditional distribution of Y_i given W_i belongs to a one-parameter exponential family with parameter γ^⋆(W_i)∈ℝ, or at least is close enough to a distribution of this form. The objective of the present paper is to estimate these conditional distributions on the basis of the observation X=(X_1,…,X_n). To do so, we propose a model selection procedure together with a non-asymptotic risk bound, with respect to a Hellinger-type distance, for the resulting estimator. When γ^⋆ does exist, the procedure allows us to obtain an estimator γ of γ^⋆ that is adaptive over a wide range of anisotropic Besov spaces. When γ^⋆ has a general additive or multiple-index structure, we construct suitable models and show that the estimators our procedure produces from such models can circumvent the curse of dimensionality. Moreover, we consider model selection problems for ReLU neural networks and provide an example where estimation based on neural networks enjoys a much faster convergence rate than estimation based on the classical models. Finally, we apply this procedure to solve the variable selection problem in exponential families. The proofs in the paper rely on bounding the VC dimensions of several collections of functions, which may be of independent interest.
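To make the statistical setting concrete, the following is a minimal sketch of the data-generating mechanism described above, assuming the Poisson family as the one-parameter exponential family and a hypothetical regression function `gamma_star`; all names and the specific choice of family are illustrative, not taken from the paper.

```python
import math
import random

def gamma_star(w):
    # Hypothetical smooth regression function gamma^*(w); the paper's
    # gamma^* is unknown and only assumed to lie in, e.g., a Besov class.
    return math.sin(2 * math.pi * w)

def sample_poisson(lam, rng):
    # Knuth's multiplication method for Poisson sampling (fine for small lam).
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def simulate(n, rng):
    # Each observed pair X_i = (W_i, Y_i): here W_i is uniform on [0, 1],
    # and Y_i | W_i ~ Poisson(exp(gamma_star(W_i))), a one-parameter
    # exponential family indexed by the value gamma_star(W_i).
    data = []
    for _ in range(n):
        w = rng.random()
        y = sample_poisson(math.exp(gamma_star(w)), rng)
        data.append((w, y))
    return data

rng = random.Random(0)
data = simulate(5, rng)
print(data)
```

The estimation task is then to recover the conditional distributions (equivalently, γ^⋆ when it exists) from such a sample, which the paper addresses by selecting among candidate models with a penalized criterion.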