Group-regularized ridge regression via empirical Bayes noise level cross-validation

10/29/2020
by Nikolaos Ignatiadis et al.

Features in predictive models are not exchangeable, yet common supervised models treat them as such. Here we study ridge regression when the analyst can partition the features into K groups based on external side information. For example, in high-throughput biology, features may represent gene expression, protein abundance, or clinical data, so that each feature group represents a distinct modality. The analyst's goal is to choose optimal regularization parameters λ = (λ_1, …, λ_K), one for each group. In this work, we study the impact of λ on the predictive risk of group-regularized ridge regression by deriving limiting risk formulae under a high-dimensional random effects model with p ≍ n as n → ∞. Furthermore, we propose a data-driven method for choosing λ that attains the optimal asymptotic risk: the key idea is to interpret the residual noise variance σ^2 as a regularization parameter to be chosen through cross-validation. An empirical Bayes construction maps the one-dimensional parameter σ to the K-dimensional vector of regularization parameters, i.e., σ ↦ λ(σ). Beyond its theoretical optimality, the proposed method is practical and runs as fast as cross-validated ridge regression without feature groups (K = 1).
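To make the σ ↦ λ(σ) idea concrete, below is a minimal NumPy sketch, not the authors' implementation. The function names (`group_ridge`, `eb_lambda_map`, `cv_sigma`), the penalty scaling, and the moment-based estimate of per-group signal variance are illustrative assumptions; in particular, `eb_lambda_map` assumes an idealized design with X_k'X_k ≈ n·I within each group and orthogonality across groups, under which the Bayes-optimal penalty is λ_k = σ^2 / τ_k^2. The point is only to show the structure of the method: a one-dimensional cross-validation over σ, with the K penalties derived from σ by an empirical Bayes map.

```python
import numpy as np

def group_ridge(X, y, lam_per_group, groups):
    """Group-regularized ridge: argmin_b ||y - X b||^2 + sum_k lam_k ||b_k||^2.

    groups: length-p integer array mapping each feature to its group 0..K-1.
    """
    lam_vec = lam_per_group[groups]      # expand one lambda per feature
    A = X.T @ X + np.diag(lam_vec)       # normal equations: (X'X + diag(lam)) b = X'y
    return np.linalg.solve(A, X.T @ y)

def eb_lambda_map(X, y, groups, sigma, K):
    """Hypothetical empirical Bayes map sigma -> lambda(sigma).

    Moment-based sketch: if b_j ~ N(0, tau_k^2) within group k and the design is
    idealized (X_k'X_k ≈ n I, groups orthogonal), then
        E||X_k' y||^2 ≈ n^2 p_k tau_k^2 + n p_k sigma^2,
    so tau_k^2 has a method-of-moments estimate and lambda_k = sigma^2 / tau_k^2.
    """
    n = X.shape[0]
    lam = np.empty(K)
    for k in range(K):
        Xk = X[:, groups == k]
        pk = Xk.shape[1]
        s = np.sum((Xk.T @ y) ** 2)
        tau2_hat = max((s / n**2 - pk * sigma**2 / n) / pk, 1e-12)
        lam[k] = sigma**2 / tau2_hat
    return lam

def cv_sigma(X, y, groups, K, sigma_grid, n_folds=5, seed=0):
    """Cross-validate the scalar noise level sigma; return (best_sigma, best_lambda)."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    folds = rng.permutation(n) % n_folds
    errs = np.zeros(len(sigma_grid))
    for i, sigma in enumerate(sigma_grid):
        for f in range(n_folds):
            tr, te = folds != f, folds == f
            lam = eb_lambda_map(X[tr], y[tr], groups, sigma, K)
            b = group_ridge(X[tr], y[tr], lam, groups)
            errs[i] += np.mean((y[te] - X[te] @ b) ** 2)
    best = sigma_grid[np.argmin(errs)]
    return best, eb_lambda_map(X, y, groups, best, K)

# Example usage (hypothetical data):
#   sigma_hat, lam_hat = cv_sigma(X, y, groups, K=3,
#                                 sigma_grid=np.linspace(0.1, 2.0, 20))
```

Note the computational point from the abstract: the cross-validation loop searches over a single scalar σ regardless of K, so the cost matches ordinary cross-validated ridge rather than a K-dimensional grid search over (λ_1, …, λ_K).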
