Statistical Learning under Heterogeneous Distribution Shift

02/27/2023 ∙ by Max Simchowitz, et al.

This paper studies the prediction of a target z from a pair of random variables (x, y), where the ground-truth predictor is additive: E[z | x, y] = f_⋆(x) + g_⋆(y). We study the performance of empirical risk minimization (ERM) over functions f + g, with f ∈ F and g ∈ G, fit on a given training distribution but evaluated on a test distribution which exhibits covariate shift. We show that, when the class F is "simpler" than G (measured, e.g., in terms of its metric entropy), our predictor is more resilient to heterogeneous covariate shifts in which the shift in x is much greater than that in y. These results rely on a novel Hölder-style inequality for the Dudley integral which may be of independent interest. Moreover, we corroborate our theoretical findings with experiments demonstrating improved resilience to shifts in "simpler" features across numerous domains.
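As a rough illustration of the setup (not taken from the paper), the sketch below fits an additive predictor by least-squares ERM and evaluates it under a heterogeneous covariate shift. All choices here are illustrative assumptions: F is a simple linear class in x, G is a richer polynomial/trigonometric feature class in y, the data are synthetic Gaussians, and the test distribution shifts x far more than y.

# Illustrative sketch of additive ERM under heterogeneous covariate shift.
# The function classes, distributions, and shift magnitudes are assumptions
# made for this example only, not the paper's experimental setup.
import numpy as np

rng = np.random.default_rng(0)

def f_star(x):
    # Simple ground-truth component in x.
    return 2.0 * x

def g_star(y):
    # Richer ground-truth component in y.
    return np.sin(3.0 * y) + 0.5 * y**2

def sample(n, x_scale, y_scale):
    # Draw (x, y, z) with z = f_star(x) + g_star(y) + noise.
    x = rng.normal(0.0, x_scale, n)
    y = rng.normal(0.0, y_scale, n)
    z = f_star(x) + g_star(y) + 0.1 * rng.normal(size=n)
    return x, y, z

def features(x, y):
    # Class F: linear in x (first column).  Class G: polynomial and
    # trigonometric features in y (remaining columns).
    return np.column_stack([x, y, y**2, y**3, np.sin(3.0 * y)])

# Fit f + g by ERM (least squares) on the training distribution.
x_tr, y_tr, z_tr = sample(2000, x_scale=1.0, y_scale=1.0)
w, *_ = np.linalg.lstsq(features(x_tr, y_tr), z_tr, rcond=None)

# Evaluate under a heterogeneous shift: x shifts much more than y.
x_te, y_te, z_te = sample(2000, x_scale=3.0, y_scale=1.1)
pred = features(x_te, y_te) @ w
print("test MSE under heterogeneous shift:", np.mean((pred - z_te) ** 2))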
