Multiple Descent: Design Your Own Generalization Curve

08/03/2020
by Lin Chen, et al.

This paper explores the generalization loss of linear regression in variably parameterized families of models, both under-parameterized and over-parameterized. We show that the generalization curve can have an arbitrary number of peaks, and moreover, that the locations of those peaks can be explicitly controlled. Our results highlight the fact that both the classical U-shaped generalization curve and the recently observed double descent curve are not intrinsic properties of the model family. Instead, their emergence is due to the interaction between the properties of the data and the inductive biases of learning algorithms.
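The abstract's claim is easiest to appreciate against the simpler double-descent baseline it generalizes. The following NumPy sketch is a minimal illustration, not the paper's construction: it fits minimum-norm least squares using only the first p coordinates of a 400-dimensional linear problem with 50 training samples, so test error typically rises to a peak near the interpolation threshold p = 50 and descends again beyond it. All problem sizes and the noise level are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

n_train, n_test, d = 50, 1000, 400            # train size, test size, ambient dimension
X_train = rng.standard_normal((n_train, d))
X_test = rng.standard_normal((n_test, d))
w_star = rng.standard_normal(d) / np.sqrt(d)  # ground-truth linear model
noise = 0.5
y_train = X_train @ w_star + noise * rng.standard_normal(n_train)
y_test = X_test @ w_star + noise * rng.standard_normal(n_test)

for p in [5, 10, 25, 45, 50, 55, 75, 150, 300, 400]:
    # Fit least squares on the first p features only. np.linalg.pinv
    # returns the minimum-norm solution, which is what matters in the
    # over-parameterized regime (p > n_train).
    w_hat = np.linalg.pinv(X_train[:, :p]) @ y_train
    test_mse = np.mean((X_test[:, :p] @ w_hat - y_test) ** 2)
    print(f"p = {p:4d}  test MSE = {test_mse:.3f}")
```

The pseudoinverse here is the inductive bias the abstract refers to: for p > 50 there are infinitely many interpolating solutions, and selecting the minimum-norm one is what produces the second descent. The paper's point is that, by varying how the data interact with such a bias, the number and locations of the peaks in the resulting curve can be designed rather than being fixed at the single interpolation threshold shown in this sketch.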


Related research

08/14/2019 · The generalization error of random features regression: Precise asymptotics and double descent curve
Deep learning methods operate in regimes that defy the traditional stati...

12/28/2018 · Reconciling modern machine learning and the bias-variance trade-off
The question of generalization in machine learning---how algorithms are ...

03/18/2019 · Two models of double descent for weak features
The "double descent" risk curve was recently proposed to qualitatively d...

12/11/2020 · Beyond Occam's Razor in System Identification: Double-Descent when Modeling Dynamics
System identification aims to build models of dynamical systems from dat...

06/08/2023 · SGLD-Based Information Criteria and the Over-Parameterized Regime
Double-descent refers to the unexpected drop in test loss of a learning ...

08/17/2022 · Superior generalization of smaller models in the presence of significant label noise
The benefits of over-parameterization in achieving superior generalizati...

09/25/2019 · Benefit of Interpolation in Nearest Neighbor Algorithms
The over-parameterized models attract much attention in the era of data ...
