ISL Colloquium


Two vignettes about interpolation and generalization in overparameterized models

Niladri Chatterji – Postdoc, Stanford

Thu, 28-Apr-2022 / 4:00pm / Packard 101

Abstract

There has been a surge of recent interest in the generalization ability of interpolating overparameterized models trained with first-order optimization methods. In this talk, I shall present two research vignettes on this topic.

First, I shall present results about the generalization error of two-layer neural networks trained to interpolation by gradient descent on the logistic loss following random initialization. We assume the data comes from well-separated class-conditional log-concave distributions and allow for a constant fraction of the training labels to be corrupted by an adversary. We show that in this setting, neural networks exhibit benign overfitting: they can be driven to zero training error, perfectly fitting any noisy training labels, and simultaneously achieve test error close to the Bayes-optimal error.
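As a hypothetical toy illustration of this setting (not the paper's construction — the dimensions, separation, noise rate, and learning rate below are all illustrative choices), the sketch draws data from two well-separated Gaussian clusters, flips a fraction of the training labels, and runs gradient descent on the logistic loss for a randomly initialized two-layer ReLU network with fixed random second-layer signs:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, m = 50, 20, 100                # samples, input dim, hidden width (toy sizes)
mu = np.full(d, 4.0 / np.sqrt(d))    # class mean; ||mu|| = 4, so classes are well separated

def sample(k):
    """Class-conditional Gaussian data: x = y * mu + standard Gaussian noise."""
    y = rng.choice([-1.0, 1.0], size=k)
    X = y[:, None] * mu + rng.standard_normal((k, d))
    return X, y

X, y = sample(n)
y_train = y.copy()
y_train[rng.choice(n, size=n // 10, replace=False)] *= -1.0  # corrupt 10% of labels

W = rng.standard_normal((m, d)) / np.sqrt(d)       # random first-layer init
a = rng.choice([-1.0, 1.0], size=m) / np.sqrt(m)   # fixed second-layer signs

def forward(X, W):
    return np.maximum(X @ W.T, 0.0) @ a

lr = 1.0
for step in range(3000):  # gradient descent on the logistic loss, training W only
    H = X @ W.T
    R = np.maximum(H, 0.0)
    f = R @ a
    g = -y_train / (1.0 + np.exp(y_train * f)) / n  # dLoss/df for logistic loss
    dH = (g[:, None] * a[None, :]) * (H > 0)
    W -= lr * dH.T @ X

# With enough steps the network can fit even the flipped labels, yet test
# accuracy on clean data stays close to the Bayes-optimal level.
train_acc = np.mean(np.sign(forward(X, W)) == y_train)
Xte, yte = sample(2000)
test_acc = np.mean(np.sign(forward(Xte, W)) == yte)
print(f"train acc on noisy labels: {train_acc:.2f}, test acc: {test_acc:.3f}")
```

Training accuracy is measured against the corrupted labels, so values near 1.0 mean the network is fitting the noise, while the test accuracy is measured against clean labels.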

Second, I shall present lower bounds on the excess risk of sparse interpolating procedures for linear regression with Gaussian data in the overparameterized regime. This result shows that the excess risk of basis pursuit (the minimum L1-norm interpolant) can converge at an exponentially slower rate than that of OLS (the minimum L2-norm interpolant), even when the ground truth is sparse. Our analysis exposes the benefit of an effect analogous to the “wisdom of the crowd”: here, the harm arising from fitting the noise is ameliorated by spreading it among many directions.
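The two interpolants can be computed directly in a hypothetical toy instance (sizes, sparsity, and noise level below are illustrative, and this sketch only exhibits the two estimators, not the lower-bound regime from the talk). The minimum L2-norm interpolant is given by the pseudoinverse; the minimum L1-norm interpolant (basis pursuit) is obtained by the standard linear-programming reformulation with `scipy.optimize.linprog`:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
n, d, s = 40, 200, 3                  # samples < features; s-sparse ground truth
theta_star = np.zeros(d)
theta_star[:s] = 1.0
X = rng.standard_normal((n, d))       # isotropic Gaussian design
y = X @ theta_star + 0.5 * rng.standard_normal(n)

# Minimum L2-norm interpolant (OLS in the overparameterized regime)
theta_l2 = np.linalg.pinv(X) @ y

# Minimum L1-norm interpolant (basis pursuit) as an LP:
#   write theta = u - v with u, v >= 0; minimize 1^T(u + v) s.t. X(u - v) = y
res = linprog(c=np.ones(2 * d),
              A_eq=np.hstack([X, -X]), b_eq=y,
              bounds=(0, None), method="highs")
theta_l1 = res.x[:d] - res.x[d:]

# For isotropic Gaussian features, the excess risk is ||theta_hat - theta*||^2.
for name, th in [("min-L2", theta_l2), ("min-L1", theta_l1)]:
    print(f"{name}: train residual {np.linalg.norm(X @ th - y):.2e}, "
          f"excess risk {np.linalg.norm(th - theta_star) ** 2:.3f}")
```

Both estimators fit the training data exactly (residuals at numerical precision); the talk's result concerns how their excess risks compare as the problem dimensions grow.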

This talk is based on joint work with wonderful co-authors Peter Bartlett, Spencer Frei, and Philip Long.

Bio

Niladri S. Chatterji is a Stanford SAIL postdoctoral researcher in the Computer Science department, advised by Tatsunori Hashimoto and Percy Liang. His work spans the fields of statistical learning theory, optimization, and online learning. He previously received his PhD in Physics at the University of California, Berkeley under the watchful eye of Peter Bartlett. Before that, he received his undergraduate degree in Engineering Physics from the Indian Institute of Technology Bombay.