ISL Colloquium


Leveraging 'partial' smoothness for faster convergence in nonsmooth optimization

Damek Davis – Associate Professor, Cornell

Thu, 3-Nov-2022 / 4:00pm / Packard 202

Abstract

First-order methods in nonsmooth optimization are often described as “slow.” I will present two (locally) accelerated first-order methods that violate this perception: a superlinearly convergent method for solving nonsmooth equations, and a linearly convergent method for solving “generic” nonsmooth optimization problems. The key insight in both cases is that nonsmooth functions are often “partially” smooth in useful ways.
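A toy sketch (my own illustration, not code from the talk) of the kind of "partial" smoothness the abstract alludes to: the function f(x) = |x[0]| + (x[1] - 1)^2 is nonsmooth, yet restricted to the manifold {x[0] = 0} it is a smooth quadratic. A proximal gradient method identifies that manifold after finitely many steps and then behaves like a fast smooth method along it.

```python
import numpy as np

def prox_abs(v, t):
    """Proximal operator of t*|.| (soft-thresholding) for a scalar v."""
    return np.sign(v) * max(abs(v) - t, 0.0)

def proximal_gradient(x0, step=0.4, iters=30):
    """Proximal gradient on f(x) = |x[0]| + (x[1] - 1)^2.

    The step size 0.4 is below 1/L for the smooth part (L = 2).
    """
    x = np.array(x0, dtype=float)
    for k in range(iters):
        # Gradient step on the smooth part (x[1] - 1)^2.
        grad = np.array([0.0, 2.0 * (x[1] - 1.0)])
        y = x - step * grad
        # Proximal step on the nonsmooth part |x[0]|.
        x = np.array([prox_abs(y[0], step), y[1]])
        print(k, x)
    return x

# After a few iterations x[0] hits exactly 0 (the active manifold is identified);
# from then on the iterates converge rapidly along the smooth manifold to (0, 1).
proximal_gradient([2.0, -3.0])
```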

Bio

Damek Davis is an Associate Professor of Operations Research at Cornell University. His research focuses on the interplay of optimization, signal processing, statistics, and machine learning. He has received several awards for his work, including a Sloan Research Fellowship in Mathematics (2020), the INFORMS Optimization Society Young Researchers Prize (2019), and an NSF CAREER Award (2021).