Poster Session 2 West
Wednesday, December 11, 2024 4:30 PM → 7:30 PM
Poster #5904

Nesterov acceleration despite very noisy gradients

Kanan Gupta, Jonathan W. Siegel, Stephan Wojtowytsch

Abstract

We present a generalization of Nesterov's accelerated gradient descent algorithm. Our algorithm (AGNES) provably achieves acceleration for smooth convex and strongly convex minimization tasks with noisy gradient estimates if the noise intensity is proportional to the magnitude of the gradient at every point. Nesterov's method converges at an accelerated rate if the constant of proportionality is below 1, while AGNES accommodates any signal-to-noise ratio. The noise model is motivated by applications in overparametrized machine learning. AGNES requires only two parameters for convex and three for strongly convex minimization tasks, improving on existing methods. We further provide clear geometric interpretations and heuristics for the choice of parameters.
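Since the abstract does not spell out the update rule, the following is a minimal sketch in the spirit of the method: a Nesterov-style iteration in which the velocity update uses a second step size, decoupled from the gradient step. The function name agnes_sketch, the parameter names alpha, eta, rho, and the precise update form are illustrative assumptions, not the algorithm from the paper; the gradient oracle in the usage example instantiates the multiplicative noise model described in the abstract, with noise intensity proportional to the gradient magnitude.

```python
import numpy as np

def agnes_sketch(grad, x0, alpha=0.01, eta=0.03, rho=0.9, n_steps=1000):
    """Nesterov-style momentum with a decoupled momentum step size.

    `alpha` scales the gradient step, `eta` scales the velocity update,
    and `rho` is the momentum parameter; setting eta = alpha recovers
    the classical Nesterov recursion. The update form and parameter
    names are illustrative assumptions, not the paper's exact algorithm.
    """
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)
    for _ in range(n_steps):
        y = x + rho * v          # look-ahead point, as in Nesterov's method
        g = grad(y)              # (possibly noisy) gradient estimate at y
        x = y - alpha * g        # gradient step with step size alpha
        v = rho * v - eta * g    # velocity update with separate step size eta
    return x

# Usage: minimize f(x) = 0.5 * ||x||^2 under multiplicative gradient noise
# whose intensity is twice the gradient magnitude, i.e. a constant of
# proportionality above 1 in the abstract's noise model.
rng = np.random.default_rng(0)
noisy_grad = lambda x: x * (1.0 + 2.0 * rng.standard_normal(x.shape))
x_final = agnes_sketch(noisy_grad, x0=np.ones(10))
print(np.linalg.norm(x_final))
```

The extra degree of freedom is the point of the sketch: with eta = alpha the loop reduces to standard Nesterov momentum, while choosing the two step sizes separately allows the gradient step to be shortened in response to noise without giving up the momentum dynamics.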