Poster Session 4 · Thursday, December 4, 2025 4:30 PM → 7:30 PM
#707

Convergence of Clipped SGD on Convex $(L_0, L_1)$-Smooth Functions

NeurIPS OpenReview

Abstract

We study stochastic gradient descent (SGD) with gradient clipping on convex functions under a generalized smoothness assumption called $(L_0, L_1)$-smoothness.
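The abstract does not restate the condition; in the literature that introduced it (Zhang et al., 2020, in the context of gradient clipping), it is commonly given for twice-differentiable $f$ as the following bound, though the paper may work with an equivalent gradient-based variant:

```latex
% (L_0, L_1)-smoothness as commonly stated (Zhang et al., 2020):
% the Hessian norm is allowed to grow with the gradient norm.
\[
  \|\nabla^2 f(x)\| \;\le\; L_0 + L_1 \,\|\nabla f(x)\|
  \quad \text{for all } x .
\]
```

Setting $L_1 = 0$ recovers standard $L_0$-smoothness, so this strictly generalizes the usual smooth setting.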
Using gradient clipping, we establish a high-probability convergence rate that matches the SGD rate in the smooth case up to polylogarithmic factors and additive terms. We also propose a variant of adaptive SGD with gradient clipping that achieves the same guarantee.
We perform experiments to examine our theory and algorithmic choices.
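As a rough sketch of the algorithm family studied here (not the paper's exact method, step-size schedule, or clipping rule), clipped SGD rescales each stochastic gradient to a norm cap before the usual update. The hyperparameters `lr` and `clip` and the toy objective below are illustrative choices only:

```python
import numpy as np

def clipped_sgd(grad_fn, x0, lr=0.1, clip=1.0, n_steps=100, seed=0):
    """Minimal clipped-SGD sketch: rescale each stochastic gradient
    so its norm never exceeds the threshold `clip`, then take an
    ordinary SGD step. Hyperparameters are illustrative."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        g = grad_fn(x, rng)                                   # stochastic gradient
        g = g * min(1.0, clip / (np.linalg.norm(g) + 1e-12))  # clip to norm <= clip
        x = x - lr * g                                        # SGD update
    return x

# Toy usage: noisy gradients of f(x) = 0.5 * ||x||^2.
if __name__ == "__main__":
    grad = lambda x, rng: x + 0.1 * rng.standard_normal(x.shape)
    print(clipped_sgd(grad, x0=np.ones(5)))
```

The clipping factor `min(1, clip / ||g||)` leaves small gradients untouched and only shrinks large ones, which is what makes the method robust when the effective smoothness constant grows with the gradient norm.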