Poster Session 3 · Thursday, December 4, 2025 11:00 AM → 2:00 PM
#702
Stochastic Momentum Methods for Non-smooth Non-Convex Finite-Sum Coupled Compositional Optimization
Abstract
Finite-sum Coupled Compositional Optimization (FCCO), characterized by its coupled compositional objective structure, emerges as an important optimization paradigm for addressing a wide range of machine learning problems. In this paper, we focus on a challenging class of non-convex non-smooth FCCO, where the outer functions are non-smooth weakly convex or convex and the inner functions are smooth or weakly convex.
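Concretely, FCCO objectives take the coupled compositional form below (the notation is ours, since the abstract does not fix it):

```latex
\min_{w \in \mathbb{R}^d} \; F(w) = \frac{1}{n} \sum_{i=1}^{n} f_i\big(g_i(w)\big),
\qquad g_i(w) = \mathbb{E}_{\xi_i}\big[g_i(w;\xi_i)\big],
```

where each outer function $f_i$ is non-smooth weakly convex or convex and each inner function $g_i$ is smooth or weakly convex, accessible only through stochastic samples $\xi_i$.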
Existing state-of-the-art results face two key limitations:
- a high iteration complexity of under the assumption that the stochastic inner functions are Lipschitz continuous in expectation;
- reliance on vanilla SGD-type updates, which are not suitable for deep learning applications.
Our main contributions are twofold:
- We propose stochastic momentum methods tailored for non-smooth FCCO that come with provable convergence guarantees;
- We establish a new state-of-the-art iteration complexity of .
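A common algorithmic template in the FCCO literature pairs a moving-average estimator of each inner function value with a momentum (moving-average) gradient update. The sketch below illustrates that template on a toy quadratic instance; it is a generic illustration under assumed oracles (`inner_val`, `jac_inner`, `grad_outer` are hypothetical stand-ins), not the paper's exact method:

```python
import numpy as np

class ToyOracles:
    """Toy FCCO instance: inner g_i(w) = A_i @ w, outer f_i(u) = 0.5 * ||u||^2."""
    def __init__(self, A):
        self.A = A                      # A[i] is the matrix defining g_i

    def inner_val(self, i, w):          # (stochastic) estimate of g_i(w)
        return self.A[i] @ w

    def jac_inner(self, i, w):          # Jacobian of g_i at w
        return self.A[i]

    def grad_outer(self, i, u):         # (sub)gradient of f_i at u
        return u

def fcco_momentum_step(w, u_i, v, i, oracles, gamma=0.5, beta=0.1, lr=0.01):
    """One update for a sampled block i (illustrative sketch, not the paper's algorithm).

    w   : parameter vector
    u_i : moving-average estimate of the inner value g_i(w)
    v   : momentum buffer for the gradient estimate
    """
    g_hat = oracles.inner_val(i, w)
    u_i = (1.0 - gamma) * u_i + gamma * g_hat   # track the inner function value
    # chain-rule gradient estimate: J_{g_i}(w)^T * f_i'(u_i), using the tracked u_i
    grad = oracles.jac_inner(i, w).T @ oracles.grad_outer(i, u_i)
    v = (1.0 - beta) * v + beta * grad          # momentum (moving-average) update
    w = w - lr * v
    return w, u_i, v
```

The moving-average estimate `u_i` avoids re-evaluating the full inner expectation at every step, which is what makes the compositional structure tractable with small batches.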
Moreover, we apply our algorithms to non-convex optimization problems with multiple functional inequality constraints, where the constraint functions are smooth or weakly convex. By optimizing a smoothed hinge-penalty formulation, we achieve a new state-of-the-art complexity of for finding an (nearly) -level KKT solution. Experiments on three tasks demonstrate the effectiveness of the proposed algorithms.
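The hinge-penalty reformulation referenced above can be sketched as follows (a generic illustration; the precise formulation and the smoothing of the hinge are in the paper, and $\beta$ denotes an assumed penalty parameter):

```latex
\min_{w}\ f(w) \quad \text{s.t.} \quad c_i(w) \le 0,\ i = 1,\dots,m
\quad\longrightarrow\quad
\min_{w}\ f(w) + \frac{\beta}{m} \sum_{i=1}^{m} \max\{0,\, c_i(w)\},
```

where the penalty term is itself an FCCO objective with non-smooth convex outer function $\max\{0,\cdot\}$, which is then replaced by a smooth approximation.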