Assistant Professor, Texas A&M University - College Station
2 papers at NeurIPS 2025
An algorithm with a theoretically guaranteed convergence rate for optimizing a preference function over the Pareto set of a given set of objectives.
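As a rough illustration of the problem setting (not the paper's algorithm), the sketch below scalarizes a toy bi-objective problem with a weight w, recovers a Pareto-optimal point in closed form, and searches over w to maximize a preference function; the objectives, the preference, and the grid search are all made-up examples.

```python
import numpy as np

# Toy bi-objective problem: f1(x) = ||x - a||^2, f2(x) = ||x - b||^2.
# For these convex quadratics, linear scalarization w*f1 + (1-w)*f2
# traces the whole Pareto set as w ranges over [0, 1]; the minimizer
# is the convex combination x*(w) = w*a + (1-w)*b.
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])

def pareto_point(w):
    """Closed-form minimizer of w*f1 + (1-w)*f2 (a Pareto-optimal point)."""
    return w * a + (1.0 - w) * b

def objectives(x):
    return np.array([np.sum((x - a) ** 2), np.sum((x - b) ** 2)])

def preference(obj):
    """Hypothetical preference over objective values (higher is better):
    favors balanced trade-offs between the two objectives."""
    return -max(obj)

# Crude outer grid search over the scalarization weight; the paper's
# contribution is a method with a guaranteed rate for this outer problem.
ws = np.linspace(0.0, 1.0, 201)
best_w = max(ws, key=lambda w: preference(objectives(pareto_point(w))))
print("best weight:", best_w, "point:", pareto_point(best_w))
```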
We propose a new class of vector-transport-based gradient descent that enables silver-stepsize acceleration on Riemannian manifolds, yielding provably accelerated gradient methods for optimizing potential functionals in Wasserstein space.
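The silver stepsize schedule itself is concrete enough to sketch. Below is a minimal Euclidean illustration on a convex quadratic, assuming one standard presentation of the recursive construction with silver ratio ρ = 1 + √2 from Altschuler and Parrilo's stepsize-hedging line of work; the paper's actual contribution, carrying such schedules to Riemannian manifolds via vector transport, is not attempted here, and the objective, dimension, and horizon are illustrative choices.

```python
import numpy as np

RHO = 1.0 + np.sqrt(2.0)  # the silver ratio

def silver_schedule(k):
    """Silver stepsize schedule of length 2**k - 1, in multiples of 1/L.
    One standard presentation of the recursion: h^(1) = [sqrt(2)],
    h^(j+1) = h^(j) ++ [1 + RHO**(j-1)] ++ h^(j)."""
    h = [np.sqrt(2.0)]
    for j in range(1, k):
        h = h + [1.0 + RHO ** (j - 1)] + h
    return h

# Illustrative smooth convex objective: a random PSD quadratic
# f(x) = 0.5 * x^T A x, with smoothness constant L = lambda_max(A).
rng = np.random.default_rng(0)
M = rng.standard_normal((20, 20))
A = M @ M.T
L = np.linalg.norm(A, 2)  # spectral norm = largest eigenvalue of A

x = rng.standard_normal(20)
for h in silver_schedule(5):      # 2**5 - 1 = 31 gradient steps
    x = x - (h / L) * (A @ x)     # gradient of f is A x
print("f(x) after the silver schedule:", 0.5 * x @ A @ x)
```

Note that several steps in the schedule exceed the classical 2/L stability threshold; the interleaved short steps are what keep the overall product contractive, which is the source of the accelerated O(1/n^{log2 ρ}) rate.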