2 papers across 2 sessions
We propose a new class of vector-transport-based gradient descent enabling silver-stepsize acceleration on Riemannian manifolds, yielding provably accelerated gradient methods for optimizing potential functionals in Wasserstein space.
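For context, the silver stepsize schedule can be sketched in the plain Euclidean setting. The sketch below is an illustration under assumptions, not the paper's method: it assumes the closed form h_t = 1 + ρ^(ν(t)−1) with ρ = 1 + √2 the silver ratio and ν(t) the 2-adic valuation of t, and uses a toy quadratic objective; the Riemannian version described in the abstract would additionally replace the subtraction update with an exponential map and move gradients between tangent spaces by vector transport.

```python
import numpy as np

SILVER = 1 + np.sqrt(2)  # the silver ratio, rho = 1 + sqrt(2)

def silver_stepsizes(n):
    """Assumed closed form of the silver schedule:
    h_t = 1 + rho**(nu(t) - 1), nu(t) = 2-adic valuation of t."""
    def nu(t):
        v = 0
        while t % 2 == 0:
            t //= 2
            v += 1
        return v
    return [1 + SILVER ** (nu(t) - 1) for t in range(1, n + 1)]

def silver_gd(grad, x0, L, n_steps):
    """Euclidean gradient descent x <- x - (h_t / L) * grad(x).
    Individual long steps can overshoot; the schedule is designed so
    that the error contracts over the full block of steps."""
    x = np.asarray(x0, dtype=float)
    for h in silver_stepsizes(n_steps):
        x = x - (h / L) * grad(x)
    return x

# Toy usage: minimize the smooth convex quadratic f(x) = 0.5 * x^T A x,
# which is L-smooth with L = 10 for the matrix below.
A = np.diag([1.0, 10.0])
x_star = silver_gd(lambda x: A @ x, x0=[5.0, 5.0], L=10.0, n_steps=32)
```

The schedule's first few stepsizes are √2, 2, √2, 2 + √2, ..., i.e. mostly short steps interleaved with occasional long steps well beyond the classical 2/L limit.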
We propose FGBoost, a gradient boosting framework designed to intrinsically model complex regression relationships whose outputs live in geodesic metric spaces rather than Euclidean space.