Postdoc, University of California, Davis
Two papers at NeurIPS 2025
We propose a novel transfer learning framework for regression whose outputs are probability distributions in the Wasserstein space.
We propose FGBoost, a gradient boosting framework that intrinsically models complex regression relationships with non-Euclidean outputs in geodesic metric spaces.