Full Professor, Southeast University
2 papers at NeurIPS 2025
We construct a family of smooth convex losses with linear surrogate regret bounds via infimal convolution and Fenchel–Young losses, overcoming the smoothness–regret trade-off for arbitrary discrete target losses.
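As a point of reference for the construction above, the standard Fenchel–Young loss is L_Ω(θ; y) = Ω*(θ) + Ω(y) − ⟨θ, y⟩; with Ω chosen as the negative Shannon entropy on the simplex, Ω* is log-sum-exp and the loss reduces to the logistic (cross-entropy) loss on one-hot targets. The sketch below illustrates only this textbook special case, not the paper's infimal-convolution construction; all function names are illustrative.

```python
import numpy as np

def neg_entropy(p):
    # Ω(p) = Σ_i p_i log p_i on the probability simplex (0 log 0 := 0).
    p = np.asarray(p, dtype=float)
    return float(np.sum(np.where(p > 0, p * np.log(np.clip(p, 1e-300, None)), 0.0)))

def logsumexp(theta):
    # Ω*(θ) = log Σ_i exp(θ_i), the convex conjugate of negative entropy.
    m = theta.max()
    return float(m + np.log(np.exp(theta - m).sum()))

def fenchel_young_loss(theta, y):
    # L_Ω(θ; y) = Ω*(θ) + Ω(y) − ⟨θ, y⟩.
    # For a one-hot y this equals logsumexp(θ) − θ_k, i.e. cross-entropy.
    return logsumexp(theta) + neg_entropy(y) - float(theta @ y)
```

By Fenchel's equality, the loss vanishes exactly when y = softmax(θ), which is what makes Fenchel–Young losses convenient surrogates for discrete targets.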