2 papers across 2 sessions
We construct a family of smooth convex losses with linear surrogate regret bounds via infimal convolution and Fenchel–Young losses, overcoming the smoothness–regret trade-off for arbitrary discrete target losses.
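As background for the construction above, the standard definitions of the two ingredients (a sketch of the textbook notions, not the paper's specific family; $\Omega$, $f$, $g$ are generic placeholders):

```latex
% Infimal convolution of convex functions f and g:
(f \,\square\, g)(x) \;=\; \inf_{y}\,\bigl\{\, f(y) + g(x - y) \,\bigr\},
% which is smooth when g is smooth (e.g., the Moreau envelope,
% g = \tfrac{1}{2\mu}\|\cdot\|^2, smooths f while preserving convexity).

% Fenchel–Young loss generated by a convex regularizer \Omega:
L_{\Omega}(\theta;\, y) \;=\; \Omega^{*}(\theta) + \Omega(y) - \langle \theta, y \rangle,
% where \Omega^{*} is the convex conjugate; L_{\Omega} \ge 0 with equality
% iff \theta attains the supremum defining \Omega^{*} at y.
```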
We derive high-probability excess risk bounds of order at most $\tilde{O}(1/n^2)$ for ERM, GD, and SGD; our high-probability bounds on the generalization error of gradients for nonconvex problems are also the sharpest known.