PhD student, University of Southern California
Two papers at NeurIPS 2025
We prove improved bounds for swap multicalibration, swap omniprediction, and swap agnostic learning in both the distributional and online settings.
We propose a new calibration measure, (pseudo) KL-Calibration, and leverage it to establish several new swap regret bounds for a range of important loss functions.