Researcher, Google
2 papers at NeurIPS 2025
We introduce a new gradient noise model for stochastic convex optimization and use it to derive state-of-the-art convergence rates in both the quantum and classical settings.
We propose a method for preconditioning sequential data, formally prove that it improves the regret bounds of several methods, and experimentally show that it improves the performance of many models.