PhD student, Stanford University
2 papers at NeurIPS 2025
We introduce a new gradient noise model for stochastic convex optimization and apply it to achieve state-of-the-art rates in both quantum and classical settings.
We develop optimization methods that offer new trade-offs between the number of gradient and Hessian computations needed to find a critical point of a non-convex function.