Postdoc, University of Oxford
3 papers at NeurIPS 2025
We build a near-perfect predictive model of memorization in diffusion models, combining theory with controlled experiments.
We demonstrate the stability of Langevin diffusions and use it to derive the first convergence proof for the Proximal Stochastic Gradient Langevin Algorithm in a non-convex setting.
We propose a diffusion-based approach to sampling from Boltzmann densities built on temperature annealing.