Associate Professor, Georgia Institute of Technology
6 papers at NeurIPS 2025
We accelerate discrete diffusion model inference with high-order methods, with both theoretical guarantees and empirical speedups.
Variational Learning Finds Flatter Solutions at the Edge of Stability
We propose OLLA, a projection-free overdamped Langevin framework that enforces both equality and inequality constraints via a deterministic “landing” correction along the manifold normal.
We introduce the Non-equilibrium Annealed Adjoint Sampler (NAAS), a novel stochastic optimal control (SOC) based diffusion sampler that leverages annealed reference dynamics.