Assistant Professor, Johns Hopkins University
2 papers at NeurIPS 2025
This paper introduces Proximal Diffusion Models (ProxDM), derived from a backward discretization of the underlying SDE and built on learned proximal operators, achieving provably faster sampling complexity and empirically much faster convergence.
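To build intuition for why a backward (proximal) discretization can help, here is a minimal toy sketch, not the paper's algorithm: for the negative log-density f(x) = x²/2 of a standard Gaussian, the proximal operator prox_{hf}(x) = argmin_y f(y) + (1/2h)(y − x)² = x/(1 + h) is exactly the backward Euler step, and it remains stable at step sizes where the forward (explicit) step diverges. All names and the choice of f are illustrative assumptions.

```python
# Toy illustration (assumed example, not ProxDM itself):
# for f(x) = x^2 / 2, score(x) = -x, compare an explicit step
# against the proximal (backward / implicit) step.

def forward_step(x, h):
    # explicit discretization: x + h * score(x) = x - h * x
    return x - h * x

def proximal_step(x, h):
    # backward discretization: solve y = x - h * y  =>  y = x / (1 + h)
    # (this is prox_{h f}(x) for f(x) = x^2 / 2)
    return x / (1.0 + h)

h = 2.5              # deliberately larger than the explicit stability limit (h < 2)
x_fwd, x_prox = 10.0, 10.0
for _ in range(20):
    x_fwd = forward_step(x_fwd, h)
    x_prox = proximal_step(x_prox, h)

# The explicit iterate blows up (|1 - h| > 1), while the proximal
# iterate contracts toward the mode for any h > 0.
print(abs(x_fwd), abs(x_prox))
```

The same stability gap is one intuition for why backward discretizations can tolerate far larger steps, and hence fewer sampling iterations.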
We study the properties that enable machine learning models to generalize their performance across dimensions.