Assistant Professor, Carnegie Mellon University (CMU)
2 papers at NeurIPS 2025
We introduce a framework for training accelerated, few-step generative models that unifies consistency models, shortcut models, and mean flows as special cases.
Learning likelihoods directly, instead of computing expensive Jacobians of generative ODEs, to estimate free energy differences and sample from the Boltzmann distribution.