Researcher, Google
3 papers at NeurIPS 2025
We propose the Koopman Distillation Model (KDM), a novel offline distillation method for diffusion models that leverages Koopman theory to enable single-step generation with strong semantic consistency and state-of-the-art FID performance.
We introduce the first benchmark for multi-factor sequential disentanglement representations, propose a novel method, and leverage Vision-Language Models to automate annotation and evaluation, enabling scalable, label-free workflows.
We tackle the challenge of few-shot time series generation by proposing a unified pretrained model that outperforms state-of-the-art baselines across diverse domains.