Postdoc, Mila - Quebec Artificial Intelligence Institute
4 papers at NeurIPS 2025
We introduce the challenge of adaptive inference-time scaling—dynamically adjusting computational effort during inference—and propose Adaptive Bi-directional Cyclic Diffusion (ABCD), a flexible, search-based inference framework.
We propose a scalable, sample-efficient framework for training diffusion samplers that integrates classical sampling methods, making it suitable for practical applications such as molecular conformer generation.
We develop energy-based training of neural continuous-time Markov processes in general state spaces.
We improve the speed and performance of LLM post-training via a new asynchronous RL approach, leveraging an off-policy objective, a replay buffer, and tailored sampling strategies.