Postdoc, The University of Adelaide
3 papers at NeurIPS 2025
Temporal Generative Flow Networks (Temporal GFNs) offer a novel approach to probabilistic time series forecasting, adapting GFlowNet principles to continuous-valued data.
We formalize real-world misalignment in multimodal learning through a latent-variable model, showing that learned representations inherently encode semantics invariant to selection and perturbation biases.