2 papers across 2 sessions
This work proposes a variational Bayesian framework whose prior and posterior adapt to covariate shift, improving uncertainty estimates by capturing predictive reliability rather than mere dissimilarity from the training inputs.
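As a rough sketch of the kind of objective such a framework optimizes, here is a standard evidence lower bound in which both the approximate posterior and the prior are conditioned on the covariates; the input-dependent notation $q(\theta \mid x)$ and $p(\theta \mid x)$ is my reading of the summary, not the paper's own:

```latex
% ELBO with covariate-conditioned posterior and prior, so that
% uncertainty can respond to covariate shift (notation assumed):
\log p(y \mid x)
  \;\ge\; \mathbb{E}_{q(\theta \mid x)}\!\left[\, \log p(y \mid x, \theta) \,\right]
  \;-\; \mathrm{KL}\!\left(\, q(\theta \mid x) \,\big\|\, p(\theta \mid x) \,\right)
```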
We demonstrate that training diffusion bridges with a reverse Kullback-Leibler loss, using the log-derivative trick for gradient estimation, outperforms the commonly used log-variance loss in both result quality and training stability.
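For context, the two objectives being compared are, schematically, as follows; the definitions are standard, but the notation ($q^{\phi}$ for the learned bridge, $p$ for the target path measure, $q'$ for a reference measure) is assumed here rather than taken from the paper:

```latex
% Reverse KL between the learned path measure q^phi and the target p:
\mathcal{L}_{\mathrm{rKL}}(\phi)
  = \mathbb{E}_{q^{\phi}}\!\left[ \log \tfrac{q^{\phi}}{p} \right]

% Its gradient via the log-derivative (score-function) trick, which
% avoids differentiating through the sampling path; the term
% E[grad log q^phi] vanishes since the expected score is zero:
\nabla_\phi \mathcal{L}_{\mathrm{rKL}}(\phi)
  = \mathbb{E}_{q^{\phi}}\!\left[ \left( \log \tfrac{q^{\phi}}{p} \right)
    \nabla_\phi \log q^{\phi} \right]

% The log-variance loss, for comparison: the variance of the
% log Radon--Nikodym derivative under a reference measure q':
\mathcal{L}_{\mathrm{LV}}(\phi)
  = \mathrm{Var}_{q'}\!\left[ \log \tfrac{q^{\phi}}{p} \right]
```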