We demonstrate that training diffusion bridges with a reverse Kullback-Leibler loss combined with the log-derivative trick outperforms the commonly used log-variance loss, yielding both better final performance and more stable training.
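The core gradient estimator can be sketched in a toy setting. Below is a minimal, hypothetical illustration (not the paper's implementation) of the log-derivative (score-function) trick applied to a reverse-KL objective: a one-dimensional unit-variance Gaussian model q_theta is fit to a fixed Gaussian target by estimating the gradient E_q[(log q(x) - log p(x)) * d/dtheta log q(x)] from samples. All names and constants here are illustrative assumptions.

```python
import math
import random

def log_normal(x, mu):
    # Log-density of N(mu, 1) at x.
    return -0.5 * (x - mu) ** 2 - 0.5 * math.log(2 * math.pi)

def reverse_kl_grad(theta, target_mu=2.0, n_samples=2000):
    """Monte Carlo estimate of d/dtheta KL(q_theta || p) via the
    log-derivative trick:
        E_q[(log q(x) - log p(x)) * d/dtheta log q(x)],
    where d/dtheta log q(x) = (x - theta) for a unit-variance Gaussian."""
    total = 0.0
    for _ in range(n_samples):
        x = random.gauss(theta, 1.0)          # sample from the model q_theta
        weight = log_normal(x, theta) - log_normal(x, target_mu)
        total += weight * (x - theta)         # score-function weighting
    return total / n_samples

random.seed(0)
theta = 0.0
for _ in range(100):
    theta -= 0.1 * reverse_kl_grad(theta)     # plain gradient descent
# theta converges toward the target mean (2.0)
```

Because the weight (log q - log p) vanishes as q_theta approaches the target, the estimator's variance shrinks near the optimum in this toy case; in practice, variants of this trick trade off bias and variance differently from the log-variance loss.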