6 papers across 3 sessions
We present a principled recipe for building graph foundation models that generalize across arbitrary graphs, features, and label spaces.
We learn non-gradient field dynamics by solving the Schrödinger Bridge problem with a non-zero reference process drift.
We train transferable normalizing flows to sample from Boltzmann distributions of peptides up to 8 residues long.
We propose a diffusion-based approach, built on temperature annealing, for sampling from Boltzmann densities.
We characterize the effects of vanishing gradients on GNNs.
We formalize the over-squashing phenomenon in spatiotemporal graph neural networks and analyze how it affects information propagation across the spatial and temporal dimensions.