3 papers across 3 sessions
We propose Permutation Equivariant Graph Neural CDEs, an equivariant and parameter-efficient extension of Graph Neural CDEs for dynamic graph representation learning.
We introduce Set-LLM, a permutation-invariant LLM architecture that eliminates order sensitivity and bias.
We introduce a structure-based generative model for sampling protein conformations efficiently, eliminating the need for pre-training and evolutionary information.