5 papers across 3 sessions
Graph Transformers fall short of surpassing the performance ceiling of GNNs, but introducing appropriate constraints can effectively improve the generalization of GNNs.
We propose a graph transformer with a novel token-swapping operation that generates diverse token sequences, further enhancing model performance.
We propose a Graph Transformer on pseudo-Riemannian manifolds.
We treat substructures as graph tokens to enable scalable transformer pre-training on graphs.