2 papers across 2 sessions
PRSformer: A scalable Transformer using neighborhood attention for multitask disease prediction from million-scale individual genotypes, demonstrating that non-linear modeling yields benefits at large sample sizes.
We generalize attention and neighborhood attention to the two-dimensional sphere in an equivariant manner.
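Both summaries center on neighborhood attention, in which each query attends only to keys within a fixed local window around its own position rather than to the full sequence. A minimal dense-mask sketch in NumPy is below; it illustrates the masking idea only, not the optimized implementations in these papers, and the function name and `radius` parameter are illustrative.

```python
import numpy as np

def neighborhood_attention_1d(q, k, v, radius):
    """Attention where position i attends only to positions j with |i - j| <= radius.

    q, k, v: arrays of shape (n, d). Returns an (n, d) array.
    """
    n, d = q.shape
    scores = q @ k.T / np.sqrt(d)                      # (n, n) attention logits
    idx = np.arange(n)
    outside = np.abs(idx[:, None] - idx[None, :]) > radius
    scores[outside] = -np.inf                          # block out-of-neighborhood pairs
    # Row-wise softmax; each row has at least one finite entry (the diagonal).
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ v
```

With `radius >= n - 1` no pair is masked, so the result reduces to ordinary full self-attention; shrinking the radius trades global context for locality and cost.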