3 papers across 2 sessions
PRSformer: a scalable Transformer that uses neighborhood attention for multitask disease prediction from million-scale individual genotypes, demonstrating the benefits of non-linear modeling at large sample sizes.
We propose a transfer learning framework for regression in which the outputs are probability distributions lying in Wasserstein space.
We propose a model that learns per-sample routing for multimodal multitask prediction, improving both accuracy and interpretability on heterogeneous data.