2 papers across 1 session
Evaluating real-world einsum expressions can produce hyper-sparse intermediate tensors whose sparsity is hard to predict in advance; we exploit this dynamic sparsity in a hybrid algorithm.
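A minimal sketch of why intermediate sparsity is data-dependent (this is an illustration, not the paper's algorithm): in a matrix-product einsum, the elementwise intermediate before the sum over the contracted index is nonzero only where both factors are nonzero, so its density depends on how the operands' sparsity patterns overlap, not just on shapes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two operands that are each ~10% dense.
a = rng.random((50, 50)) * (rng.random((50, 50)) < 0.1)
b = rng.random((50, 50)) * (rng.random((50, 50)) < 0.1)

# The intermediate a[i,k] * b[k,j] (before summing over k) is nonzero
# only where both factors are nonzero, so it is far sparser than
# either operand -- and its exact density depends on the data.
intermediate = a[:, :, None] * b[None, :, :]    # shape (50, 50, 50)
result = intermediate.sum(axis=1)               # == np.einsum('ik,kj->ij', a, b)

density = np.count_nonzero(intermediate) / intermediate.size
print(f"intermediate density: {density:.4f}")
```

With independent ~10%-dense operands the intermediate ends up around 1% dense, but a different sparsity pattern with the same shapes could give a very different number, which is what makes static prediction hard.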
We introduce Distance-informed Neural Processes (DNP), a Neural Process variant with bi-Lipschitz regularization that preserves input geometry. DNP yields better-calibrated uncertainty and stronger out-of-distribution (OOD) detection across regression and classification tasks.
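A hedged sketch of what a bi-Lipschitz constraint means in practice (the function name and penalty form below are illustrative assumptions, not the paper's method): an encoder is bi-Lipschitz when pairwise embedding distances stay within a multiplicative band around input distances, so nearby inputs cannot collapse to one point and distant inputs cannot be mapped arbitrarily far apart.

```python
import numpy as np

def bi_lipschitz_penalty(x, z, c=2.0):
    """Hypothetical soft penalty: charge pairs whose distance ratio
    d_z / d_x leaves the band [1/c, c]."""
    # Pairwise Euclidean distances in input and embedding space.
    dx = np.linalg.norm(x[:, None] - x[None, :], axis=-1)
    dz = np.linalg.norm(z[:, None] - z[None, :], axis=-1)
    iu = np.triu_indices(len(x), k=1)           # unique pairs only
    ratio = dz[iu] / np.maximum(dx[iu], 1e-12)
    # Hinge losses for ratios above c or below 1/c.
    upper = np.maximum(ratio - c, 0.0)
    lower = np.maximum(1.0 / c - ratio, 0.0)
    return float(np.mean(upper + lower))

x = np.random.default_rng(0).random((8, 3))
print(bi_lipschitz_penalty(x, x))       # identity map: zero penalty
print(bi_lipschitz_penalty(x, 0 * x))   # collapsed map: penalized
```

A geometry-preserving map (here, the identity) incurs no penalty, while a collapsed embedding, which is exactly the failure mode that hurts distance-based OOD detection, is penalized.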