Postdoc, Columbia University
2 papers at NeurIPS 2025
We present POSSM, a novel architecture that combines input cross-attention with a recurrent state-space model, achieving competitive accuracy, fast inference, and efficient generalization for real-time neural decoding applications.
We present NuCLR, a self-supervised framework that learns high-quality, population-aware neuron-level embeddings directly from spike-train data using a spatio-temporal transformer and a tailored contrastive loss.