3 papers across 3 sessions
We propose LION, a framework for extending Linear Transformers to the bidirectional setting by providing three theoretically equivalent representations: full attention, bidirectional RNN, and chunkwise parallel form.
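The equivalence of the three forms can be checked numerically in a stripped-down setting. The sketch below is a toy illustration only, assuming plain (undecayed, unmasked) linear attention; LION's actual parameterization includes components this omits. It computes the same output via the full attention matrix, via a forward plus backward scan over rank-1 states, and via chunkwise accumulation:

```python
import numpy as np

rng = np.random.default_rng(0)
T, d = 12, 4                      # sequence length, head dimension
Q, K, V = (rng.standard_normal((T, d)) for _ in range(3))

# 1) Full-attention form: O = (Q K^T) V  (no softmax, no causal mask).
O_full = (Q @ K.T) @ V

# 2) Bidirectional RNN form: a forward scan and a backward scan over the
#    rank-1 states k_t v_t^T; the diagonal term is subtracted once so it
#    is not double-counted.
outer = np.einsum('ti,tj->tij', K, V)           # k_t v_t^T for each t
fwd = np.cumsum(outer, axis=0)                  # sum over s <= t
bwd = np.cumsum(outer[::-1], axis=0)[::-1]      # sum over s >= t
O_rnn = np.einsum('ti,tij->tj', Q, fwd + bwd - outer)

# 3) Chunkwise parallel form: accumulate K^T V chunk by chunk.
C = 4                                           # chunk size
S = sum(K[c:c + C].T @ V[c:c + C] for c in range(0, T, C))
O_chunk = Q @ S

assert np.allclose(O_full, O_rnn) and np.allclose(O_full, O_chunk)
```

With decay or gating (as in the paper), the three forms remain equivalent but the bookkeeping in forms 2 and 3 becomes nontrivial; this toy shows only the undecayed base case.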
We introduce the Fixed-Point RNN framework to solve state-tracking tasks by parameterizing the state transition matrix as implicitly dense.
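The core idea of an "implicitly dense" transition can be seen in a small numpy sketch. This is a generic illustration of dense-via-fixed-point, not the paper's parameterization: iterating a cheap structured map h ← M h + b to its fixed point realizes the dense effective transition (I − M)⁻¹ without ever materializing it.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6

# A cheap-to-apply structured map: diagonal plus a rank-1 correction,
# rescaled so its spectral norm is below 1 (a contraction).
M = np.diag(rng.uniform(-0.5, 0.5, n))
u, v = rng.standard_normal((2, n, 1))
M = M + 0.1 * (u @ v.T)
M = 0.9 * M / np.linalg.norm(M, 2)
b = rng.standard_normal(n)

# Fixed-point iteration h <- M h + b converges to h* = (I - M)^{-1} b,
# so the *effective* state transition is dense even though each
# iteration only applies the structured M.
h = np.zeros(n)
for _ in range(400):
    h = M @ h + b

h_star = np.linalg.solve(np.eye(n) - M, b)
assert np.allclose(h, h_star)
```

The contraction rescaling guarantees convergence; a trained model would instead constrain the parameterization so the fixed point exists by construction.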
We introduce WaLRUS, a wavelet-based SSM leveraging SaFARi for improved accuracy and stability in modeling non-smooth, transient signals, outperforming traditional HiPPO-based models.
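The advantage of a wavelet basis over a smooth polynomial basis (the flavor HiPPO uses) on non-smooth signals can be demonstrated directly. The sketch below is a self-contained numpy toy, not SaFARi or WaLRUS itself: with the same budget of 8 coefficients, an orthonormal Haar basis captures a step exactly, while a degree-7 polynomial fit cannot.

```python
import numpy as np

N = 64
t = np.arange(N)
signal = (t >= 20).astype(float)          # a sharp step: non-smooth, transient

# Orthonormal Haar transform matrix on N = 2^k points, built recursively:
# averaging (scaling) rows on top, finest-level difference rows below.
def haar_basis(n):
    if n == 1:
        return np.array([[1.0]])
    h = haar_basis(n // 2)
    top = np.kron(h, [1.0, 1.0])                 # scaling (averaging) rows
    bot = np.kron(np.eye(n // 2), [1.0, -1.0])   # wavelet (difference) rows
    return np.vstack([top, bot]) / np.sqrt(2.0)

H = haar_basis(N)                         # rows form an orthonormal basis
assert np.allclose(H @ H.T, np.eye(N))

K = 8                                     # coefficient budget
coeffs = H @ signal
keep = np.argsort(-np.abs(coeffs))[:K]    # keep the K largest coefficients
trunc = np.zeros(N)
trunc[keep] = coeffs[keep]
wavelet_err = np.linalg.norm(H.T @ trunc - signal)

# Same budget spent on a degree-(K-1) polynomial least-squares fit.
poly = np.polynomial.polynomial.Polynomial.fit(t, signal, K - 1)
poly_err = np.linalg.norm(poly(t) - signal)

assert wavelet_err < poly_err             # wavelets win on the discontinuity
```

A piecewise-constant step has only O(log N) nonzero Haar coefficients, so the truncated reconstruction is essentially exact, whereas the polynomial fit oscillates around the jump; this is the intuition behind using wavelet frames for transient signals.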