Assistant Professor, Swiss Federal Institute of Technology Lausanne
1 paper at NeurIPS 2025
We propose LION, a framework that extends Linear Transformers to the bidirectional setting via three theoretically equivalent representations: a full-attention form, a bidirectional RNN form, and a chunkwise-parallel form.
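To give intuition for how a full-attention form and a bidirectional RNN form can coincide, here is a minimal NumPy sketch of non-causal *linear* attention (no softmax, no decay factors). This is an illustrative toy, not LION's actual parameterization: the same output is computed once as a dense attention product and once as a forward scan plus a backward scan over rank-one states, with the double-counted diagonal term subtracted.

```python
import numpy as np

rng = np.random.default_rng(0)
T, d = 6, 4  # sequence length, head dimension (toy sizes)
Q, K, V = (rng.standard_normal((T, d)) for _ in range(3))

# Full-attention form: O = (Q K^T) V, no softmax, no causal mask.
O_attn = (Q @ K.T) @ V

# Bidirectional RNN form: forward and backward scans over the
# matrix-valued state S_t = S_{t-1} + k_t v_t^T.
fwd = np.zeros((T, d))
S = np.zeros((d, d))
for t in range(T):
    S += np.outer(K[t], V[t])
    fwd[t] = Q[t] @ S

bwd = np.zeros((T, d))
S = np.zeros((d, d))
for t in reversed(range(T)):
    S += np.outer(K[t], V[t])
    bwd[t] = Q[t] @ S

# Position t appears in both scans, so subtract its diagonal
# contribution (q_t . k_t) v_t once.
diag = (Q * K).sum(axis=1, keepdims=True) * V
O_rnn = fwd + bwd - diag

print(np.allclose(O_attn, O_rnn))  # the two forms agree
```

With decays or gating, the states no longer telescope this simply, which is where a framework like LION's equivalent representations (including the chunkwise-parallel form for training efficiency) becomes non-trivial.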