4 papers across 3 sessions
A hybrid quantum-classical Transformer model with quantum-induced doubly-stochastic attention that stabilizes and improves small-scale vision transformers.
A new framework for efficient gradient estimation using Lie-algebraic structure and the Hadamard test.