Poster Session 5 · Friday, December 5, 2025 11:00 AM → 2:00 PM
#2807 Spotlight
Universal Sequence Preconditioning
Abstract
We study the problem of preconditioning in the setting of sequential prediction. Through the theoretical lens of linear dynamical systems, we show that applying a convolution to the input sequence translates to applying a polynomial to the unknown transition matrix in the hidden space.
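As a minimal illustration of this correspondence (our notation, not the paper's): for a noiseless, autonomous system with observations $y_t = C A^t x_0$ and filter coefficients $c_0, \dots, c_k$,

$$
\sum_{j=0}^{k} c_j \, y_{t-j}
\;=\; C \Big( \sum_{j=0}^{k} c_j A^{k-j} \Big) A^{t-k} x_0
\;=\; C \, p(A) \, A^{t-k} x_0,
\qquad p(z) = \sum_{j=0}^{k} c_j \, z^{k-j},
$$

so choosing the $c_j$ to be the coefficients of a polynomial that is small on the spectrum of $A$ shrinks the preconditioned sequence.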
With this insight, we develop a novel preconditioning method that convolves the input sequence with the coefficients of the Chebyshev or Legendre polynomials. We formally prove that this improves the regret of a wide family of prediction methods.
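A minimal sketch of the preconditioning step, assuming the convolution is applied causally with the monomial-basis coefficients of the degree-k Chebyshev polynomial T_k; the function name `precondition` and the default degree are illustrative, not the paper's:

```python
import numpy as np
from numpy.polynomial import chebyshev as C

def precondition(y: np.ndarray, k: int = 8) -> np.ndarray:
    """Convolve a sequence of shape (T, d) with the coefficients of T_k."""
    # Coefficients of T_k in the monomial basis: c_0 + c_1 z + ... + c_k z^k.
    coeffs = C.cheb2poly(C.Chebyshev.basis(k).coef)
    T, d = y.shape
    out = np.zeros((T, d))
    for t in range(T):
        for j, c in enumerate(coeffs):
            if t - j >= 0:
                out[t] += c * y[t - j]  # causal convolution with the c_j
    return out

# Example: precondition a scalar sequence of length 100.
y = np.random.randn(100, 1)
y_tilde = precondition(y, k=8)
```

Any downstream predictor is then trained on `y_tilde` instead of `y`; Legendre coefficients could be substituted via `numpy.polynomial.legendre.leg2poly` in the same way.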
We proceed to apply this preconditioning technique to the method of spectral filtering, obtaining the first sublinear regret bound that is also hidden-dimension-free (up to logarithmic factors), even when the hidden transition matrix is asymmetric.
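For context, here is a sketch of the spectral-filtering features that the preconditioned sequence would be fed into, following the Hankel-matrix construction of Hazan, Singh & Zhang (2017); the predictor learned on top of these features is omitted, and `num_filters` and the eigenvalue scaling are illustrative choices rather than this paper's:

```python
import numpy as np

def spectral_filters(T: int, num_filters: int = 16) -> np.ndarray:
    """Top eigenvectors of the T x T Hankel matrix Z_ij = 2 / ((i+j)^3 - (i+j))."""
    idx = np.arange(1, T + 1)
    s = idx[:, None] + idx[None, :]
    Z = 2.0 / (s**3 - s)
    eigvals, eigvecs = np.linalg.eigh(Z)      # eigenvalues in ascending order
    top = eigvecs[:, -num_filters:]           # (T, num_filters)
    scale = eigvals[-num_filters:] ** 0.25    # standard eigenvalue^{1/4} scaling
    return top * scale

# Featurize a (preconditioned) input sequence u of shape (T, d).
T, d = 100, 1
u = np.random.randn(T, d)
phi = spectral_filters(T)    # (T, num_filters)
features = phi.T @ u         # (num_filters, d) spectral features at time T
```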
Through rigorous experiments on synthetic data, we show that our simple preconditioning method generalizes to both:
- settings where the data is not generated by a linear dynamical system, and
- a broad range of learning algorithms, including recurrent neural networks.