
arxiv_ml · Theoretical and Empirical Research Paper · Audience: Machine Learning Researchers, Time Series Analysts, Control Engineers, Data Scientists

Universal Sequence Preconditioning

📄 Abstract

We study the problem of preconditioning in sequential prediction. Through the theoretical lens of linear dynamical systems, we show that convolving the target sequence corresponds to applying a polynomial to the hidden transition matrix. Building on this insight, we propose a universal preconditioning method that convolves the target with coefficients from orthogonal polynomials such as Chebyshev or Legendre. We prove that this approach reduces regret for two distinct prediction algorithms and yields the first sublinear, hidden-dimension-independent regret bounds (up to logarithmic factors) that hold for systems with marginally stable and asymmetric transition matrices. Finally, extensive synthetic and real-world experiments show that this simple preconditioning strategy improves the performance of a diverse range of algorithms, including recurrent neural networks, and generalizes to signals beyond linear dynamical systems.
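
To make the mechanics concrete, here is a minimal sketch (not the paper's code) of Chebyshev preconditioning for a scalar sequence. It assumes the lag convention ỹ_t = Σ_i c_i y_{t−i}, where c_i is the coefficient of x^(k−i) in the degree-k Chebyshev polynomial, so the lag-0 coefficient is the polynomial's (nonzero) leading coefficient; the helper names and the inversion step are illustrative, and the paper's exact normalization may differ.

```python
import numpy as np

def chebyshev_filter(k: int) -> np.ndarray:
    # Coefficients of the degree-k Chebyshev polynomial T_k, ordered by lag:
    # entry i multiplies y_{t-i} and equals the coefficient of x^(k-i).
    basis = np.zeros(k + 1)
    basis[k] = 1.0
    mono = np.polynomial.chebyshev.cheb2poly(basis)  # ascending powers of x
    return mono[::-1]                                # descending powers = lag order

def precondition(y: np.ndarray, coeffs: np.ndarray) -> np.ndarray:
    # Causal convolution: y~_t = sum_i coeffs[i] * y_{t-i}, zero-padding the
    # unobserved prefix. 'full'-mode convolution truncated to len(y) does this.
    return np.convolve(y, coeffs)[: len(y)]

def recover(pred_tilde: float, recent_y: np.ndarray, coeffs: np.ndarray) -> float:
    # Invert the filter at one step: given a prediction of y~_t and the last
    # k observations (most recent first), solve for y_t. T_k has leading
    # coefficient 2^(k-1) != 0, so the division is always well defined.
    return (pred_tilde - np.dot(coeffs[1:], recent_y)) / coeffs[0]

# Toy check on a marginally stable scalar signal y_t = cos(0.3 t):
t = np.arange(200)
y = np.cos(0.3 * t)
c = chebyshev_filter(3)                   # T_3(x) = 4x^3 - 3x -> [4, 0, -3, 0]
y_tilde = precondition(y, c)              # the sequence a learner now predicts
y_10 = recover(y_tilde[10], y[9:6:-1], c)
assert np.isclose(y_10, y[10])            # exact inversion in the noiseless case
```

In the noisy setting, the same pattern applies: train any predictor on the preconditioned sequence ỹ, then map its one-step forecast back through the inversion above using the observed recent history.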

Key Contributions

Proposes a universal preconditioning method for sequential prediction that convolves the target sequence with coefficients from orthogonal polynomials (e.g., Chebyshev, Legendre). Grounded in linear dynamical systems theory, the approach achieves the first sublinear, hidden-dimension-independent regret bounds for marginally stable systems with asymmetric transition matrices, and it improves performance across diverse algorithms, including recurrent neural networks.

Business Value

Leads to more reliable and efficient prediction systems by pairing stronger theoretical guarantees with empirical improvements, which matters for applications such as financial forecasting, resource management, and control.