LLM-Mixer: Multiscale Mixing in LLMs for Time Series Forecasting

Md Kowsher, Md. Shohanur Islam Sobuj, Nusrat Jahan Prottasha, E. Alejandro Alanis, Ozlem Garibay, Niloofar Yousefi


Abstract
Time series forecasting is a challenging task, especially when dealing with data that contains both short-term variations and long-term trends. In this study, we introduce LLM-Mixer, a novel framework that combines multiscale time-series decomposition with the power of pre-trained Large Language Models (LLMs). LLM-Mixer breaks down time-series data into multiple temporal resolutions using downsampling and processes these multiscale representations with a frozen LLM, guided by a carefully designed text prompt that encodes information about the dataset's features and structure. To understand the role of downsampling, we conduct a detailed analysis using Neural Tangent Kernel (NTK) distance, showing that incorporating multiple scales improves the model's learning dynamics. We evaluate LLM-Mixer across a diverse set of forecasting tasks, including long-term multivariate, short-term multivariate, and long-term univariate scenarios. Experimental results demonstrate that LLM-Mixer achieves competitive performance compared to recent state-of-the-art models across various forecasting horizons.
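The multiscale downsampling step the abstract describes can be sketched as follows. This is a minimal illustrative example, assuming non-overlapping average pooling as the downsampling operation; the function names, pooling choice, and scale factors are hypothetical and not taken from the paper.

```python
# Hypothetical sketch of multiscale decomposition via downsampling: the input
# series is reduced to several coarser temporal resolutions, each of which
# would then be embedded and passed to the frozen LLM. The pooling operation
# and scale factors here are illustrative assumptions.

def downsample(series, factor):
    """Average-pool a 1-D series over non-overlapping windows of size `factor`."""
    n = len(series) // factor
    return [sum(series[i * factor:(i + 1) * factor]) / factor for i in range(n)]

def multiscale_views(series, factors=(1, 2, 4)):
    """Return the series at several temporal resolutions (factor 1 = original)."""
    return [downsample(series, f) for f in factors]

series = [float(t % 24) for t in range(96)]  # toy hourly signal with a daily cycle
views = multiscale_views(series)
print([len(v) for v in views])  # → [96, 48, 24]
```

Each coarser view suppresses short-term variation and emphasizes longer-term structure, which is the intuition behind feeding multiple scales to the model.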
Anthology ID:
2025.trl-1.12
Volume:
Proceedings of the 4th Table Representation Learning Workshop
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Shuaichen Chang, Madelon Hulsebos, Qian Liu, Wenhu Chen, Huan Sun
Venues:
TRL | WS
Publisher:
Association for Computational Linguistics
Pages:
156–165
URL:
https://preview.aclanthology.org/acl25-workshop-ingestion/2025.trl-1.12/
Cite (ACL):
Md Kowsher, Md. Shohanur Islam Sobuj, Nusrat Jahan Prottasha, E. Alejandro Alanis, Ozlem Garibay, and Niloofar Yousefi. 2025. LLM-Mixer: Multiscale Mixing in LLMs for Time Series Forecasting. In Proceedings of the 4th Table Representation Learning Workshop, pages 156–165, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
LLM-Mixer: Multiscale Mixing in LLMs for Time Series Forecasting (Kowsher et al., TRL 2025)
PDF:
https://preview.aclanthology.org/acl25-workshop-ingestion/2025.trl-1.12.pdf