Time-LlaMA: Adapting Large Language Models for Time Series Modeling via Dynamic Low-rank Adaptation

Juyuan Zhang, Jiechao Gao, Wenwen Ouyang, Wei Zhu, Hui Yi Leong


Abstract
Time series modeling holds significant importance in many industrial applications and has been extensively studied. A series of recent studies have demonstrated that large language models (LLMs) possess robust pattern recognition and semantic understanding capabilities over time series data. However, the current literature has yet to strike a satisfactory balance between (a) effectively aligning the time series and natural language modalities and (b) maintaining inference efficiency for industrial deployment. To address these issues, we propose the Time-LlaMA framework. Time-LlaMA first converts the time series input into token embeddings through a linear tokenization mechanism. Second, the time series token embeddings are aligned with the text prompts. Third, to further adapt the LLM backbone for time series modeling, we develop a dynamic low-rank adaptation technique (DynaLoRA). DynaLoRA dynamically chooses the most suitable LoRA modules at each layer of the Transformer backbone for each time series input, enhancing the model’s predictive capabilities. Our experimental results on an extensive collection of challenging open and proprietary time series tasks confirm that the proposed method achieves state-of-the-art (SOTA) performance and has strong potential for wide industrial use.
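The abstract describes DynaLoRA only at a high level. The snippet below is a minimal, hypothetical PyTorch sketch of the general idea it names, a pool of LoRA adapters per layer with a lightweight router that selects one adapter per time series input; the class names, rank, adapter count, and mean-pooled top-1 routing rule are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of input-dependent LoRA routing in the spirit of DynaLoRA.
# All names, hyperparameters, and the routing rule are illustrative assumptions.
import torch
import torch.nn as nn


class LoRAAdapter(nn.Module):
    """One low-rank adapter: x -> (alpha / rank) * B(A(x))."""

    def __init__(self, d_in: int, d_out: int, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.A = nn.Linear(d_in, rank, bias=False)
        self.B = nn.Linear(rank, d_out, bias=False)
        nn.init.zeros_(self.B.weight)  # adapter starts as a no-op
        self.scale = alpha / rank

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.B(self.A(x)) * self.scale


class RoutedLoRALinear(nn.Module):
    """Frozen base projection plus a routed pool of LoRA adapters."""

    def __init__(self, base: nn.Linear, num_adapters: int = 4, rank: int = 8):
        super().__init__()
        self.base = base
        for p in self.base.parameters():  # the LLM backbone stays frozen
            p.requires_grad = False
        self.adapters = nn.ModuleList(
            LoRAAdapter(base.in_features, base.out_features, rank)
            for _ in range(num_adapters)
        )
        self.router = nn.Linear(base.in_features, num_adapters)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_in). Route on the mean token embedding of each
        # sequence and pick the top-1 adapter per input (hard routing shown for
        # clarity; training would require a soft or differentiable variant).
        choice = self.router(x.mean(dim=1)).argmax(dim=-1)  # (batch,)
        delta = torch.stack(
            [self.adapters[int(i)](x[b]) for b, i in enumerate(choice)]
        )
        return self.base(x) + delta


if __name__ == "__main__":
    layer = RoutedLoRALinear(nn.Linear(64, 64))
    ts_tokens = torch.randn(2, 96, 64)  # (batch, time series tokens, d_model)
    print(layer(ts_tokens).shape)       # torch.Size([2, 96, 64])
```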
Anthology ID:
2025.acl-srw.90
Volume:
Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 4: Student Research Workshop)
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Jin Zhao, Mingyang Wang, Zhu Liu
Venues:
ACL | WS
Publisher:
Association for Computational Linguistics
Pages:
1145–1157
URL:
https://preview.aclanthology.org/landing_page/2025.acl-srw.90/
Cite (ACL):
Juyuan Zhang, Jiechao Gao, Wenwen Ouyang, Wei Zhu, and Hui Yi Leong. 2025. Time-LlaMA: Adapting Large Language Models for Time Series Modeling via Dynamic Low-rank Adaptation. In Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 4: Student Research Workshop), pages 1145–1157, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
Time-LlaMA: Adapting Large Language Models for Time Series Modeling via Dynamic Low-rank Adaptation (Zhang et al., ACL 2025)
PDF:
https://preview.aclanthology.org/landing_page/2025.acl-srw.90.pdf