Cross-MoE: An Efficient Temporal Prediction Framework Integrating Textual Modality

Ruizheng Huang, Zhicheng Zhang, Yong Wang


Abstract
It has been demonstrated that incorporating external information as a textual modality can effectively improve time series forecasting accuracy. However, current multi-modal models ignore the dynamic and varying relations between time series patterns and textual features, which leads to poor temporal-textual feature fusion. In this paper, we propose a lightweight and model-agnostic temporal-textual fusion framework named Cross-MoE. It replaces Cross Attention with Cross-Ranker to reduce computational complexity, and enhances modality-aware correlation memorization with Mixture-of-Experts (MoE) networks to tolerate distributional shifts in time series. The experimental results demonstrate an 8.78% average reduction in Mean Squared Error (MSE) compared to the SOTA multi-modal time series framework. Notably, our method requires only 75% of the computational overhead and 12.5% of the activated parameters of the Cross Attention mechanism. Our code is available at https://github.com/Kilosigh/Cross-MoE.git
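To make the MoE-based fusion idea in the abstract concrete, the following is a minimal, hypothetical PyTorch sketch of sparse top-k expert routing over a concatenated temporal-textual embedding. It is not the authors' Cross-MoE or Cross-Ranker implementation; the module name MoEFusion, the dimensions, and the top-k gating scheme are assumptions for illustration only (the paper's actual architecture is described in the linked PDF and repository).

```python
# Illustrative sketch only: NOT the authors' Cross-MoE / Cross-Ranker code.
# Shows sparse top-k MoE routing over a fused temporal-textual embedding.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MoEFusion(nn.Module):
    """Route the fused representation through only the top-k experts per sample."""

    def __init__(self, d_model: int = 64, num_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(2 * d_model, num_experts)  # gate sees both modalities
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(2 * d_model, d_model), nn.GELU(),
                          nn.Linear(d_model, d_model))
            for _ in range(num_experts)
        )

    def forward(self, ts_emb: torch.Tensor, text_emb: torch.Tensor) -> torch.Tensor:
        x = torch.cat([ts_emb, text_emb], dim=-1)            # (batch, 2*d_model)
        scores = self.gate(x)                                # (batch, num_experts)
        topk_val, topk_idx = scores.topk(self.top_k, dim=-1)
        weights = F.softmax(topk_val, dim=-1)                # renormalize over chosen experts
        out = torch.zeros_like(ts_emb)
        for slot in range(self.top_k):                       # only top-k experts activated
            idx = topk_idx[:, slot]
            w = weights[:, slot].unsqueeze(-1)
            for e in idx.unique():
                mask = idx == e
                out[mask] += w[mask] * self.experts[int(e)](x[mask])
        return out


if __name__ == "__main__":
    fusion = MoEFusion()
    ts = torch.randn(4, 64)    # hypothetical per-window time-series embedding
    txt = torch.randn(4, 64)   # hypothetical per-window text embedding
    print(fusion(ts, txt).shape)  # torch.Size([4, 64])
```

Because only two of the eight experts run per sample, the activated parameter count stays well below that of a dense fusion layer, which is the general efficiency argument the abstract makes for the MoE design.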
Anthology ID:
2025.emnlp-main.1520
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
29915–29926
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1520/
Cite (ACL):
Ruizheng Huang, Zhicheng Zhang, and Yong Wang. 2025. Cross-MoE: An Efficient Temporal Prediction Framework Integrating Textual Modality. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 29915–29926, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Cross-MoE: An Efficient Temporal Prediction Framework Integrating Textual Modality (Huang et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1520.pdf
Checklist:
 2025.emnlp-main.1520.checklist.pdf