Circuit Complexity Bounds for RoPE-based Transformer Architecture

Bo Chen, Xiaoyu Li, Yingyu Liang, Jiangxuan Long, Zhenmei Shi, Zhao Song, Jiahao Zhang


Abstract
Characterizing the expressive power of the Transformer architecture is critical to understanding its capacity limits and scaling laws. Recent work has established circuit complexity bounds for Transformer-like architectures. Meanwhile, rotary position embedding (RoPE) has emerged as a crucial technique in modern large language models, capturing positional information more effectively and performing well in long-context scenarios. In this work, we take a circuit complexity perspective and rigorously analyze Transformers augmented with this widely adopted positional embedding. We prove that, under standard complexity assumptions, such models remain incapable of efficiently solving canonical tasks such as arithmetic formula evaluation and Boolean formula value computation. Our results expose a fundamental expressivity limitation that persists despite the remarkable empirical success of positionally enhanced Transformers. Beyond tightening known complexity bounds, our findings offer new theoretical insights for designing future architectures with provably stronger reasoning and compositional capabilities.
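As background for the positional embedding the paper analyzes: RoPE encodes a token's position m by rotating each consecutive pair of feature dimensions (x_{2i}, x_{2i+1}) through an angle m * theta_i, with theta_i = base^(-2i/d). A minimal NumPy sketch of this standard construction (illustrative only, not the authors' code):

```python
import numpy as np

def rope(x, base=10000.0):
    """Apply Rotary Position Embedding (RoPE) to a sequence of vectors.

    x: array of shape (seq_len, dim), dim must be even.
    Pair (x[:, 2i], x[:, 2i+1]) at position m is rotated by the angle
    m * theta_i, where theta_i = base^(-2i/dim).
    """
    seq_len, dim = x.shape
    assert dim % 2 == 0, "feature dimension must be even"
    # Per-pair rotation frequencies theta_i = base^(-2i/dim)
    theta = base ** (-np.arange(0, dim, 2) / dim)       # shape (dim/2,)
    angles = np.arange(seq_len)[:, None] * theta[None]  # shape (seq_len, dim/2)
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:, 0::2], x[:, 1::2]
    out = np.empty_like(x)
    out[:, 0::2] = x1 * cos - x2 * sin  # 2D rotation applied per pair
    out[:, 1::2] = x1 * sin + x2 * cos
    return out
```

Because each pair undergoes a pure rotation, RoPE preserves vector norms, and the token at position 0 is left unchanged (all angles are zero).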
Anthology ID:
2025.emnlp-main.561
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
11091–11108
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.561/
Cite (ACL):
Bo Chen, Xiaoyu Li, Yingyu Liang, Jiangxuan Long, Zhenmei Shi, Zhao Song, and Jiahao Zhang. 2025. Circuit Complexity Bounds for RoPE-based Transformer Architecture. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 11091–11108, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Circuit Complexity Bounds for RoPE-based Transformer Architecture (Chen et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.561.pdf
Checklist:
 2025.emnlp-main.561.checklist.pdf