RingFormer: Rethinking Recurrent Transformer with Adaptive Level Signals

Jaemu Heo, Eldor Fozilov, Hyunmin Song, Taehwan Kim


Abstract
Transformers have achieved great success in processing sequential data such as text. Their architecture, consisting of several attention and feed-forward blocks, models relations between elements of a sequence in a parallel manner, which makes them efficient to train and effective in sequence modeling. Despite this strong performance, their parameter count is considerably larger than that of other architectures such as RNN- and CNN-based models. Therefore, several approaches have explored parameter sharing and recurrence in Transformer models to reduce their computational demands. However, such methods struggle to maintain performance comparable to the original Transformer. To address this challenge, we propose a novel approach, RingFormer, which employs a single Transformer layer that processes the input repeatedly in a circular, ring-like manner, while utilizing low-rank matrices to generate input-dependent level signals. This allows us to reduce the number of model parameters substantially while maintaining high performance on a variety of tasks such as translation and image classification, as validated in our experiments.
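
To illustrate the recurrence the abstract describes, below is a minimal PyTorch sketch, not the authors' implementation: a single shared Transformer layer is applied for a fixed number of steps, and at each step an input-dependent level signal produced by low-rank matrices conditions the hidden states. The names (depth, rank, level_down, level_up) and the exact way the signal is injected are illustrative assumptions.

```python
# Minimal sketch of a ring-like recurrent Transformer with low-rank,
# input-dependent level signals (assumptions noted above; not the paper's code).
import torch
import torch.nn as nn

class RingFormerSketch(nn.Module):
    def __init__(self, d_model=512, nhead=8, depth=6, rank=16):
        super().__init__()
        # One shared layer replaces a stack of `depth` distinct layers.
        self.shared_layer = nn.TransformerEncoderLayer(
            d_model, nhead, batch_first=True)
        self.depth = depth
        # Per-level low-rank projections: d_model -> rank -> d_model.
        self.level_down = nn.ModuleList(
            nn.Linear(d_model, rank, bias=False) for _ in range(depth))
        self.level_up = nn.ModuleList(
            nn.Linear(rank, d_model, bias=False) for _ in range(depth))

    def forward(self, x):
        h = x
        for t in range(self.depth):
            # Input-dependent level signal for step t (low-rank bottleneck).
            signal = self.level_up[t](self.level_down[t](h))
            # Re-apply the same layer in a circular fashion, conditioned on the signal.
            h = self.shared_layer(h + signal)
        return h

# Example: a batch of 2 sequences of length 10 with 512-dim embeddings.
out = RingFormerSketch()(torch.randn(2, 10, 512))
print(out.shape)  # torch.Size([2, 10, 512])
```

In this sketch only the low-rank level projections differ across steps, so the parameter count stays close to that of a single layer, which is the trade-off the abstract highlights.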
Anthology ID:
2025.findings-emnlp.1182
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2025
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
21675–21686
URL:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.1182/
DOI:
10.18653/v1/2025.findings-emnlp.1182
Cite (ACL):
Jaemu Heo, Eldor Fozilov, Hyunmin Song, and Taehwan Kim. 2025. RingFormer: Rethinking Recurrent Transformer with Adaptive Level Signals. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 21675–21686, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
RingFormer: Rethinking Recurrent Transformer with Adaptive Level Signals (Heo et al., Findings 2025)
PDF:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.1182.pdf
Checklist:
2025.findings-emnlp.1182.checklist.pdf