DrFrattn: Directly Learn Adaptive Policy from Attention for Simultaneous Machine Translation

Libo Zhao, Jing Li, Ziqian Zeng


Abstract
Simultaneous machine translation (SiMT) requires a robust read/write (R/W) policy to determine the optimal moments for translation, thereby balancing translation quality and latency. Choosing these moments well depends on accurately aligning source and target tokens, and the attention mechanism within translation models inherently provides such alignment information. Building on this observation, previous research has modified the attention mechanism's structure to exploit its alignment properties during training, using multi-task learning to derive the read/write policy. However, this multi-task learning approach may compromise the efficacy of the attention mechanism itself. This raises a natural question: why not learn the read/write policy directly from a well-trained attention mechanism? In this study, we propose DrFrattn, a method that directly learns adaptive policies from the attention mechanism. Experimental results across various benchmarks demonstrate that our approach achieves an improved balance between translation accuracy and latency.
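To make the core idea concrete, the sketch below shows one way cross-attention weights could inform a read/write decision. It is an illustrative assumption for exposition only, not DrFrattn's actual policy: the function name, the threshold heuristic, and the decision rule are all invented here.

    # Illustrative sketch only: a simple threshold heuristic that turns
    # cross-attention weights into read/write decisions. This is NOT the
    # algorithm proposed in the paper; every name and constant is assumed.
    import numpy as np

    def read_or_write(attn_weights: np.ndarray, tail_mass_threshold: float = 0.3) -> str:
        """Decide whether to READ another source token or WRITE a target token.

        attn_weights: normalized cross-attention weights over the source
            tokens read so far, for the target token about to be generated.
        tail_mass_threshold: if the attention mass on the most recently read
            source token exceeds this value, the model likely still depends
            on incoming context, so it should READ more input first.
        """
        assert np.isclose(attn_weights.sum(), 1.0), "weights must be normalized"
        if attn_weights[-1] > tail_mass_threshold:
            return "READ"   # alignment points at the newest token: wait for more input
        return "WRITE"      # alignment settled on earlier tokens: safe to translate

    # Example: attention concentrated on the second of four already-read tokens
    print(read_or_write(np.array([0.1, 0.6, 0.2, 0.1])))  # -> WRITE

The intuition this heuristic encodes matches the abstract's premise: when attention for the next target token aligns to source tokens already received, translation can proceed; when it piles up on the input frontier, the model should wait.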
Anthology ID:
2025.emnlp-main.1767
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
34881–34894
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1767/
Cite (ACL):
Libo Zhao, Jing Li, and Ziqian Zeng. 2025. DrFrattn: Directly Learn Adaptive Policy from Attention for Simultaneous Machine Translation. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 34881–34894, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
DrFrattn: Directly Learn Adaptive Policy from Attention for Simultaneous Machine Translation (Zhao et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1767.pdf
Checklist:
 2025.emnlp-main.1767.checklist.pdf