TranSFormer: Slow-Fast Transformer for Machine Translation

Bei Li, Yi Jing, Xu Tan, Zhen Xing, Tong Xiao, Jingbo Zhu


Abstract
Learning multiscale Transformer models has been shown to be a viable approach to augmenting machine translation systems. Prior research has primarily focused on treating subwords as the basic units when developing such systems. However, incorporating fine-grained character-level features into multiscale Transformers has not yet been explored. In this work, we present a slow-fast two-stream learning model, referred to as TranSFormer, which uses a “slow” branch to process subword sequences and a “fast” branch to process longer character sequences. The model is efficient because the fast branch is kept lightweight by reducing its model width, yet it still provides useful fine-grained features to the slow branch. Our TranSFormer shows consistent BLEU improvements (larger than 1 BLEU point) on several machine translation benchmarks.
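The abstract describes the two-stream design only at a high level. Below is a minimal, illustrative PyTorch sketch of one way such a slow-fast encoder could be wired up: a full-width subword branch, a narrow character branch, and a per-layer fusion that mean-pools character states onto their subwords. The class name, hyper-parameters, and the pooling-based fusion are assumptions made for illustration, not the paper's actual implementation.

```python
import torch
import torch.nn as nn

class SlowFastEncoder(nn.Module):
    """Illustrative two-stream encoder (not the paper's implementation):
    a full-width 'slow' branch over subwords and a narrow 'fast' branch over
    characters, fused by projecting pooled character features into the slow branch."""

    def __init__(self, subword_vocab, char_vocab,
                 d_slow=512, d_fast=128, n_layers=6, n_heads=8):
        # Placeholder hyper-parameters; the fast branch is lightweight because
        # its model width d_fast is much smaller than d_slow.
        super().__init__()
        self.slow_emb = nn.Embedding(subword_vocab, d_slow)
        self.fast_emb = nn.Embedding(char_vocab, d_fast)
        self.slow_layers = nn.ModuleList(
            nn.TransformerEncoderLayer(d_slow, n_heads, 4 * d_slow, batch_first=True)
            for _ in range(n_layers))
        self.fast_layers = nn.ModuleList(
            nn.TransformerEncoderLayer(d_fast, max(1, n_heads // 4), 4 * d_fast, batch_first=True)
            for _ in range(n_layers))
        # Project pooled character features into the subword (slow) width.
        self.fuse = nn.ModuleList(nn.Linear(d_fast, d_slow) for _ in range(n_layers))

    def forward(self, subword_ids, char_ids, char_to_subword):
        # char_to_subword: (batch, char_len) LongTensor mapping each character
        # position to the subword position it belongs to.
        slow = self.slow_emb(subword_ids)
        fast = self.fast_emb(char_ids)
        for slow_layer, fast_layer, fuse in zip(self.slow_layers, self.fast_layers, self.fuse):
            fast = fast_layer(fast)
            # Mean-pool character states onto their subwords (one simple fusion choice).
            pooled = torch.zeros(slow.size(0), slow.size(1), fast.size(-1), device=fast.device)
            counts = torch.zeros(slow.size(0), slow.size(1), 1, device=fast.device)
            pooled.scatter_add_(1, char_to_subword.unsqueeze(-1).expand_as(fast), fast)
            counts.scatter_add_(1, char_to_subword.unsqueeze(-1),
                                torch.ones_like(char_to_subword, dtype=fast.dtype).unsqueeze(-1))
            slow = slow_layer(slow + fuse(pooled / counts.clamp(min=1)))
        return slow
```

Under these assumptions, the extra cost of the fast branch scales with d_fast rather than d_slow, which is consistent with the abstract's claim that the character stream stays lightweight while feeding fine-grained features to the subword stream.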
Anthology ID:
2023.findings-acl.430
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
6883–6896
URL:
https://aclanthology.org/2023.findings-acl.430
DOI:
10.18653/v1/2023.findings-acl.430
Cite (ACL):
Bei Li, Yi Jing, Xu Tan, Zhen Xing, Tong Xiao, and Jingbo Zhu. 2023. TranSFormer: Slow-Fast Transformer for Machine Translation. In Findings of the Association for Computational Linguistics: ACL 2023, pages 6883–6896, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
TranSFormer: Slow-Fast Transformer for Machine Translation (Li et al., Findings 2023)
PDF:
https://preview.aclanthology.org/naacl-24-ws-corrections/2023.findings-acl.430.pdf