Abstract
Position encoding (PE), an essential part of self-attention networks (SANs), is used to preserve word order information for natural language processing tasks by generating fixed position indices for input sequences. However, in cross-lingual scenarios such as machine translation, the PEs of source and target sentences are modeled independently. Given the word order divergences between languages, modeling cross-lingual positional relationships could help SANs tackle this problem. In this paper, we augment SANs with cross-lingual position representations to model a bilingually aware latent structure for the input sentence. Specifically, we utilize bracketing transduction grammar (BTG)-based reordering information to encourage SANs to learn bilingual diagonal alignments. Experimental results on WMT'14 English⇒German, WAT'17 Japanese⇒English, and WMT'17 Chinese⇔English translation tasks demonstrate that our approach significantly and consistently improves translation quality over strong baselines. Extensive analyses confirm that the performance gains come from the cross-lingual information.
- Anthology ID:
- 2020.acl-main.153
- Volume:
- Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
- Month:
- July
- Year:
- 2020
- Address:
- Online
- Editors:
- Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel Tetreault
- Venue:
- ACL
- Publisher:
- Association for Computational Linguistics
- Pages:
- 1679–1685
- URL:
- https://aclanthology.org/2020.acl-main.153
- DOI:
- 10.18653/v1/2020.acl-main.153
- Cite (ACL):
- Liang Ding, Longyue Wang, and Dacheng Tao. 2020. Self-Attention with Cross-Lingual Position Representation. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 1679–1685, Online. Association for Computational Linguistics.
- Cite (Informal):
- Self-Attention with Cross-Lingual Position Representation (Ding et al., ACL 2020)
- PDF:
- https://aclanthology.org/2020.acl-main.153.pdf
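The core idea in the abstract, encoding each source token at the position it would occupy after reordering into target word order, can be illustrated with a minimal sketch. This is not the paper's actual model (which integrates the reordered positions inside self-attention); it only shows how standard sinusoidal position encodings could be computed over a hypothetical precomputed BTG-style reordering, here given directly as a permutation of position indices.

```python
import math

def sinusoidal_pe(pos, d_model):
    """Standard Transformer sinusoidal encoding for a single position."""
    return [
        math.sin(pos / 10000 ** (i / d_model)) if i % 2 == 0
        else math.cos(pos / 10000 ** ((i - 1) / d_model))
        for i in range(d_model)
    ]

def cross_lingual_pe(sent_len, reorder, d_model=8):
    """Encode each source token at its target-side (reordered) position.

    `reorder[i]` is the position token i would take after reordering
    into target word order. In the paper this ordering is derived from
    bracketing transduction grammar (BTG); here it is an assumed input.
    """
    return [sinusoidal_pe(reorder[i], d_model) for i in range(sent_len)]

# German "er hat das Buch gelesen" reordered toward English word order
# "er hat gelesen das Buch": token 2 ("das") moves to position 3, etc.
reorder = [0, 1, 3, 4, 2]
pes = cross_lingual_pe(5, reorder)
```

Each source token thus receives the position signal of its target-side slot, which is what would encourage the diagonal bilingual alignments the abstract describes.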