Character-Level Translation with Self-attention
Yingqiang Gao, Nikola I. Nikolov, Yuhuang Hu, Richard H.R. Hahnloser
Abstract
We explore the suitability of self-attention models for character-level neural machine translation. We test the standard transformer model, as well as a novel variant in which the encoder block combines information from nearby characters using convolutions. We perform extensive experiments on WMT and UN datasets, testing both bilingual and multilingual translation to English using up to three input languages (French, Spanish, and Chinese). Our transformer variant consistently outperforms the standard transformer at the character level and converges faster while learning more robust character-level alignments.
- Anthology ID: 2020.acl-main.145
- Volume: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
- Month: July
- Year: 2020
- Address: Online
- Editors: Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel Tetreault
- Venue: ACL
- Publisher: Association for Computational Linguistics
- Pages: 1591–1604
- URL: https://preview.aclanthology.org/build-pipeline-with-new-library/2020.acl-main.145/
- DOI: 10.18653/v1/2020.acl-main.145
- Cite (ACL): Yingqiang Gao, Nikola I. Nikolov, Yuhuang Hu, and Richard H.R. Hahnloser. 2020. Character-Level Translation with Self-attention. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 1591–1604, Online. Association for Computational Linguistics.
- Cite (Informal): Character-Level Translation with Self-attention (Gao et al., ACL 2020)
- PDF: https://preview.aclanthology.org/build-pipeline-with-new-library/2020.acl-main.145.pdf
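The abstract only summarizes the proposed encoder variant, in which each character position mixes information from nearby characters via convolutions before self-attention. The sketch below is a hypothetical PyTorch rendering of that idea, not the authors' released implementation: the module name, dimensions, kernel size, and the residual/normalization layout are all illustrative assumptions.

```python
# Hypothetical sketch (not the paper's code): a transformer encoder block whose
# input is first passed through a 1D convolution so that each character
# position aggregates its neighbours before the usual self-attention sublayer.
import torch
import torch.nn as nn


class ConvSelfAttentionEncoderBlock(nn.Module):
    def __init__(self, d_model=512, n_heads=8, kernel_size=3, d_ff=2048, dropout=0.1):
        super().__init__()
        # Length-preserving convolution over the character axis.
        self.conv = nn.Conv1d(d_model, d_model, kernel_size, padding=kernel_size // 2)
        self.self_attn = nn.MultiheadAttention(d_model, n_heads, dropout=dropout, batch_first=True)
        self.ff = nn.Sequential(nn.Linear(d_model, d_ff), nn.ReLU(), nn.Linear(d_ff, d_model))
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.norm3 = nn.LayerNorm(d_model)
        self.dropout = nn.Dropout(dropout)

    def forward(self, x, key_padding_mask=None):
        # x: (batch, seq_len, d_model) character embeddings (plus positional encoding).
        # Conv1d expects (batch, d_model, seq_len), so transpose around the convolution.
        c = self.conv(x.transpose(1, 2)).transpose(1, 2)
        x = self.norm1(x + self.dropout(c))
        a, _ = self.self_attn(x, x, x, key_padding_mask=key_padding_mask)
        x = self.norm2(x + self.dropout(a))
        return self.norm3(x + self.dropout(self.ff(x)))


if __name__ == "__main__":
    block = ConvSelfAttentionEncoderBlock()
    chars = torch.randn(2, 50, 512)  # batch of 2 sequences, 50 characters each
    print(block(chars).shape)        # torch.Size([2, 50, 512])
```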