Lattice-Based Transformer Encoder for Neural Machine Translation

Fengshun Xiao, Jiangtong Li, Hai Zhao, Rui Wang, Kehai Chen



Abstract
Neural machine translation (NMT) takes a deterministic sequence as its source representation. However, both word-level and subword-level segmentations offer multiple ways to split a source sequence, depending on the word segmentor or the subword vocabulary size. We hypothesize that this diversity in segmentations may affect NMT performance. To integrate different segmentations into the state-of-the-art NMT model, Transformer, we propose lattice-based encoders that explore effective word or subword representations automatically during training. We propose two methods: 1) lattice positional encoding and 2) lattice-aware self-attention. The two methods can be used together and are complementary to each other, further improving translation performance. Experimental results show that lattice-based encoders outperform the conventional Transformer encoder for both word-level and subword-level representations.
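To make the two named techniques concrete, below is a minimal, hypothetical sketch (not the authors' implementation, which is defined in the paper itself). It assumes a word lattice is given as token spans over the source characters, that lattice positional encoding indexes each token by its start offset in the lattice, and that lattice-aware self-attention masks out token pairs whose spans overlap, i.e. tokens from mutually exclusive segmentations. All names and the toy lattice are illustrative.

import numpy as np

# Toy lattice over a 4-character source "ABCD": two competing segmentations.
# Each token is (surface, start, end) with a [start, end) character span.
lattice = [
    ("AB", 0, 2), ("CD", 2, 4),   # segmentation 1
    ("A", 0, 1), ("BCD", 1, 4),   # segmentation 2
]

d_model = 8

def positional_encoding(pos, d_model):
    """Standard sinusoidal encoding, here indexed by the lattice start offset."""
    pe = np.zeros(d_model)
    for i in range(0, d_model, 2):
        pe[i] = np.sin(pos / 10000 ** (i / d_model))
        pe[i + 1] = np.cos(pos / 10000 ** (i / d_model))
    return pe

# Lattice positional encoding (assumed form): tokens that start at the same
# lattice position share the same positional vector.
pos_enc = np.stack([positional_encoding(start, d_model)
                    for _, start, _ in lattice])

# Lattice-aware self-attention mask (assumed form): True where attention is
# allowed. Attention is permitted between a token and itself and between
# non-overlapping tokens; overlapping tokens come from competing segmentations.
n = len(lattice)
mask = np.zeros((n, n), dtype=bool)
for i, (_, s_i, e_i) in enumerate(lattice):
    for j, (_, s_j, e_j) in enumerate(lattice):
        mask[i, j] = (i == j) or (e_i <= s_j) or (e_j <= s_i)

print(mask.astype(int))

In a full encoder, pos_enc would be added to the token embeddings and mask would be applied inside each self-attention layer; the paper's exact formulation of both components may differ from this sketch.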
Anthology ID:
P19-1298
Volume:
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2019
Address:
Florence, Italy
Editors:
Anna Korhonen, David Traum, Lluís Màrquez
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
3090–3097
URL:
https://aclanthology.org/P19-1298
DOI:
10.18653/v1/P19-1298
Cite (ACL):
Fengshun Xiao, Jiangtong Li, Hai Zhao, Rui Wang, and Kehai Chen. 2019. Lattice-Based Transformer Encoder for Neural Machine Translation. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 3090–3097, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Lattice-Based Transformer Encoder for Neural Machine Translation (Xiao et al., ACL 2019)
PDF:
https://preview.aclanthology.org/teach-a-man-to-fish/P19-1298.pdf