Passing Parser Uncertainty to the Transformer: Labeled Dependency Distributions for Neural Machine Translation

Dongqi Pu, Khalil Sima’an


Abstract
Existing syntax-enriched neural machine translation (NMT) models work either with the single most-likely unlabeled parse or with the n-best unlabeled parses produced by an external parser. Passing a single parse or the n-best parses to the NMT model risks propagating parse errors. Furthermore, unlabeled parses represent only syntactic groupings without their linguistically relevant categories. In this paper we explore the question: does passing both parser uncertainty and labeled syntactic knowledge to the Transformer improve its translation performance? This paper contributes a novel method for infusing the full labeled dependency distributions (LDD) of the source sentence’s dependency forest into the self-attention mechanism of the Transformer encoder. Experimental results on three language pairs demonstrate that the proposed approach outperforms both the vanilla Transformer and the single best-parse Transformer model across several evaluation metrics.
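As a rough illustration of the idea described above, the following PyTorch sketch shows one way per-arc labeled dependency distributions could be injected into encoder self-attention as an additive bias on the attention logits. The module name, tensor shapes, and the additive-bias combination are assumptions made for illustration only, not the paper's exact formulation.

# Hypothetical sketch: biasing self-attention with labeled dependency
# distributions (LDD). Shapes and the additive-bias scheme are assumptions.
import math
import torch
import torch.nn as nn


class LDDSelfAttention(nn.Module):
    def __init__(self, d_model: int, num_heads: int, num_labels: int):
        super().__init__()
        assert d_model % num_heads == 0
        self.num_heads = num_heads
        self.d_head = d_model // num_heads
        self.qkv = nn.Linear(d_model, 3 * d_model)
        self.out = nn.Linear(d_model, d_model)
        # Maps each arc's label distribution to one bias scalar per head.
        self.label_proj = nn.Linear(num_labels, num_heads)

    def forward(self, x: torch.Tensor, ldd: torch.Tensor) -> torch.Tensor:
        # x:   (batch, seq_len, d_model) token representations
        # ldd: (batch, seq_len, seq_len, num_labels) marginal probabilities
        #      of each labeled dependency arc, e.g. from a parse forest
        b, n, _ = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        q = q.view(b, n, self.num_heads, self.d_head).transpose(1, 2)
        k = k.view(b, n, self.num_heads, self.d_head).transpose(1, 2)
        v = v.view(b, n, self.num_heads, self.d_head).transpose(1, 2)

        scores = q @ k.transpose(-2, -1) / math.sqrt(self.d_head)  # (b, h, n, n)
        # Turn the label distribution of each arc into a per-head additive bias.
        bias = self.label_proj(ldd).permute(0, 3, 1, 2)            # (b, h, n, n)
        attn = torch.softmax(scores + bias, dim=-1)

        ctx = (attn @ v).transpose(1, 2).reshape(b, n, -1)
        return self.out(ctx)


if __name__ == "__main__":
    b, n, d, h, labels = 2, 7, 64, 8, 40
    layer = LDDSelfAttention(d, h, labels)
    x = torch.randn(b, n, d)
    ldd = torch.softmax(torch.randn(b, n, n, labels), dim=-1)  # toy distributions
    print(layer(x, ldd).shape)  # torch.Size([2, 7, 64])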
Anthology ID:
2022.eamt-1.7
Volume:
Proceedings of the 23rd Annual Conference of the European Association for Machine Translation
Month:
June
Year:
2022
Address:
Ghent, Belgium
Venue:
EAMT
Publisher:
European Association for Machine Translation
Pages:
41–50
URL:
https://aclanthology.org/2022.eamt-1.7
Cite (ACL):
Dongqi Pu and Khalil Sima’an. 2022. Passing Parser Uncertainty to the Transformer: Labeled Dependency Distributions for Neural Machine Translation. In Proceedings of the 23rd Annual Conference of the European Association for Machine Translation, pages 41–50, Ghent, Belgium. European Association for Machine Translation.
Cite (Informal):
Passing Parser Uncertainty to the Transformer: Labeled Dependency Distributions for Neural Machine Translation (Pu & Sima’an, EAMT 2022)
PDF:
https://preview.aclanthology.org/emnlp-22-attachments/2022.eamt-1.7.pdf