Modeling Concentrated Cross-Attention for Neural Machine Translation with Gaussian Mixture Model

Shaolei Zhang, Yang Feng


Abstract
Cross-attention is an important component of neural machine translation (NMT), and in previous methods it has always been realized by dot-product attention. However, dot-product attention only considers the pair-wise correlation between words, which leads to dispersion when dealing with long sentences and neglects source neighboring relationships. Inspired by linguistics, we argue that these issues stem from ignoring a type of cross-attention, called concentrated attention, which focuses on several central words and then spreads around them. In this work, we apply a Gaussian Mixture Model (GMM) to model concentrated attention in cross-attention. Experiments and analyses on three datasets show that the proposed method outperforms the baseline and brings significant improvements in alignment quality, N-gram accuracy, and long-sentence translation.
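To make the idea concrete, below is a minimal sketch (in PyTorch) of Gaussian-mixture cross-attention: each decoder query predicts the means, scales, and mixture weights of K Gaussian components over source positions, and the attention distribution is that mixture evaluated at every source position. The module and parameter names are illustrative assumptions, not the paper's exact formulation.

```python
# Sketch of Gaussian-mixture cross-attention (illustrative, not the authors' exact model).
import torch
import torch.nn as nn
import torch.nn.functional as F


class GMMCrossAttention(nn.Module):
    def __init__(self, d_model: int, n_components: int = 2):
        super().__init__()
        self.n_components = n_components
        # Predict (mean, scale, weight) for each Gaussian component from the query.
        self.to_params = nn.Linear(d_model, 3 * n_components)

    def forward(self, query: torch.Tensor, value: torch.Tensor) -> torch.Tensor:
        """
        query: (batch, tgt_len, d_model)  decoder states
        value: (batch, src_len, d_model)  encoder states
        returns: (batch, tgt_len, d_model) context vectors
        """
        src_len = value.size(1)

        params = self.to_params(query)                        # (B, T, 3K)
        mu_raw, sigma_raw, pi_raw = params.chunk(3, dim=-1)   # each (B, T, K)

        # Component means lie in [0, src_len), scales are positive, weights sum to 1.
        mu = torch.sigmoid(mu_raw) * src_len
        sigma = F.softplus(sigma_raw) + 1e-4
        pi = F.softmax(pi_raw, dim=-1)

        # Evaluate each Gaussian at every source position j = 0 .. src_len-1.
        pos = torch.arange(src_len, device=query.device).float().view(1, 1, 1, src_len)
        mu, sigma, pi = mu.unsqueeze(-1), sigma.unsqueeze(-1), pi.unsqueeze(-1)

        gauss = torch.exp(-0.5 * ((pos - mu) / sigma) ** 2)   # (B, T, K, S)
        attn = (pi * gauss).sum(dim=2)                        # mixture over components
        attn = attn / attn.sum(dim=-1, keepdim=True).clamp_min(1e-9)

        return torch.bmm(attn, value)                         # (B, T, d_model)
```

In a Transformer decoder, such a module would run alongside the standard dot-product cross-attention, with the two attention distributions (or context vectors) fused; how they are combined is a design choice detailed in the paper, not reproduced in this sketch.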
Anthology ID:
2021.findings-emnlp.121
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2021
Month:
November
Year:
2021
Address:
Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
Findings
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
1401–1411
URL:
https://aclanthology.org/2021.findings-emnlp.121
DOI:
10.18653/v1/2021.findings-emnlp.121
Cite (ACL):
Shaolei Zhang and Yang Feng. 2021. Modeling Concentrated Cross-Attention for Neural Machine Translation with Gaussian Mixture Model. In Findings of the Association for Computational Linguistics: EMNLP 2021, pages 1401–1411, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Modeling Concentrated Cross-Attention for Neural Machine Translation with Gaussian Mixture Model (Zhang & Feng, Findings 2021)
PDF:
https://preview.aclanthology.org/naacl24-info/2021.findings-emnlp.121.pdf
Video:
https://preview.aclanthology.org/naacl24-info/2021.findings-emnlp.121.mp4