Alleviating the Inequality of Attention Heads for Neural Machine Translation

Zewei Sun, Shujian Huang, Xinyu Dai, Jiajun Chen


Abstract
Recent studies show that the attention heads in Transformer are not equal. We relate this phenomenon to the imbalanced training of multi-head attention and the model's dependence on specific heads. To tackle this problem, we propose a simple masking method, HeadMask, in two specific variants. Experiments show that translation improvements are achieved on multiple language pairs. Subsequent empirical analyses also support our assumption and confirm the effectiveness of the method.
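The abstract describes HeadMask only at a high level, so the following is a minimal sketch of one plausible variant: randomly zeroing out a few heads' outputs during training so the model cannot over-rely on specific heads. The function name, tensor shapes, and the use of NumPy are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def random_head_mask(head_outputs, num_masked, rng=None):
    """Zero out `num_masked` randomly chosen attention heads (illustrative sketch).

    head_outputs: array of shape (num_heads, seq_len, d_head), the per-head
    attention outputs before they are concatenated and projected.
    """
    rng = np.random.default_rng() if rng is None else rng
    num_heads = head_outputs.shape[0]
    chosen = rng.choice(num_heads, size=num_masked, replace=False)
    mask = np.ones((num_heads, 1, 1))
    mask[chosen] = 0.0  # drop the selected heads for this training step
    return head_outputs * mask

# Toy usage: 8 heads, 5 tokens, 64-dim heads; mask 2 heads per step.
outputs = np.random.randn(8, 5, 64)
masked_outputs = random_head_mask(outputs, num_masked=2)
```

Masking would only be applied during training; at inference all heads are kept, analogous to dropout.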
Anthology ID:
2022.coling-1.466
Volume:
Proceedings of the 29th International Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
5246–5250
URL:
https://aclanthology.org/2022.coling-1.466
Cite (ACL):
Zewei Sun, Shujian Huang, Xinyu Dai, and Jiajun Chen. 2022. Alleviating the Inequality of Attention Heads for Neural Machine Translation. In Proceedings of the 29th International Conference on Computational Linguistics, pages 5246–5250, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
Alleviating the Inequality of Attention Heads for Neural Machine Translation (Sun et al., COLING 2022)
PDF:
https://preview.aclanthology.org/ingestion-script-update/2022.coling-1.466.pdf
Data
IWSLT 2015, WMT 2016, WMT 2016 News