A Dynamic Head Importance Computation Mechanism for Neural Machine Translation

Akshay Goindani, Manish Shrivastava


Abstract
Multiple parallel attention mechanisms that use multiple attention heads facilitate greater performance of the Transformer model for various applications, e.g., Neural Machine Translation (NMT) and text classification. In the multi-head attention mechanism, different heads attend to different parts of the input. However, multiple heads might attend to the same part of the input, making those heads redundant and leaving the model's resources under-utilized. One approach to avoid this is to prune the least important heads based on an importance score. In this work, we focus on designing a Dynamic Head Importance Computation Mechanism (DHICM) that dynamically calculates the importance of each head with respect to the input. Our insight is to pair an additional attention layer with the multi-head attention and use the outputs of the multi-head attention, together with the input, to compute an importance score for each head. We also add an extra loss function that prevents the model from assigning the same score to all heads, which helps identify the more important heads and improves performance. We analyzed the performance of DHICM for NMT on different language pairs. Experiments on multiple datasets show that DHICM outperforms the standard Transformer-based approach by a large margin, especially when less training data is available.
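As a concrete illustration, below is a minimal PyTorch sketch of the mechanism the abstract describes: a small scoring layer that compares the layer input with each head's output to produce per-head importance weights, plus an auxiliary penalty that discourages assigning the same score to every head. This is not the authors' released code; the names (DHICM, score_proj, uniformity_penalty) and the exact scoring function and loss are illustrative assumptions that may differ from the paper.

import torch
import torch.nn as nn
import torch.nn.functional as F

class DHICM(nn.Module):
    # Re-weights multi-head attention outputs by head-importance scores
    # computed dynamically from the layer input and each head's output.
    def __init__(self, num_heads: int, d_model: int):
        super().__init__()
        self.num_heads = num_heads
        self.d_head = d_model // num_heads
        # Assumed scoring layer: maps [input; head output] to a scalar logit.
        self.score_proj = nn.Linear(d_model + self.d_head, 1)

    def forward(self, x, head_outputs):
        # x:            (batch, seq_len, d_model)        -- layer input
        # head_outputs: (batch, seq_len, num_heads, d_head)
        x_expanded = x.unsqueeze(2).expand(-1, -1, self.num_heads, -1)
        logits = self.score_proj(torch.cat([x_expanded, head_outputs], dim=-1))
        alpha = F.softmax(logits, dim=2)  # importance distribution over heads
        weighted = alpha * head_outputs   # scale each head by its importance
        out = weighted.reshape(weighted.shape[0], weighted.shape[1], -1)
        return out, alpha.squeeze(-1)

def uniformity_penalty(alpha, eps: float = 1e-9):
    # Entropy of the head-importance distribution. Adding it (scaled by a
    # hyperparameter) to the NMT loss pushes alpha away from uniform, since
    # minimizing entropy is equivalent to maximizing KL(alpha || uniform).
    return -(alpha * (alpha + eps).log()).sum(dim=-1).mean()

if __name__ == "__main__":
    batch, seq_len, num_heads, d_model = 2, 5, 8, 512
    dhicm = DHICM(num_heads, d_model)
    x = torch.randn(batch, seq_len, d_model)
    heads = torch.randn(batch, seq_len, num_heads, d_model // num_heads)
    out, alpha = dhicm(x, heads)
    extra_loss = 0.1 * uniformity_penalty(alpha)  # 0.1 is a made-up weight
    print(out.shape, alpha.shape)  # (2, 5, 512) and (2, 5, 8)

The entropy penalty is one plausible realization of the extra loss the abstract mentions: a uniform alpha has maximal entropy, so minimizing entropy directly penalizes the failure mode where every head receives the same score.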
Anthology ID:
2021.ranlp-1.52
Volume:
Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2021)
Month:
September
Year:
2021
Address:
Held Online
Editors:
Ruslan Mitkov, Galia Angelova
Venue:
RANLP
Publisher:
INCOMA Ltd.
Pages:
454–462
URL:
https://aclanthology.org/2021.ranlp-1.52
Cite (ACL):
Akshay Goindani and Manish Shrivastava. 2021. A Dynamic Head Importance Computation Mechanism for Neural Machine Translation. In Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2021), pages 454–462, Held Online. INCOMA Ltd.
Cite (Informal):
A Dynamic Head Importance Computation Mechanism for Neural Machine Translation (Goindani & Shrivastava, RANLP 2021)
PDF:
https://preview.aclanthology.org/fix-dup-bibkey/2021.ranlp-1.52.pdf