Query-Key Normalization for Transformers

Alex Henry, Prudhvi Raj Dachapally, Shubham Shantaram Pawar, Yuxuan Chen


Abstract
Low-resource language translation is a challenging but socially valuable NLP task. Building on recent work adapting the Transformer’s normalization to this setting, we propose QKNorm, a normalization technique that modifies the attention mechanism to make the softmax function less prone to arbitrary saturation without sacrificing expressivity. Specifically, we apply ℓ2 normalization along the head dimension of each query and key matrix prior to multiplying them and then scale up by a learnable parameter instead of dividing by the square root of the embedding dimension. We show improvements averaging 0.928 BLEU over state-of-the-art bilingual benchmarks for 5 low-resource translation pairs from the TED Talks corpus and IWSLT’15.
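As a reading aid, the following is a minimal PyTorch sketch of the attention modification the abstract describes: normalize queries and keys before the dot product, then scale by a learnable parameter rather than dividing by the square root of the head dimension. The function name, tensor shapes, and the initial value of the learnable scale g are illustrative assumptions, not the authors' code; their reference implementation is in the CyndxAI/QKNorm repository linked below.

```python
import torch
import torch.nn.functional as F

def qk_norm_attention(q, k, v, g):
    """Attention with query-key normalization (illustrative sketch).

    q, k, v: tensors of shape (batch, heads, seq_len, head_dim).
    g: learnable scalar that replaces division by sqrt(head_dim).
    """
    # l2-normalize each query and key vector along the head dimension,
    # so q @ k^T becomes a matrix of cosine similarities in [-1, 1].
    q = F.normalize(q, p=2.0, dim=-1)
    k = F.normalize(k, p=2.0, dim=-1)
    # Scale up by the learnable parameter g instead of dividing by
    # sqrt(head_dim); bounded logits keep the softmax away from saturation.
    logits = g * torch.matmul(q, k.transpose(-2, -1))
    weights = torch.softmax(logits, dim=-1)
    return torch.matmul(weights, v)

# Usage sketch (shapes and the initial value of g are placeholders):
q = k = v = torch.randn(2, 8, 16, 64)
g = torch.nn.Parameter(torch.tensor(1.0))
out = qk_norm_attention(q, k, v, g)  # shape (2, 8, 16, 64)
```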
Anthology ID:
2020.findings-emnlp.379
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2020
Month:
November
Year:
2020
Address:
Online
Editors:
Trevor Cohn, Yulan He, Yang Liu
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
4246–4253
URL:
https://aclanthology.org/2020.findings-emnlp.379/
DOI:
10.18653/v1/2020.findings-emnlp.379
Cite (ACL):
Alex Henry, Prudhvi Raj Dachapally, Shubham Shantaram Pawar, and Yuxuan Chen. 2020. Query-Key Normalization for Transformers. In Findings of the Association for Computational Linguistics: EMNLP 2020, pages 4246–4253, Online. Association for Computational Linguistics.
Cite (Informal):
Query-Key Normalization for Transformers (Henry et al., Findings 2020)
PDF:
https://aclanthology.org/2020.findings-emnlp.379.pdf
Optional supplementary material:
2020.findings-emnlp.379.OptionalSupplementaryMaterial.zip
Code:
CyndxAI/QKNorm (+ additional community code)