Quasi Bidirectional Encoder Representations from Transformers for Word Sense Disambiguation

Michele Bevilacqua, Roberto Navigli


Abstract
While contextualized embeddings have produced performance breakthroughs in many Natural Language Processing (NLP) tasks, Word Sense Disambiguation (WSD) has not benefited from them yet. In this paper, we introduce QBERT, a Transformer-based architecture for contextualized embeddings which makes use of a co-attentive layer to produce more deeply bidirectional representations, better suited to the WSD task. As a result, we are able to train a WSD system that beats the state of the art on the concatenation of all evaluation datasets by over 3 points, also outperforming a comparable model using ELMo.
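The abstract only sketches the co-attentive layer at a high level. As a rough, hypothetical illustration of the general idea rather than the paper's actual implementation, the snippet below fuses the hidden states of a forward and a backward directional encoder through cross-attention so that each token representation mixes left and right context; all module names, dimensions, and design choices here are assumptions made for this example.

```python
import torch
import torch.nn as nn

class CoAttentiveFusion(nn.Module):
    """Illustrative fusion of forward and backward contextual states via
    cross-attention (one hypothetical reading of a 'co-attentive layer')."""

    def __init__(self, d_model: int, n_heads: int = 8):
        super().__init__()
        # Each direction attends over the other direction's states.
        self.fwd_attends_bwd = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.bwd_attends_fwd = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.proj = nn.Linear(2 * d_model, d_model)

    def forward(self, h_fwd: torch.Tensor, h_bwd: torch.Tensor) -> torch.Tensor:
        # h_fwd, h_bwd: (batch, seq_len, d_model) from two directional encoders.
        f, _ = self.fwd_attends_bwd(h_fwd, h_bwd, h_bwd)  # forward states query backward ones
        b, _ = self.bwd_attends_fwd(h_bwd, h_fwd, h_fwd)  # backward states query forward ones
        # Concatenate the two co-attended views and project back to d_model.
        return self.proj(torch.cat([f, b], dim=-1))

# Usage: fuse dummy directional states for a 5-token sentence.
fusion = CoAttentiveFusion(d_model=256)
h_fwd = torch.randn(1, 5, 256)
h_bwd = torch.randn(1, 5, 256)
mixed = fusion(h_fwd, h_bwd)  # (1, 5, 256) "quasi-bidirectional" token representations
```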
Anthology ID: R19-1015
Volume: Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2019)
Month: September
Year: 2019
Address: Varna, Bulgaria
Editors: Ruslan Mitkov, Galia Angelova
Venue: RANLP
Publisher: INCOMA Ltd.
Pages: 122–131
URL: https://aclanthology.org/R19-1015
DOI: 10.26615/978-954-452-056-4_015
Cite (ACL):
Michele Bevilacqua and Roberto Navigli. 2019. Quasi Bidirectional Encoder Representations from Transformers for Word Sense Disambiguation. In Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2019), pages 122–131, Varna, Bulgaria. INCOMA Ltd.
Cite (Informal):
Quasi Bidirectional Encoder Representations from Transformers for Word Sense Disambiguation (Bevilacqua & Navigli, RANLP 2019)
PDF: https://preview.aclanthology.org/fix-dup-bibkey/R19-1015.pdf