Abstract
Mainstream computational lexical semantics embraces the assumption that word senses can be represented as discrete items of a predefined inventory. In this paper we show this need not be the case, and propose a unified model, Generationary, that produces contextually appropriate definitions. In our model, we employ a novel span-based encoding scheme to fine-tune an English pre-trained encoder-decoder system to generate glosses. We show that, even though we drop the need to choose from a predefined sense inventory, our model can be employed effectively: not only does Generationary outperform previous approaches on the generative task of Definition Modeling in many settings, but it also matches or surpasses the state of the art on discriminative tasks such as Word Sense Disambiguation and Word-in-Context. Finally, we show that Generationary benefits from training on data from multiple inventories, with strong gains on various zero-shot benchmarks, including a novel dataset of definitions for free adjective-noun phrases. The software and reproduction materials are available at http://generationary.org.
- Anthology ID:
- 2020.emnlp-main.585
- Volume:
- Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
- Month:
- November
- Year:
- 2020
- Address:
- Online
- Editors:
- Bonnie Webber, Trevor Cohn, Yulan He, Yang Liu
- Venue:
- EMNLP
- Publisher:
- Association for Computational Linguistics
- Pages:
- 7207–7221
- URL:
- https://aclanthology.org/2020.emnlp-main.585
- DOI:
- 10.18653/v1/2020.emnlp-main.585
- Cite (ACL):
- Michele Bevilacqua, Marco Maru, and Roberto Navigli. 2020. Generationary or “How We Went beyond Word Sense Inventories and Learned to Gloss”. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 7207–7221, Online. Association for Computational Linguistics.
- Cite (Informal):
- Generationary or “How We Went beyond Word Sense Inventories and Learned to Gloss” (Bevilacqua et al., EMNLP 2020)
- PDF:
- https://preview.aclanthology.org/dois-2013-emnlp/2020.emnlp-main.585.pdf
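The abstract mentions a span-based encoding scheme: the target word or phrase is marked inside its context so the encoder-decoder knows which span to define. A minimal sketch of that idea is shown below; the `encode_span` helper and the `<define>`/`</define>` marker tokens are illustrative assumptions for this sketch, not necessarily the paper's exact implementation.

```python
def encode_span(tokens, start, end, open_tok="<define>", close_tok="</define>"):
    """Wrap the target span tokens[start:end] with marker tokens, so a
    fine-tuned encoder-decoder can condition its generated gloss on that
    span. Marker names here are illustrative assumptions."""
    marked = tokens[:start] + [open_tok] + tokens[start:end] + [close_tok] + tokens[end:]
    return " ".join(marked)

# Encode the word "bank" in context; the model would then generate a
# contextually appropriate definition from this input string.
context = "The bank raised interest rates".split()
encoded = encode_span(context, 1, 2)
# encoded == "The <define> bank </define> raised interest rates"
```

The same scheme extends naturally to multi-word targets (e.g. free adjective-noun phrases) by widening the `[start, end)` span, which is what makes a single model able to gloss units beyond any fixed sense inventory.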