WIKITIDE: A Wikipedia-Based Timestamped Definition Pairs Dataset

Hsuvas Borkakoty, Luis Espinosa Anke


Abstract
A fundamental challenge in the current NLP context, dominated by language models, comes from the inflexibility of current architectures to “learn” new information. While model-centric solutions like continual learning or parameter-efficient fine-tuning are available, the question remains of how to reliably identify changes in language or in the world. In this paper, we propose WikiTiDe, a dataset derived from pairs of timestamped definitions extracted from Wikipedia. We argue that such resources can help accelerate diachronic NLP, specifically for training models able to scan knowledge resources for core updates concerning a concept, an event, or a named entity. Our proposed end-to-end method is fully automatic and leverages a bootstrapping algorithm for gradually creating a high-quality dataset. Our results suggest that bootstrapping the seed version of WikiTiDe leads to better fine-tuned models. We also leverage fine-tuned models in a number of downstream tasks, showing promising results with respect to competitive baselines.
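The bootstrapping algorithm mentioned in the abstract can be pictured as a self-training loop: fine-tune on a small seed of labeled definition pairs, score unlabeled pairs, and fold confidently labeled ones back into the training set. The sketch below is purely illustrative; the function names, the confidence-thresholding scheme, and the stopping criterion are assumptions, not the paper's actual method.

```python
def bootstrap(seed, pool, train_fn, score_fn, threshold=0.9, max_rounds=3):
    """Grow a labeled set of timestamped definition pairs (illustrative only).

    seed     -- list of (pair, label) tuples used as the initial training set
    pool     -- list of unlabeled pairs (e.g. (old_definition, new_definition))
    train_fn -- callable: labeled data -> model
    score_fn -- callable: (model, pair) -> (predicted_label, confidence)
    """
    labeled = list(seed)
    for _ in range(max_rounds):
        model = train_fn(labeled)          # re-train on the current labeled set
        keep, rest = [], []
        for pair in pool:
            label, conf = score_fn(model, pair)
            (keep if conf >= threshold else rest).append((pair, label))
        if not keep:                       # no confident predictions left: stop
            break
        labeled.extend(keep)               # promote confident pairs to training data
        pool = [pair for pair, _ in rest]  # retry the rest next round
    return labeled
```

With toy `train_fn`/`score_fn` stand-ins, each round moves only the pairs the model is confident about into the labeled set, which is the "gradually creating a high-quality dataset" behavior the abstract describes.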
Anthology ID:
2023.ranlp-1.23
Volume:
Proceedings of the 14th International Conference on Recent Advances in Natural Language Processing
Month:
September
Year:
2023
Address:
Varna, Bulgaria
Editors:
Ruslan Mitkov, Galia Angelova
Venue:
RANLP
Publisher:
INCOMA Ltd., Shoumen, Bulgaria
Pages:
207–216
URL:
https://aclanthology.org/2023.ranlp-1.23
Cite (ACL):
Hsuvas Borkakoty and Luis Espinosa Anke. 2023. WIKITIDE: A Wikipedia-Based Timestamped Definition Pairs Dataset. In Proceedings of the 14th International Conference on Recent Advances in Natural Language Processing, pages 207–216, Varna, Bulgaria. INCOMA Ltd., Shoumen, Bulgaria.
Cite (Informal):
WIKITIDE: A Wikipedia-Based Timestamped Definition Pairs Dataset (Borkakoty & Espinosa Anke, RANLP 2023)
PDF:
https://preview.aclanthology.org/naacl24-info/2023.ranlp-1.23.pdf