The Finer They Get: Combining Fine-Tuned Models For Better Semantic Change Detection

Wei Zhou, Nina Tahmasebi, Haim Dubossarsky


Abstract
In this work we investigate the hypothesis that enriching contextualized models through fine-tuning tasks can improve their capacity to detect lexical semantic change (LSC). We include tasks aimed at capturing low-level linguistic information, such as part-of-speech tagging, as well as higher-level (semantic) information. Through a series of analyses we demonstrate that certain combinations of fine-tuning tasks, such as sentiment, syntactic information, and logical inference, bring large improvements over standard LSC models that are based only on standard language modeling. We test on the binary classification and ranking tasks of SemEval-2020 Task 1 and evaluate using both permutation tests and transfer-learning scenarios.
Anthology ID:
2023.nodalida-1.52
Volume:
Proceedings of the 24th Nordic Conference on Computational Linguistics (NoDaLiDa)
Month:
May
Year:
2023
Address:
Tórshavn, Faroe Islands
Editors:
Tanel Alumäe, Mark Fishel
Venue:
NoDaLiDa
Publisher:
University of Tartu Library
Pages:
518–528
URL:
https://aclanthology.org/2023.nodalida-1.52
Cite (ACL):
Wei Zhou, Nina Tahmasebi, and Haim Dubossarsky. 2023. The Finer They Get: Combining Fine-Tuned Models For Better Semantic Change Detection. In Proceedings of the 24th Nordic Conference on Computational Linguistics (NoDaLiDa), pages 518–528, Tórshavn, Faroe Islands. University of Tartu Library.
Cite (Informal):
The Finer They Get: Combining Fine-Tuned Models For Better Semantic Change Detection (Zhou et al., NoDaLiDa 2023)
PDF:
https://preview.aclanthology.org/emnlp22-frontmatter/2023.nodalida-1.52.pdf