Temporal Attention for Language Models

Guy D. Rosin, Kira Radinsky


Abstract
Pretrained language models based on the transformer architecture have shown great success in NLP. Textual training data often comes from the web and is thus tagged with time-specific information, but most language models ignore this information. They are trained on the textual data alone, limiting their ability to generalize temporally. In this work, we extend the key component of the transformer architecture, i.e., the self-attention mechanism, and propose temporal attention, a time-aware self-attention mechanism. Temporal attention can be applied to any transformer model and requires the input texts to be accompanied by their relevant time points. This mechanism allows the transformer to capture this temporal information and create time-specific contextualized word representations. We leverage these representations for the task of semantic change detection; we apply our proposed mechanism to BERT and experiment on three datasets in different languages (English, German, and Latin) that also vary in time, size, and genre. Our proposed model achieves state-of-the-art results on all the datasets.
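The idea of a time-aware self-attention can be sketched as standard scaled dot-product attention whose scores are modulated by the similarity of learned time embeddings. The sketch below is a simplified, single-head illustration of that general idea, not the paper's exact formulation; all names (`Wq`, `Wk`, `Wv`, `Wt`) and the specific elementwise modulation are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def temporal_attention(X, t_emb, Wq, Wk, Wv, Wt):
    """Simplified single-head temporal attention (illustrative sketch).

    X:     (n, d) token embeddings.
    t_emb: (n, d) time embeddings, one per token, derived from the
           time point attached to the input text.
    """
    d = Wq.shape[1]
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    # Project the time embeddings and compute their pairwise similarity.
    T = t_emb @ Wt
    # Modulate the attention scores elementwise by time similarity,
    # so attention becomes sensitive to when the text was written.
    scores = (Q @ K.T) * (T @ T.T) / np.sqrt(d)
    return softmax(scores) @ V
```

Because the time term enters only through the score matrix, the output keeps the usual shape `(n, d)`, which is what lets a mechanism like this drop into an existing transformer layer such as BERT's.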
Anthology ID:
2022.findings-naacl.112
Volume:
Findings of the Association for Computational Linguistics: NAACL 2022
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1498–1508
URL:
https://aclanthology.org/2022.findings-naacl.112
DOI:
10.18653/v1/2022.findings-naacl.112
Cite (ACL):
Guy D. Rosin and Kira Radinsky. 2022. Temporal Attention for Language Models. In Findings of the Association for Computational Linguistics: NAACL 2022, pages 1498–1508, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
Temporal Attention for Language Models (Rosin & Radinsky, Findings 2022)
PDF:
https://preview.aclanthology.org/nschneid-patch-4/2022.findings-naacl.112.pdf
Software:
2022.findings-naacl.112.software.tgz
Video:
https://preview.aclanthology.org/nschneid-patch-4/2022.findings-naacl.112.mp4
Code:
guyrosin/temporal_attention