Neural Temporality Adaptation for Document Classification: Diachronic Word Embeddings and Domain Adaptation Models

Xiaolei Huang, Michael J. Paul


Abstract
Language usage can change across periods of time, but document classifier models are usually trained and tested on corpora spanning multiple years without considering temporal variation. This paper describes two complementary ways to adapt classifiers to shifts across time. First, we show that diachronic word embeddings, which were originally developed to study language change, can also improve document classification, and we present a simple method for constructing this type of embedding. Second, we propose a time-driven neural classification model inspired by methods for domain adaptation. Experiments on six corpora show how these methods can make classifiers more robust over time.
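The abstract mentions diachronic word embeddings, which are typically built by training separate embeddings per time period and mapping them into a shared space. As a hedged illustration (not necessarily the construction used in this paper), the standard alignment step is an orthogonal Procrustes rotation between two periods' embedding matrices over a shared vocabulary; the toy data below is invented for demonstration:

```python
import numpy as np

def align_embeddings(source, target):
    """Orthogonal Procrustes: find rotation W minimizing ||source @ W - target||_F,
    mapping one period's word vectors into another period's space."""
    # SVD of the cross-covariance of the two embedding matrices
    u, _, vt = np.linalg.svd(source.T @ target)
    w = u @ vt  # optimal orthogonal map
    return source @ w

# Toy example: the "late" period is an exact rotation of the "early" period,
# so alignment should recover it (real diachronic data is only approximately alignable).
rng = np.random.default_rng(0)
early = rng.normal(size=(100, 50))                 # 100 words, 50-dim vectors
rotation, _ = np.linalg.qr(rng.normal(size=(50, 50)))
late = early @ rotation
aligned = align_embeddings(early, late)
print(np.allclose(aligned, late, atol=1e-6))
```

In practice the aligned per-period vectors can then be concatenated or averaged to give a classifier temporally robust input features.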
Anthology ID:
P19-1403
Volume:
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2019
Address:
Florence, Italy
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
4113–4123
URL:
https://aclanthology.org/P19-1403
DOI:
10.18653/v1/P19-1403
Bibkey:
Cite (ACL):
Xiaolei Huang and Michael J. Paul. 2019. Neural Temporality Adaptation for Document Classification: Diachronic Word Embeddings and Domain Adaptation Models. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 4113–4123, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Neural Temporality Adaptation for Document Classification: Diachronic Word Embeddings and Domain Adaptation Models (Huang & Paul, ACL 2019)
PDF:
https://preview.aclanthology.org/ingestion-script-update/P19-1403.pdf
Code
 xiaoleihuang/Neural_Temporality_Adaptation