The Importance of Context in Very Low Resource Language Modeling

Lukas Edman, Antonio Toral, Gertjan van Noord


Abstract
This paper investigates very low resource language model pretraining, when fewer than 100 thousand sentences are available. We find that, in very low-resource scenarios, statistical n-gram language models outperform state-of-the-art neural models. Our experiments show that this is mainly due to the focus of the former on a local context. As such, we introduce three methods to improve a neural model’s performance in the low-resource setting, finding that limiting the model’s self-attention is the most effective one, improving results on downstream tasks such as NLI and POS tagging by up to 5% for the languages we test on: English, Hindi, and Turkish.
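To illustrate the idea of limiting self-attention to a local context, here is a minimal sketch of windowed self-attention. The window size, function names, and shapes are illustrative assumptions, not details taken from the paper.

```python
# Sketch: restrict self-attention to a local window around each token.
# All names and the window size below are illustrative, not from the paper.
import torch
import torch.nn.functional as F


def local_attention_mask(seq_len: int, window: int) -> torch.Tensor:
    """Boolean mask that is True where a query may attend to a key,
    i.e. only within +/- `window` positions of itself."""
    positions = torch.arange(seq_len)
    distance = (positions[None, :] - positions[:, None]).abs()
    return distance <= window


def windowed_self_attention(q, k, v, window: int):
    """Scaled dot-product self-attention restricted to a local window.
    q, k, v: tensors of shape (batch, seq_len, dim)."""
    d = q.size(-1)
    scores = q @ k.transpose(-2, -1) / d ** 0.5        # (batch, seq, seq)
    mask = local_attention_mask(q.size(1), window).to(q.device)
    scores = scores.masked_fill(~mask, float("-inf"))  # block distant tokens
    return F.softmax(scores, dim=-1) @ v


# Example: one sequence of 8 tokens, 16-dim embeddings, window of 2 tokens.
x = torch.randn(1, 8, 16)
out = windowed_self_attention(x, x, x, window=2)
print(out.shape)  # torch.Size([1, 8, 16])
```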
Anthology ID:
2021.icon-main.12
Volume:
Proceedings of the 18th International Conference on Natural Language Processing (ICON)
Month:
December
Year:
2021
Address:
National Institute of Technology Silchar, Silchar, India
Editors:
Sivaji Bandyopadhyay, Sobha Lalitha Devi, Pushpak Bhattacharyya
Venue:
ICON
Publisher:
NLP Association of India (NLPAI)
Pages:
86–92
URL:
https://aclanthology.org/2021.icon-main.12
Cite (ACL):
Lukas Edman, Antonio Toral, and Gertjan van Noord. 2021. The Importance of Context in Very Low Resource Language Modeling. In Proceedings of the 18th International Conference on Natural Language Processing (ICON), pages 86–92, National Institute of Technology Silchar, Silchar, India. NLP Association of India (NLPAI).
Cite (Informal):
The Importance of Context in Very Low Resource Language Modeling (Edman et al., ICON 2021)
PDF:
https://preview.aclanthology.org/naacl-24-ws-corrections/2021.icon-main.12.pdf