Probing for Multilingual Numerical Understanding in Transformer-Based Language Models

Devin Johnson, Denise Mak, Andrew Barker, Lexi Loessberg-Zahl


Abstract
Natural language numbers are an example of compositional structure: larger numbers are built through operations on smaller numbers. Given that compositional reasoning is key to natural language understanding, we propose novel multilingual probing tasks, tested on DistilBERT, XLM, and BERT, to look for evidence of compositional reasoning over numerical data in various natural language number systems. Using both grammaticality judgment and value comparison classification tasks in English, Japanese, Danish, and French, we find evidence that the information encoded in these pretrained models' embeddings is sufficient for grammaticality judgments but generally not for value comparisons. We analyze possible reasons for this and discuss how our tasks could be extended in further studies.
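To make the probing setup concrete, below is a minimal sketch of a linear probe over frozen embeddings in the spirit of the paper's value comparison task. The checkpoint name, mean-pooling strategy, pair construction, and labels are illustrative assumptions for this sketch, not the authors' actual pipeline or datasets (see the linked repository for those).

import torch
from transformers import AutoTokenizer, AutoModel
from sklearn.linear_model import LogisticRegression

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-multilingual-cased")
model = AutoModel.from_pretrained("distilbert-base-multilingual-cased")
model.eval()

def embed(text):
    # Mean-pool the final hidden layer into a fixed-size vector.
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # (1, seq_len, dim)
    return hidden.mean(dim=1).squeeze(0)

# Hypothetical value comparison pairs: label 1 if the first number is larger.
pairs = [
    ("twenty three", "seventeen", 1),
    ("forty one", "ninety", 0),
    ("three hundred", "twelve", 1),
    ("sixty", "two hundred five", 0),
]
X = [torch.cat([embed(a), embed(b)]).numpy() for a, b, _ in pairs]
y = [label for _, _, label in pairs]

# Train a linear probe on the frozen embeddings; if the relevant information
# is encoded, even this simple classifier should separate the classes.
probe = LogisticRegression(max_iter=1000).fit(X, y)
print("train accuracy:", probe.score(X, y))

On the paper's account, a probe like this would reach high held-out accuracy for grammaticality judgments (built the same way, with well-formed vs. ill-formed number phrases) but near-chance accuracy for value comparison.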
Anthology ID: 2020.blackboxnlp-1.18
Volume: Proceedings of the Third BlackboxNLP Workshop on Analyzing and Interpreting Neural Networks for NLP
Month: November
Year: 2020
Address: Online
Venue: BlackboxNLP
Publisher: Association for Computational Linguistics
Pages: 184–192
URL: https://aclanthology.org/2020.blackboxnlp-1.18
DOI: 10.18653/v1/2020.blackboxnlp-1.18
Cite (ACL): Devin Johnson, Denise Mak, Andrew Barker, and Lexi Loessberg-Zahl. 2020. Probing for Multilingual Numerical Understanding in Transformer-Based Language Models. In Proceedings of the Third BlackboxNLP Workshop on Analyzing and Interpreting Neural Networks for NLP, pages 184–192, Online. Association for Computational Linguistics.
Cite (Informal): Probing for Multilingual Numerical Understanding in Transformer-Based Language Models (Johnson et al., BlackboxNLP 2020)
PDF: https://preview.aclanthology.org/ingestion-script-update/2020.blackboxnlp-1.18.pdf
Code: dj1121/tlm_num_probe