Abstract
Large-scale pre-trained neural network models such as BERT have recently achieved state-of-the-art results across natural language processing, and recent work has begun probing the linguistic capacities of these models. However, no work has focused on these models' ability to generalize those capacities to novel words. Humans exhibit this type of generalization, and it is intimately tied to morphology: in many cases, humans can identify the appropriate inflection of a novel word in context. This morphological capacity has not previously been tested in BERT models, and it is especially important for morphologically rich languages, which remain under-studied in the literature on BERT's linguistic capacities. In this work, we investigate this question by testing whether monolingual and multilingual BERT models can agree in number with novel plural words in English, French, German, Spanish, and Dutch. We find that many models cannot reliably determine the plurality of novel words, suggesting potential deficiencies in the morphological capacities of BERT models.
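As a rough illustration of the kind of agreement probe described in the abstract, the sketch below uses the Hugging Face fill-mask pipeline to compare the probabilities of a singular versus a plural verb following a nonce noun. The model name, the nonce word "blickets", and the template sentence are illustrative assumptions, not the paper's actual stimuli or protocol.

```python
# Minimal sketch of a number-agreement probe with a novel ("wug") noun.
# Assumes the transformers library; "blickets" and the template are made up.
from transformers import pipeline

unmasker = pipeline("fill-mask", model="bert-base-uncased")

sentence = "The blickets [MASK] sitting on the table."

# Restrict predictions to the two verb forms of interest.
for pred in unmasker(sentence, targets=["is", "are"]):
    print(f'{pred["token_str"]:>4}  p={pred["score"]:.4f}')

# If the model generalizes plural morphology to the novel noun,
# "are" should receive a higher probability than "is".
```

The same comparison can be repeated with monolingual and multilingual checkpoints (e.g. the language-specific BERT models mentioned in the abstract) to see whether agreement with novel plurals holds across languages.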
- Anthology ID: 2020.blackboxnlp-1.31
- Volume: Proceedings of the Third BlackboxNLP Workshop on Analyzing and Interpreting Neural Networks for NLP
- Month: November
- Year: 2020
- Address: Online
- Editors: Afra Alishahi, Yonatan Belinkov, Grzegorz Chrupała, Dieuwke Hupkes, Yuval Pinter, Hassan Sajjad
- Venue: BlackboxNLP
- Publisher: Association for Computational Linguistics
- Pages: 333–341
- URL: https://aclanthology.org/2020.blackboxnlp-1.31
- DOI: 10.18653/v1/2020.blackboxnlp-1.31
- Cite (ACL): Coleman Haley. 2020. This is a BERT. Now there are several of them. Can they generalize to novel words? In Proceedings of the Third BlackboxNLP Workshop on Analyzing and Interpreting Neural Networks for NLP, pages 333–341, Online. Association for Computational Linguistics.
- Cite (Informal): This is a BERT. Now there are several of them. Can they generalize to novel words? (Haley, BlackboxNLP 2020)
- PDF: https://preview.aclanthology.org/nschneid-patch-3/2020.blackboxnlp-1.31.pdf