Abstract
Transformer-based models are now predominant in NLP. They outperform approaches based on static models in many respects. This success has in turn prompted research that reveals a number of biases in the language models generated by transformers. In this paper, we draw on this research on biases to investigate to what extent transformer-based language models allow for extracting knowledge about object relations (X occurs in Y; X consists of Z; action A involves using X). To this end, we compare contextualized models with their static counterparts, basing the comparison on a number of similarity measures and classifiers. Our results are threefold: Firstly, we show that the models, combined with the different similarity measures, differ greatly in the amount of knowledge they allow to be extracted. Secondly, our results suggest that similarity measures perform much worse than classifier-based approaches. Thirdly, we show that, surprisingly, static models perform almost as well as contextualized models – in some cases even better.
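The paper's full setup (several model families, multiple similarity measures, and classifier probes) goes well beyond the abstract; the following is a minimal sketch of the underlying idea only, assuming the Hugging Face `transformers` and `gensim` libraries. The model names, the carrier-sentence template, and the subword mean-pooling strategy are illustrative assumptions, not the authors' configuration.

```python
import torch
import gensim.downloader as api
from transformers import AutoTokenizer, AutoModel

# --- Static baseline: one fixed GloVe vector per word ---
glove = api.load("glove-wiki-gigaword-100")  # downloads on first use

def static_sim(w1: str, w2: str) -> float:
    """Cosine similarity between the static vectors of two words."""
    return float(glove.similarity(w1, w2))

# --- Contextualized model: BERT, probed through a carrier sentence ---
tok = AutoTokenizer.from_pretrained("bert-base-uncased")
bert = AutoModel.from_pretrained("bert-base-uncased")
bert.eval()

def bert_word_vec(word: str, template: str = "This is a {}.") -> torch.Tensor:
    """Mean-pool the subword vectors of `word` inside a carrier sentence."""
    enc = tok(template.format(word), return_tensors="pt")
    piece_ids = tok(word, add_special_tokens=False)["input_ids"]
    ids = enc["input_ids"][0].tolist()
    # Locate the word's subword span within the encoded sentence.
    start = next(i for i in range(len(ids))
                 if ids[i:i + len(piece_ids)] == piece_ids)
    with torch.no_grad():
        hidden = bert(**enc).last_hidden_state[0]
    return hidden[start:start + len(piece_ids)].mean(dim=0)

def contextual_sim(w1: str, w2: str) -> float:
    """Cosine similarity between contextualized word vectors."""
    return float(torch.cosine_similarity(
        bert_word_vec(w1), bert_word_vec(w2), dim=0))

# Does either model family "know" where toothbrushes belong?
for loc in ("bathroom", "kitchen"):
    print(f"toothbrush/{loc}: "
          f"static={static_sim('toothbrush', loc):.3f}  "
          f"contextual={contextual_sim('toothbrush', loc):.3f}")
```

The classifier-based approach the abstract refers to would instead train a supervised probe (e.g., logistic regression) on such vectors against relation labels; the similarity-only probe above corresponds to the weaker of the two strategies the abstract compares.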
- Anthology ID: 2022.naacl-main.425
- Volume: Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
- Month: July
- Year: 2022
- Address: Seattle, United States
- Venue: NAACL
- Publisher: Association for Computational Linguistics
- Pages: 5791–5807
- URL: https://aclanthology.org/2022.naacl-main.425
- DOI: 10.18653/v1/2022.naacl-main.425
- Cite (ACL): Alexander Henlein and Alexander Mehler. 2022. What do Toothbrushes do in the Kitchen? How Transformers Think our World is Structured. In Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 5791–5807, Seattle, United States. Association for Computational Linguistics.
- Cite (Informal): What do Toothbrushes do in the Kitchen? How Transformers Think our World is Structured (Henlein & Mehler, NAACL 2022)
- PDF: https://aclanthology.org/2022.naacl-main.425.pdf