Abstract
Sentence encoders map sentences to real-valued vectors for use in downstream applications. To peek into these representations (e.g., to increase interpretability of their results), probing tasks have been designed which query them for linguistic knowledge. However, designing probing tasks for lesser-resourced languages is tricky, because these often lack the large-scale annotated data or (high-quality) dependency parsers that are a prerequisite of probing task design in English. To investigate how to probe sentence embeddings in such cases, we examine the sensitivity of probing task results to structural design choices, conducting the first large-scale study of this kind. We show that design choices such as the size of the annotated probing dataset and the type of classifier used for evaluation do (sometimes substantially) influence probing outcomes. We then probe embeddings in a multilingual setup with design choices that lie in a ‘stable region’, as identified for English, and find that results on English do not transfer to other languages. Fairer and more comprehensive sentence-level probing evaluation should thus be carried out on multiple languages in the future.
- Anthology ID:
- 2020.conll-1.8
- Volume:
- Proceedings of the 24th Conference on Computational Natural Language Learning
- Month:
- November
- Year:
- 2020
- Address:
- Online
- Venue:
- CoNLL
- SIG:
- SIGNLL
- Publisher:
- Association for Computational Linguistics
- Pages:
- 108–118
- URL:
- https://aclanthology.org/2020.conll-1.8
- DOI:
- 10.18653/v1/2020.conll-1.8
- Cite (ACL):
- Steffen Eger, Johannes Daxenberger, and Iryna Gurevych. 2020. How to Probe Sentence Embeddings in Low-Resource Languages: On Structural Design Choices for Probing Task Evaluation. In Proceedings of the 24th Conference on Computational Natural Language Learning, pages 108–118, Online. Association for Computational Linguistics.
- Cite (Informal):
- How to Probe Sentence Embeddings in Low-Resource Languages: On Structural Design Choices for Probing Task Evaluation (Eger et al., CoNLL 2020)
- PDF:
- https://preview.aclanthology.org/author-url/2020.conll-1.8.pdf
- Code
- UKPLab/conll2020-multilingual-sentence-probing
- Data
- SentEval, Universal Dependencies
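As context for the setup the abstract describes, a minimal sketch of a sentence-level probing task follows. This is a hypothetical illustration, not the paper's released code (see the UKPLab repository for that): a lightweight classifier is trained on frozen sentence embeddings to predict a linguistic label, and the two design choices the paper studies, probing dataset size and classifier type, appear as explicit parameters. Random vectors stand in for real embeddings, and the binary label is a synthetic stand-in for a linguistic property.

```python
# Hypothetical probing-task sketch (assumed setup, not the paper's implementation).
# Frozen sentence embeddings -> probe classifier -> linguistic label.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n, dim = 1000, 64                       # probing-set size is one design choice
X = rng.normal(size=(n, dim))           # stand-in for frozen sentence embeddings
y = (X[:, 0] > 0).astype(int)           # stand-in for a linguistic property label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# Classifier type is the other design choice varied here.
probes = {
    "logreg": LogisticRegression(max_iter=1000),
    "mlp": MLPClassifier(hidden_layer_sizes=(50,), max_iter=500, random_state=0),
}
results = {}
for name, clf in probes.items():
    results[name] = clf.fit(X_tr, y_tr).score(X_te, y_te)
    print(f"{name} probing accuracy: {results[name]:.2f}")
```

In a real probing study the embeddings would come from the encoder under analysis and the labels from annotated data (e.g., SentEval tasks or Universal Dependencies treebanks); the point of the sketch is that swapping the probe classifier or shrinking `n` can change the measured accuracy even though the embeddings are fixed.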