MALAMUTE: A Multilingual, Highly-granular, Template-free, Education-based Probing Dataset
Sagi Shaier, George Arthur Baker, Chiranthan Sridhar, Lawrence Hunter, Katharina Von Der Wense
Abstract
Language models (LMs) have excelled in various broad domains. However, to ensure their safe and effective integration into real-world educational settings, they must demonstrate proficiency in specific, granular areas of knowledge. Existing cloze-style benchmarks, commonly used to evaluate LMs’ knowledge, have three major limitations. They: 1) do not cover the educational domain; 2) typically focus on low-complexity, generic knowledge or broad domains, which do not adequately assess the models’ knowledge in specific subjects; and 3) often rely on templates that can bias model predictions. Here, we introduce MALAMUTE, a multilingual, template-free, and highly granular probing dataset comprising expert-written, peer-reviewed probes from 71 university-level textbooks across three languages (English, Spanish, and Polish). MALAMUTE is the first education-based cloze-style dataset. It covers eight domains, each with up to 14 subdomains, further broken down into concepts and concept-based prompts, totaling 33,361 university curriculum concepts and 116,887 prompts. MALAMUTE’s fine granularity, educational focus, and inclusion of both sentence-level and paragraph-level prompts make it an ideal tool for evaluating LMs’ course-related knowledge. Our evaluation of masked and causal LMs on MALAMUTE shows that despite overall proficiency, they have significant gaps in knowledge when examined closely on specific subjects, hindering their safe use in classrooms and underscoring the need for further development.
- Anthology ID:
- 2025.findings-acl.209
- Volume:
- Findings of the Association for Computational Linguistics: ACL 2025
- Month:
- July
- Year:
- 2025
- Address:
- Vienna, Austria
- Editors:
- Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
- Venue:
- Findings
- Publisher:
- Association for Computational Linguistics
- Pages:
- 4051–4069
- URL:
- https://preview.aclanthology.org/corrections-2025-08/2025.findings-acl.209/
- DOI:
- 10.18653/v1/2025.findings-acl.209
- Cite (ACL):
- Sagi Shaier, George Arthur Baker, Chiranthan Sridhar, Lawrence Hunter, and Katharina Von Der Wense. 2025. MALAMUTE: A Multilingual, Highly-granular, Template-free, Education-based Probing Dataset. In Findings of the Association for Computational Linguistics: ACL 2025, pages 4051–4069, Vienna, Austria. Association for Computational Linguistics.
- Cite (Informal):
- MALAMUTE: A Multilingual, Highly-granular, Template-free, Education-based Probing Dataset (Shaier et al., Findings 2025)
- PDF:
- https://preview.aclanthology.org/corrections-2025-08/2025.findings-acl.209.pdf