Anna Karenina Strikes Again: Pre-Trained LLM Embeddings May Favor High-Performing Learners
Abigail Gurin Schleifer, Beata Beigman Klebanov, Moriah Ariely, Giora Alexandron
Abstract
Unsupervised clustering of student responses to open-ended questions into behavioral and cognitive profiles using pre-trained LLM embeddings is an emerging technique, but little is known about how well it captures pedagogically meaningful information. We investigate this in the context of student responses to open-ended questions in biology, which were previously analyzed and clustered by experts into theory-driven Knowledge Profiles (KPs). Comparing these KPs to ones discovered by purely data-driven clustering techniques, we report poor discoverability of most KPs, except for the ones including the correct answers. We trace this ‘discoverability bias’ to the representations of KPs in the pre-trained LLM embedding space.
- Anthology ID:
- 2024.bea-1.32
- Original:
- 2024.bea-1.32v1
- Version 2:
- 2024.bea-1.32v2
- Volume:
- Proceedings of the 19th Workshop on Innovative Use of NLP for Building Educational Applications (BEA 2024)
- Month:
- June
- Year:
- 2024
- Address:
- Mexico City, Mexico
- Editors:
- Ekaterina Kochmar, Marie Bexte, Jill Burstein, Andrea Horbach, Ronja Laarmann-Quante, Anaïs Tack, Victoria Yaneva, Zheng Yuan
- Venue:
- BEA
- SIG:
- SIGEDU
- Publisher:
- Association for Computational Linguistics
- Note:
- Pages:
- 391–402
- Language:
- URL:
- https://aclanthology.org/2024.bea-1.32
- DOI:
- Cite (ACL):
- Abigail Gurin Schleifer, Beata Beigman Klebanov, Moriah Ariely, and Giora Alexandron. 2024. Anna Karenina Strikes Again: Pre-Trained LLM Embeddings May Favor High-Performing Learners. In Proceedings of the 19th Workshop on Innovative Use of NLP for Building Educational Applications (BEA 2024), pages 391–402, Mexico City, Mexico. Association for Computational Linguistics.
- Cite (Informal):
- Anna Karenina Strikes Again: Pre-Trained LLM Embeddings May Favor High-Performing Learners (Gurin Schleifer et al., BEA 2024)
- PDF:
- https://preview.aclanthology.org/landing_page/2024.bea-1.32.pdf