Increasing The Performance of Cognitively Inspired Data-Efficient Language Models via Implicit Structure Building

Omar Momen, David Arps, Laura Kallmeyer


Anthology ID:
2023.conll-babylm.29
Volume:
Proceedings of the BabyLM Challenge at the 27th Conference on Computational Natural Language Learning
Month:
December
Year:
2023
Address:
Singapore
Editors:
Alex Warstadt, Aaron Mueller, Leshem Choshen, Ethan Wilcox, Chengxu Zhuang, Juan Ciro, Rafael Mosquera, Bhargavi Paranjabe, Adina Williams, Tal Linzen, Ryan Cotterell
Venues:
CoNLL | BabyLM | WS
Publisher:
Association for Computational Linguistics
Pages:
327–338
URL:
https://preview.aclanthology.org/Ingest-2025-COMPUTEL/2023.conll-babylm.29/
DOI:
10.18653/v1/2023.conll-babylm.29
Cite (ACL):
Omar Momen, David Arps, and Laura Kallmeyer. 2023. Increasing The Performance of Cognitively Inspired Data-Efficient Language Models via Implicit Structure Building. In Proceedings of the BabyLM Challenge at the 27th Conference on Computational Natural Language Learning, pages 327–338, Singapore. Association for Computational Linguistics.
Cite (Informal):
Increasing The Performance of Cognitively Inspired Data-Efficient Language Models via Implicit Structure Building (Momen et al., CoNLL-BabyLM 2023)
PDF:
https://preview.aclanthology.org/Ingest-2025-COMPUTEL/2023.conll-babylm.29.pdf