Theresa Biberauer
2025
Profiling neural grammar induction on morphemically tokenised child-directed speech
Mila Marcheva | Theresa Biberauer | Weiwei Sun
Proceedings of the Workshop on Cognitive Modeling and Computational Linguistics
We investigate the performance of state-of-the-art (SotA) neural grammar induction (GI) models on a morphemically tokenised English dataset based on the CHILDES treebank (Pearl and Sprouse, 2013). Using implementations from Yang et al. (2021a), we train models and evaluate them with the standard F1 score. We introduce two novel evaluation metrics, depth-of-morpheme and sibling-of-morpheme, which measure how the induced structures attach bound morphemes. Our results reveal that the models with the highest F1 scores do not necessarily induce linguistically plausible structures for bound morpheme attachment, highlighting a key challenge for cognitively plausible GI.
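As a rough illustration of the standard evaluation mentioned above, the sketch below computes unlabeled bracketing F1 between a gold and a predicted constituency tree over morphemically tokenised input (bound morphemes such as "-ed" appear as separate leaves). This is a minimal sketch, not the authors' evaluation code; the example trees, function names, and the treatment of whole-sentence and single-token spans are illustrative assumptions.

from nltk import Tree

def constituent_spans(tree):
    # Collect (start, end) spans of all multi-token constituents.
    spans = set()
    def walk(t, start):
        if isinstance(t, str):        # leaf: a word or bound morpheme
            return start + 1
        end = start
        for child in t:
            end = walk(child, end)
        if end - start > 1:           # ignore trivial single-token spans
            spans.add((start, end))
        return end
    walk(tree, 0)
    return spans

def unlabeled_f1(gold, predicted):
    # Unlabeled bracketing F1 between gold and predicted parses.
    g, p = constituent_spans(gold), constituent_spans(predicted)
    if not g or not p:
        return 0.0
    precision = len(g & p) / len(p)
    recall = len(g & p) / len(g)
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Hypothetical morphemically tokenised example: "-ed" is its own leaf.
gold = Tree.fromstring("(S (NP the dog) (VP chase -ed (NP the cat)))")
pred = Tree.fromstring("(S (NP the dog) (VP (V chase -ed) (NP the cat)))")
print(unlabeled_f1(gold, pred))

A high score on this kind of bracketing metric says nothing about where a bound morpheme like "-ed" sits in the tree, which is the gap the paper's depth-of-morpheme and sibling-of-morpheme metrics are designed to probe.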