DECAF: A Dynamically Extensible Corpus Analysis Framework

Max Müller-Eberstein, Rob Van Der Goot, Anna Rogers


Abstract
The study of generalization in Language Models (LMs) requires controlled experiments that can precisely measure complex linguistic variations between training and testing datasets. We introduce DECAF, a framework that enables the analysis and filtering of linguistically-annotated datasets down to the character level. Rather than creating new resources for each experiment, DECAF starts from datasets with existing linguistic annotations, and leverages them to analyze, filter, and generate highly controlled and reproducible experimental settings targeting specific research questions. We demonstrate DECAF’s functionality by adding 28 morphosyntactic annotation layers to the 115M-word BabyLM corpus and indexing the resulting 1.1B annotations to analyze its internal domain variance, and to create a controlled training data curriculum for a small-scale gender bias study. We release DECAF as an open-source Python library, along with the parsed and indexed version of BabyLM, as resources for future generalization research.
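The abstract describes filtering annotated corpora by linguistic properties (e.g., selecting data for a gender-bias curriculum). As a rough illustration of that idea only — this is a hypothetical sketch and not DECAF's actual API, which the abstract does not specify — one can filter sentences of a toy Universal Dependencies-style corpus by morphosyntactic features:

```python
# Hypothetical sketch (NOT DECAF's real API): filtering a toy annotated
# corpus by morphosyntactic features, in the spirit of the
# annotation-based filtering the abstract describes.

# Each token carries UD-style part-of-speech tags and feature dicts.
CORPUS = [
    [  # sentence 1
        {"form": "The", "upos": "DET", "feats": {}},
        {"form": "teacher", "upos": "NOUN", "feats": {"Gender": "Masc"}},
        {"form": "spoke", "upos": "VERB", "feats": {"Tense": "Past"}},
    ],
    [  # sentence 2
        {"form": "She", "upos": "PRON", "feats": {"Gender": "Fem"}},
        {"form": "writes", "upos": "VERB", "feats": {"Tense": "Pres"}},
    ],
]

def filter_sentences(corpus, upos=None, feats=None):
    """Keep sentences with at least one token matching the query."""
    feats = feats or {}
    kept = []
    for sentence in corpus:
        for tok in sentence:
            if upos is not None and tok["upos"] != upos:
                continue
            # all() over an empty feats query is True, so upos-only
            # queries match any token with the right tag
            if all(tok["feats"].get(k) == v for k, v in feats.items()):
                kept.append(sentence)
                break
    return kept

# e.g., select sentences containing a feminine-marked token
fem = filter_sentences(CORPUS, feats={"Gender": "Fem"})
print(len(fem))  # 1
```

Such annotation-driven filters are the kind of controlled data selection the paper targets; the real framework operates over indexed annotation layers down to the character level.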
Anthology ID:
2025.acl-demo.34
Volume:
Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 3: System Demonstrations)
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Pushkar Mishra, Smaranda Muresan, Tao Yu
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
351–362
URL:
https://preview.aclanthology.org/ingestion-acl-25/2025.acl-demo.34/
Cite (ACL):
Max Müller-Eberstein, Rob Van Der Goot, and Anna Rogers. 2025. DECAF: A Dynamically Extensible Corpus Analysis Framework. In Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 3: System Demonstrations), pages 351–362, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
DECAF: A Dynamically Extensible Corpus Analysis Framework (Müller-Eberstein et al., ACL 2025)
PDF:
https://preview.aclanthology.org/ingestion-acl-25/2025.acl-demo.34.pdf
Copyright agreement:
2025.acl-demo.34.copyright_agreement.pdf