Alex Lyman
2024
WhatIf: Leveraging Word Vectors for Small-Scale Data Augmentation
Alex Lyman | Bryce Hepner
The 2nd BabyLM Challenge at the 28th Conference on Computational Natural Language Learning
We introduce WhatIf, a lightly supervised data augmentation technique that leverages word vectors to enhance training data for small-scale language models. Inspired by reading prediction strategies used in education, WhatIf creates new samples by substituting semantically similar words into the training data. We evaluate WhatIf on multiple datasets, demonstrating small but consistent improvements in downstream evaluation compared to baseline models. Finally, we compare WhatIf to other small-scale data augmentation techniques and find that it provides comparable quantitative results, with a potential tradeoff in qualitative evaluation.
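
To make the idea concrete, here is a minimal sketch of word-vector substitution augmentation in the spirit of the abstract. This is not the authors' implementation: the pretrained embedding name, substitution rate, and neighbor count below are illustrative assumptions.

    import random

    import gensim.downloader

    # Pretrained GloVe vectors stand in for whatever embeddings WhatIf uses (assumption).
    vectors = gensim.downloader.load("glove-wiki-gigaword-50")

    def augment(sentence: str, sub_rate: float = 0.15, topn: int = 5) -> str:
        """Return a new sample with some words swapped for semantically similar ones."""
        out = []
        for word in sentence.split():
            if word.lower() in vectors and random.random() < sub_rate:
                # Replace the word with one of its top-n nearest neighbors in vector space.
                neighbors = vectors.most_similar(word.lower(), topn=topn)
                out.append(random.choice(neighbors)[0])
            else:
                out.append(word)
        return " ".join(out)

    print(augment("the quick brown fox jumps over the lazy dog"))

Each call produces a slightly perturbed copy of the input, so running it over a small training corpus yields additional, semantically related samples; the actual WhatIf procedure and its supervision details are described in the paper itself.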