Mask-then-Fill: A Flexible and Effective Data Augmentation Framework for Event Extraction

Jun Gao, Changlong Yu, Wei Wang, Huan Zhao, Ruifeng Xu


Abstract
We present Mask-then-Fill, a flexible and effective data augmentation framework for event extraction. Our approach allows for more flexible manipulation of text and can thus generate more diverse data while preserving the original event structure as much as possible. Specifically, it first randomly masks out an adjunct sentence fragment and then infills a variable-length text span with a fine-tuned infilling model. Its main advantage is that it can replace a fragment of arbitrary length in the text with another fragment of variable length, whereas existing methods can only replace a single word or a fixed-length fragment. On trigger and argument extraction tasks, the proposed framework outperforms baseline methods and is particularly strong in the low-resource setting. Our further analysis shows that it strikes a good balance between diversity and distributional similarity.
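The two-step procedure described above is simple enough to sketch in code. The following is a minimal, illustrative Python implementation of the mask-then-fill loop; it assumes an off-the-shelf t5-base checkpoint in place of the paper's fine-tuned infilling model, and a random token-span heuristic in place of the paper's adjunct-fragment selection. Both are assumptions for illustration, not the authors' released code.

import random
from transformers import T5ForConditionalGeneration, T5TokenizerFast

# Stand-in for the paper's fine-tuned infilling model (assumption).
tokenizer = T5TokenizerFast.from_pretrained("t5-base")
model = T5ForConditionalGeneration.from_pretrained("t5-base")

def mask_then_fill(tokens, protected, max_span_len=4):
    """tokens: list of words; protected: set of word indices covering
    event triggers/arguments that must stay intact."""
    # Collect candidate spans that do not touch any protected token
    # (a simple proxy for the paper's adjunct-fragment masking).
    candidates = []
    for start in range(len(tokens)):
        for length in range(1, max_span_len + 1):
            end = start + length
            if end > len(tokens):
                break
            if any(i in protected for i in range(start, end)):
                break
            candidates.append((start, end))
    if not candidates:
        return " ".join(tokens)
    start, end = random.choice(candidates)

    # Mask the chosen fragment with T5's sentinel token, then let the
    # model infill a variable-length replacement span.
    masked = tokens[:start] + ["<extra_id_0>"] + tokens[end:]
    inputs = tokenizer(" ".join(masked), return_tensors="pt")
    out = model.generate(**inputs, max_new_tokens=20,
                         do_sample=True, top_p=0.9)
    text = tokenizer.decode(out[0], skip_special_tokens=False)
    # Keep only the text generated for <extra_id_0>.
    fill = text.split("<extra_id_0>")[-1].split("<extra_id_1>")[0]
    fill = fill.replace("</s>", "").replace("<pad>", "").strip()
    return " ".join(tokens[:start] + fill.split() + tokens[end:])

# Example: protect the trigger "entered" and its arguments, and let the
# model rewrite an adjunct fragment elsewhere in the sentence.
sent = "The troops entered the city at dawn after a long march."
tokens = sent.split()
protected = {0, 1, 2, 3, 4}  # "The troops entered the city"
print(mask_then_fill(tokens, protected))

Because the replacement span can be shorter or longer than the masked fragment, each call can yield a differently phrased sentence while the protected trigger and argument spans are left untouched, which is the property the abstract highlights over single-word or fixed-length substitution methods.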
Anthology ID:
2022.findings-emnlp.332
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2022
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
4537–4544
URL:
https://aclanthology.org/2022.findings-emnlp.332
DOI:
10.18653/v1/2022.findings-emnlp.332
Cite (ACL):
Jun Gao, Changlong Yu, Wei Wang, Huan Zhao, and Ruifeng Xu. 2022. Mask-then-Fill: A Flexible and Effective Data Augmentation Framework for Event Extraction. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 4537–4544, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Mask-then-Fill: A Flexible and Effective Data Augmentation Framework for Event Extraction (Gao et al., Findings 2022)
PDF:
https://preview.aclanthology.org/ingest-acl-2023-videos/2022.findings-emnlp.332.pdf