Document-Level Event Role Filler Extraction using Multi-Granularity Contextualized Encoding

Xinya Du, Claire Cardie


Abstract
Few works in the event extraction literature have gone beyond individual sentences when making extraction decisions. This is problematic when the information needed to recognize an event argument is spread across multiple sentences. We argue that document-level event extraction is a difficult task since it requires a view of a larger context to determine which spans of text correspond to event role fillers. We first investigate how end-to-end neural sequence models (with pre-trained language model representations) perform on document-level role filler extraction, as well as how the length of context captured affects the models' performance. To dynamically aggregate information captured by neural representations learned at different levels of granularity (e.g., the sentence and paragraph levels), we propose a novel multi-granularity reader. We evaluate our models on the MUC-4 event extraction dataset and show that our best system performs substantially better than prior work. We also report findings on the relationship between context length and neural model performance on the task.
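The abstract describes dynamically aggregating sentence- and paragraph-level contextualized representations. A minimal NumPy sketch of one plausible aggregation mechanism, a learned gate that mixes the two granularities per token (function and weight names here are hypothetical illustrations, not the paper's exact architecture):

```python
import numpy as np

def gated_fusion(sent_repr, para_repr, W_g):
    """Mix sentence- and paragraph-level token vectors with a sigmoid gate.

    sent_repr, para_repr: (num_tokens, d) contextualized embeddings
    W_g: (2*d, d) hypothetical gate weights
    """
    # Gate is computed from both views; values in (0, 1) decide, per
    # dimension, how much sentence-level vs. paragraph-level context to keep.
    both = np.concatenate([sent_repr, para_repr], axis=-1)
    g = 1.0 / (1.0 + np.exp(-(both @ W_g)))
    return g * sent_repr + (1.0 - g) * para_repr

rng = np.random.default_rng(0)
num_tokens, d = 3, 4
sent = rng.normal(size=(num_tokens, d))   # e.g., from a sentence-level encoder
para = rng.normal(size=(num_tokens, d))   # e.g., from a paragraph-level encoder
W_g = rng.normal(size=(2 * d, d))

fused = gated_fusion(sent, para, W_g)
print(fused.shape)  # (3, 4): one fused vector per token
```

The fused per-token vectors would then feed a standard sequence-tagging layer that labels role-filler spans; the gate lets the model fall back on wider paragraph context only where the sentence alone is insufficient.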
Anthology ID:
2020.acl-main.714
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2020
Address:
Online
Editors:
Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel Tetreault
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
8010–8020
URL:
https://aclanthology.org/2020.acl-main.714
DOI:
10.18653/v1/2020.acl-main.714
Cite (ACL):
Xinya Du and Claire Cardie. 2020. Document-Level Event Role Filler Extraction using Multi-Granularity Contextualized Encoding. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 8010–8020, Online. Association for Computational Linguistics.
Cite (Informal):
Document-Level Event Role Filler Extraction using Multi-Granularity Contextualized Encoding (Du & Cardie, ACL 2020)
PDF:
https://preview.aclanthology.org/nschneid-patch-2/2020.acl-main.714.pdf
Video:
http://slideslive.com/38928745
Code:
xinyadu/doc_event_role