On Generalization in Coreference Resolution
Shubham Toshniwal, Patrick Xia, Sam Wiseman, Karen Livescu, Kevin Gimpel
Abstract
While coreference resolution is defined independently of dataset domain, most models for performing coreference resolution do not transfer well to unseen domains. We consolidate a set of 8 coreference resolution datasets targeting different domains to evaluate the off-the-shelf performance of models. We then mix three datasets for training; even though their domain, annotation guidelines, and metadata differ, we propose a method for jointly training a single model on this heterogeneous data mixture by using data augmentation to account for annotation differences and sampling to balance the data quantities. We find that in a zero-shot setting, models trained on a single dataset transfer poorly while joint training yields improved overall performance, leading to better generalization in coreference resolution models. This work contributes a new benchmark for robust coreference resolution and multiple new state-of-the-art results.
- Anthology ID:
- 2021.crac-1.12
- Volume:
- Proceedings of the Fourth Workshop on Computational Models of Reference, Anaphora and Coreference
- Month:
- November
- Year:
- 2021
- Address:
- Punta Cana, Dominican Republic
- Editors:
- Maciej Ogrodniczuk, Sameer Pradhan, Massimo Poesio, Yulia Grishina, Vincent Ng
- Venue:
- CRAC
- Publisher:
- Association for Computational Linguistics
- Pages:
- 111–120
- URL:
- https://aclanthology.org/2021.crac-1.12
- DOI:
- 10.18653/v1/2021.crac-1.12
- Cite (ACL):
- Shubham Toshniwal, Patrick Xia, Sam Wiseman, Karen Livescu, and Kevin Gimpel. 2021. On Generalization in Coreference Resolution. In Proceedings of the Fourth Workshop on Computational Models of Reference, Anaphora and Coreference, pages 111–120, Punta Cana, Dominican Republic. Association for Computational Linguistics.
- Cite (Informal):
- On Generalization in Coreference Resolution (Toshniwal et al., CRAC 2021)
- PDF:
- https://aclanthology.org/2021.crac-1.12.pdf
- Code
- shtoshni92/fast-coref + additional community code
- Data
- GAP Coreference Dataset, LitBank, OntoGUM, OntoNotes 5.0, PreCo, Quizbowl, WSC, WikiCoref
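The abstract mentions sampling to balance data quantities when jointly training on datasets of very different sizes. A common way to do this is temperature-smoothed sampling, where each training example is drawn from a dataset chosen with smoothed rather than raw-size-proportional probabilities. The sketch below illustrates that general idea only; the function names, the smoothing exponent, and the corpus sizes are all assumptions for illustration, not the authors' implementation.

```python
import random

def sampling_probs(sizes, alpha=0.5):
    """Temperature-smoothed sampling probabilities over datasets.

    alpha=1 recovers size-proportional sampling; alpha=0 gives uniform
    sampling; values in between upweight smaller datasets.
    """
    weights = [s ** alpha for s in sizes]
    total = sum(weights)
    return [w / total for w in weights]

def sample_dataset_index(probs, rng=random):
    """Pick one dataset index according to the smoothed probabilities."""
    return rng.choices(range(len(probs)), weights=probs, k=1)[0]

# Illustrative (not actual) document counts for three training corpora
# of very different sizes, e.g. a small literary corpus vs. a large one.
sizes = [2802, 100, 36000]
probs = sampling_probs(sizes, alpha=0.5)
idx = sample_dataset_index(probs)
```

With alpha=0.5, the smallest dataset receives a larger share of training draws than its raw proportion would give, which is the balancing effect the abstract refers to.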