Online Neural Coreference Resolution with Rollback

Patrick Xia, Benjamin Van Durme


Abstract
Humans process natural language online, whether reading a document or participating in multiparty dialogue. Recent advances in neural coreference resolution have focused on offline approaches that assume the full communication history as input. This is neither realistic nor sufficient if we wish to support dialogue understanding in real time. We benchmark two existing offline models and highlight their shortcomings in the online setting. We then modify these models to perform online inference and introduce rollback: a short-term mechanism to correct mistakes. We demonstrate across five English datasets the effectiveness of this approach against an offline and a naive online model in terms of latency, final document-level coreference F1, and average running F1.
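The rollback idea can be illustrated with a minimal sketch: process mentions incrementally, and when a new mention arrives, undo the last few clustering decisions and re-decide them together with it. Everything below is an illustrative assumption, not the paper's implementation: the `OnlineCoref` class, the toy string-match scorer, and the fixed-size window stand in for the paper's neural model, which scores mentions against antecedents with a learned encoder (so replayed decisions can actually change with new context).

```python
# Hypothetical sketch of online coreference with a rollback window.
# The scorer and data structures are toy stand-ins, not the paper's model.

def score(mention, cluster):
    # Toy scorer: 1.0 if the mention string already appears in the cluster.
    # A real system would use a contextual neural scorer here, so decisions
    # replayed after a rollback could change as more context arrives.
    return 1.0 if mention.lower() in {m.lower() for m in cluster} else 0.0

class OnlineCoref:
    def __init__(self, rollback_window=2):
        self.clusters = []    # each cluster is a list of mention strings
        self.decisions = []   # (mention, cluster) pairs, oldest first
        self.k = rollback_window

    def _assign(self, mention):
        # Greedy best-cluster assignment; start a new cluster below threshold.
        best, best_s = None, 0.0
        for c in self.clusters:
            s = score(mention, c)
            if s > best_s:
                best, best_s = c, s
        if best is None:
            best = []
            self.clusters.append(best)
        best.append(mention)
        self.decisions.append((mention, best))

    def step(self, mention):
        # Rollback: undo the last k decisions (short-term correction only).
        undo = self.decisions[-self.k:] if self.k > 0 else []
        self.decisions = self.decisions[:len(self.decisions) - len(undo)]
        for m, c in undo:
            c.remove(m)
        self.clusters = [c for c in self.clusters if c]
        # Replay the undone mentions, then decide the new one.
        for m, _ in undo:
            self._assign(m)
        self._assign(mention)
        return [list(c) for c in self.clusters]
```

With the toy scorer the replayed decisions come out the same, so the example just exercises the mechanism; the online model without rollback would simply skip the undo/replay step.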
Anthology ID:
2022.crac-1.2
Volume:
Proceedings of the Fifth Workshop on Computational Models of Reference, Anaphora and Coreference
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Venue:
CRAC
Publisher:
Association for Computational Linguistics
Pages:
13–21
URL:
https://aclanthology.org/2022.crac-1.2
Cite (ACL):
Patrick Xia and Benjamin Van Durme. 2022. Online Neural Coreference Resolution with Rollback. In Proceedings of the Fifth Workshop on Computational Models of Reference, Anaphora and Coreference, pages 13–21, Gyeongju, Republic of Korea. Association for Computational Linguistics.
Cite (Informal):
Online Neural Coreference Resolution with Rollback (Xia & Van Durme, CRAC 2022)
PDF:
https://preview.aclanthology.org/auto-file-uploads/2022.crac-1.2.pdf