Exploring Span Representations in Neural Coreference Resolution

Patrick Kahardipraja, Olena Vyshnevska, Sharid Loáiciga


Abstract
In coreference resolution, span representations play a key role in predicting coreference links accurately. We present a thorough examination of the span representations derived by applying BERT to coreference resolution (Joshi et al., 2019) using a probing model. Our results show that the span representation encodes a significant amount of coreference information. In addition, we find that the head-finding attention mechanism involved in creating the spans is crucial for encoding coreference knowledge. Finally, our analysis shows that the span representation cannot capture non-local coreference as effectively as local coreference.
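The abstract does not spell out the probing setup; as a rough, hypothetical sketch (not the authors' implementation), a probe over frozen span representations could look like the following. The class name CoreferenceProbe, the dimensions, and the training loop are illustrative assumptions only.

    # Hypothetical sketch: a small MLP probe over frozen span representations.
    # Assumes span embeddings were already extracted from a BERT-based
    # coreference model; names and dimensions are illustrative, not the authors'.
    import torch
    import torch.nn as nn

    class CoreferenceProbe(nn.Module):
        """Given two span representations, predict whether they corefer."""
        def __init__(self, span_dim: int, hidden_dim: int = 256):
            super().__init__()
            self.classifier = nn.Sequential(
                nn.Linear(2 * span_dim, hidden_dim),  # concatenated span pair
                nn.ReLU(),
                nn.Linear(hidden_dim, 1),             # logit for "coreferent"
            )

        def forward(self, span_a: torch.Tensor, span_b: torch.Tensor) -> torch.Tensor:
            pair = torch.cat([span_a, span_b], dim=-1)
            return self.classifier(pair).squeeze(-1)

    # Train only the probe; the span representations themselves stay fixed.
    probe = CoreferenceProbe(span_dim=768)
    optimizer = torch.optim.Adam(probe.parameters(), lr=1e-3)
    loss_fn = nn.BCEWithLogitsLoss()

    span_a = torch.randn(32, 768)             # stand-ins for extracted spans
    span_b = torch.randn(32, 768)
    labels = torch.randint(0, 2, (32,)).float()  # 1 = coreferent pair

    logits = probe(span_a, span_b)
    loss = loss_fn(logits, labels)
    loss.backward()
    optimizer.step()

Keeping the probe small and the representations frozen is what lets this kind of analysis attribute any coreference knowledge to the span representations rather than to the probe itself.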
Anthology ID:
2020.codi-1.4
Volume:
Proceedings of the First Workshop on Computational Approaches to Discourse
Month:
November
Year:
2020
Address:
Online
Venue:
CODI
Publisher:
Association for Computational Linguistics
Pages:
32–41
URL:
https://aclanthology.org/2020.codi-1.4
DOI:
10.18653/v1/2020.codi-1.4
Cite (ACL):
Patrick Kahardipraja, Olena Vyshnevska, and Sharid Loáiciga. 2020. Exploring Span Representations in Neural Coreference Resolution. In Proceedings of the First Workshop on Computational Approaches to Discourse, pages 32–41, Online. Association for Computational Linguistics.
Cite (Informal):
Exploring Span Representations in Neural Coreference Resolution (Kahardipraja et al., CODI 2020)
PDF:
https://preview.aclanthology.org/remove-xml-comments/2020.codi-1.4.pdf
Video:
https://slideslive.com/38939689
Code:
pkhdipraja/exploring-span-representations