PairSpanBERT: An Enhanced Language Model for Bridging Resolution

Hideo Kobayashi, Yufang Hou, Vincent Ng


Abstract
We present PairSpanBERT, a SpanBERT-based pre-trained model specialized for bridging resolution. To this end, we design a novel pre-training objective that aims to learn the contexts in which two mentions are implicitly linked to each other from a large amount of data automatically generated either heuristically or via distant supervision with a knowledge graph. Despite the noise inherent in the automatically generated data, we achieve the best results reported to date on three evaluation datasets for bridging resolution when replacing SpanBERT with PairSpanBERT in a state-of-the-art resolver that jointly performs entity coreference resolution and bridging resolution.
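The abstract mentions generating pre-training data via distant supervision with a knowledge graph. A minimal sketch of that general idea, under the assumption that a mention pair is (noisily) labeled as linked whenever the knowledge graph relates the two concepts; the toy graph and function names here are illustrative, not the authors' actual pipeline:

```python
# Hypothetical sketch: distant supervision for mention-pair labeling.
# The toy knowledge graph below is an assumption for illustration only.

# Maps a concept to the set of concepts it is related to in the KG.
KG = {
    "door": {"house", "room"},
    "wheel": {"car", "bicycle"},
}

def distant_pairs(mentions):
    """Label each mention pair as (noisily) linked if the KG relates them.

    Returns a list of (mention1, mention2, is_linked) triples; such noisy
    labels could then drive a pairwise pre-training objective.
    """
    pairs = []
    for i, m1 in enumerate(mentions):
        for m2 in mentions[i + 1:]:
            linked = m2 in KG.get(m1, set()) or m1 in KG.get(m2, set())
            pairs.append((m1, m2, linked))
    return pairs

# "door" and "house" are related in the toy KG, so that pair is linked.
print(distant_pairs(["door", "house", "wheel"]))
```

Labels produced this way are inherently noisy (the KG is incomplete and context-insensitive), which matches the abstract's point that the model must learn despite noise in the automatically generated data.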
Anthology ID:
2023.acl-long.383
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
6931–6946
URL:
https://aclanthology.org/2023.acl-long.383
Cite (ACL):
Hideo Kobayashi, Yufang Hou, and Vincent Ng. 2023. PairSpanBERT: An Enhanced Language Model for Bridging Resolution. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 6931–6946, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
PairSpanBERT: An Enhanced Language Model for Bridging Resolution (Kobayashi et al., ACL 2023)
PDF:
https://preview.aclanthology.org/starsem-semeval-split/2023.acl-long.383.pdf