ZPR2: Joint Zero Pronoun Recovery and Resolution using Multi-Task Learning and BERT

Linfeng Song, Kun Xu, Yue Zhang, Jianshu Chen, Dong Yu


Abstract
Zero pronoun recovery and resolution aim at recovering the dropped pronoun and identifying its anaphoric mentions, respectively. Whereas previous work treats the two tasks separately, we propose to better exploit their interaction by solving both together. For zero pronoun resolution, we study the task in a more realistic setting where no parse trees, or only automatic ones, are available, while most previous work assumes gold trees. Experiments on two benchmarks show that joint modeling significantly outperforms our baseline, which already beats the previous state of the art.
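To make the joint setup concrete, below is a minimal PyTorch sketch of multi-task learning over a shared BERT encoder with two task-specific heads: one classifying the recovered pronoun at a gap position and one scoring candidate antecedent mentions. All class, head, and parameter names here are hypothetical illustrations, not the authors' implementation.

# Hypothetical sketch: shared BERT encoder, two task heads (not the paper's code).
import torch
import torch.nn as nn
from transformers import BertModel

class JointZeroPronounModel(nn.Module):
    def __init__(self, num_pronouns, bert_name="bert-base-chinese"):
        super().__init__()
        self.encoder = BertModel.from_pretrained(bert_name)  # shared encoder
        hidden = self.encoder.config.hidden_size
        # Recovery head: classify which pronoun was dropped at the gap position.
        self.recovery_head = nn.Linear(hidden, num_pronouns)
        # Resolution head: project the gap, then dot-score candidate mentions.
        self.resolution_proj = nn.Linear(hidden, hidden)

    def forward(self, input_ids, attention_mask, gap_index, mention_index):
        # gap_index: (B,) token position of the dropped pronoun per example
        # mention_index: (B, M) token positions of M candidate antecedents
        states = self.encoder(input_ids, attention_mask=attention_mask).last_hidden_state
        batch = torch.arange(states.size(0), device=states.device)
        gap_repr = states[batch, gap_index]                        # (B, H)
        recovery_logits = self.recovery_head(gap_repr)             # (B, |pronouns|)
        mention_repr = states[batch.unsqueeze(1), mention_index]   # (B, M, H)
        scores = torch.einsum("bh,bmh->bm",
                              self.resolution_proj(gap_repr), mention_repr)
        return recovery_logits, scores

# Joint (multi-task) training would sum both losses, e.g.:
#   loss = ce(recovery_logits, pronoun_labels) + ce(scores, antecedent_labels)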
Anthology ID:
2020.acl-main.482
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2020
Address:
Online
Editors:
Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel Tetreault
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
5429–5434
URL:
https://aclanthology.org/2020.acl-main.482
DOI:
10.18653/v1/2020.acl-main.482
Cite (ACL):
Linfeng Song, Kun Xu, Yue Zhang, Jianshu Chen, and Dong Yu. 2020. ZPR2: Joint Zero Pronoun Recovery and Resolution using Multi-Task Learning and BERT. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 5429–5434, Online. Association for Computational Linguistics.
Cite (Informal):
ZPR2: Joint Zero Pronoun Recovery and Resolution using Multi-Task Learning and BERT (Song et al., ACL 2020)
PDF:
https://preview.aclanthology.org/add_acl24_videos/2020.acl-main.482.pdf
Video:
http://slideslive.com/38928741