Transfer Learning from Pre-trained BERT for Pronoun Resolution

Xingce Bao, Qianqian Qiao


Abstract
The paper describes the submission of the team “We used bert!” to the shared task Gendered Pronoun Resolution (Pair pronouns to their correct entities). Our final submission, a model based on fine-tuned BERT (Bidirectional Encoder Representations from Transformers), ranks 14th among 838 teams with a multi-class logarithmic loss of 0.208. In this work, the contribution of transfer learning to pronoun resolution systems is investigated, and the gender bias contained in classification models is evaluated.
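The multi-class logarithmic loss named above is the mean cross-entropy over the task's three classes (candidate A, candidate B, NEITHER). A minimal sketch of how this metric might be computed, assuming one-integer labels and a row-stochastic probability matrix (the function name and example values are illustrative, not from the paper):

import numpy as np

def multiclass_log_loss(y_true, y_pred, eps=1e-15):
    """Mean negative log-likelihood over N examples and 3 classes.

    y_true: (N,) integer labels in {0: A, 1: B, 2: NEITHER}
    y_pred: (N, 3) predicted probabilities, each row summing to 1
    """
    p = np.clip(y_pred, eps, 1 - eps)        # avoid log(0)
    p = p / p.sum(axis=1, keepdims=True)     # renormalize after clipping
    return -np.mean(np.log(p[np.arange(len(y_true)), y_true]))

# Example: one confident correct prediction, one uniform guess
print(multiclass_log_loss(np.array([0, 1]),
                          np.array([[0.9, 0.05, 0.05],
                                    [1/3, 1/3, 1/3]])))  # ~0.602

A lower value is better; a uniform guess over three classes scores ln(3) ≈ 1.099, which puts the submission's 0.208 in context.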
Anthology ID:
W19-3812
Volume:
Proceedings of the First Workshop on Gender Bias in Natural Language Processing
Month:
August
Year:
2019
Address:
Florence, Italy
Editors:
Marta R. Costa-jussà, Christian Hardmeier, Will Radford, Kellie Webster
Venue:
GeBNLP
Publisher:
Association for Computational Linguistics
Pages:
82–88
URL:
https://aclanthology.org/W19-3812
DOI:
10.18653/v1/W19-3812
Cite (ACL):
Xingce Bao and Qianqian Qiao. 2019. Transfer Learning from Pre-trained BERT for Pronoun Resolution. In Proceedings of the First Workshop on Gender Bias in Natural Language Processing, pages 82–88, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Transfer Learning from Pre-trained BERT for Pronoun Resolution (Bao & Qiao, GeBNLP 2019)
PDF:
https://preview.aclanthology.org/nschneid-patch-2/W19-3812.pdf
Data
GAP Coreference Dataset