Abstract
Gender bias has been found in existing coreference resolvers. To help eliminate this bias, the gender-balanced Gendered Ambiguous Pronouns (GAP) dataset was released, on which the best baseline model achieves only 66.9% F1. Bidirectional Encoder Representations from Transformers (BERT) has broken several NLP task records and can be applied to the GAP dataset. However, fine-tuning BERT on a specific task is computationally expensive. In this paper, we propose an end-to-end resolver that combines pre-trained BERT with a Relational Graph Convolutional Network (R-GCN). The R-GCN digests structural syntactic information and learns better task-specific embeddings. Empirical results demonstrate that, under explicit syntactic supervision and without the need to fine-tune BERT, the R-GCN's embeddings outperform the original BERT embeddings on the coreference task. Our work significantly improves the snippet-context baseline F1 score on the GAP dataset from 66.9% to 80.3%. We participated in the Gender Bias for Natural Language Processing 2019 shared task, and our code is available online.
- Anthology ID: W19-3814
- Volume: Proceedings of the First Workshop on Gender Bias in Natural Language Processing
- Month: August
- Year: 2019
- Address: Florence, Italy
- Venue: GeBNLP
- Publisher: Association for Computational Linguistics
- Pages: 96–101
- URL: https://aclanthology.org/W19-3814
- DOI: 10.18653/v1/W19-3814
- Cite (ACL): Yinchuan Xu and Junlin Yang. 2019. Look Again at the Syntax: Relational Graph Convolutional Network for Gendered Ambiguous Pronoun Resolution. In Proceedings of the First Workshop on Gender Bias in Natural Language Processing, pages 96–101, Florence, Italy. Association for Computational Linguistics.
- Cite (Informal): Look Again at the Syntax: Relational Graph Convolutional Network for Gendered Ambiguous Pronoun Resolution (Xu & Yang, GeBNLP 2019)
- PDF: https://preview.aclanthology.org/starsem-semeval-split/W19-3814.pdf
- Code: ianycxu/RGCN-with-BERT
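The repository above contains the authors' implementation. As a rough illustration only, the sketch below shows the kind of architecture the abstract describes: frozen BERT token embeddings refined by an R-GCN over a syntactic (dependency) graph, then scored for pronoun resolution. The specifics here (the number of relation types, the two-layer depth, the pairwise scoring MLP, and all class and argument names) are assumptions for illustration, not the paper's exact configuration.

```python
import torch
import torch.nn as nn

class RGCNLayer(nn.Module):
    """One relational graph convolution: a separate weight matrix per
    dependency-relation type, plus a self-loop transform."""
    def __init__(self, dim, num_relations):
        super().__init__()
        self.self_loop = nn.Linear(dim, dim)
        self.rel_weights = nn.ModuleList(
            nn.Linear(dim, dim, bias=False) for _ in range(num_relations))

    def forward(self, h, adj):
        # h:   (num_tokens, dim) node features (BERT token embeddings)
        # adj: list of (num_tokens, num_tokens) row-normalized adjacency
        #      matrices, one per relation type in the dependency parse
        out = self.self_loop(h)
        for a, w in zip(adj, self.rel_weights):
            out = out + a @ w(h)
        return torch.relu(out)

class PronounResolver(nn.Module):
    """Scores each candidate antecedent against the pronoun using
    R-GCN-refined embeddings; BERT itself stays frozen (no fine-tuning)."""
    def __init__(self, dim=768, num_relations=40, num_layers=2):
        super().__init__()
        self.layers = nn.ModuleList(
            RGCNLayer(dim, num_relations) for _ in range(num_layers))
        self.score = nn.Sequential(
            nn.Linear(2 * dim, dim), nn.ReLU(), nn.Linear(dim, 1))

    def forward(self, bert_emb, adj, pronoun_idx, candidate_idx):
        h = bert_emb  # precomputed, frozen BERT embeddings
        for layer in self.layers:
            h = layer(h, adj)
        pronoun = h[pronoun_idx]
        pairs = torch.stack(
            [torch.cat([pronoun, h[i]]) for i in candidate_idx])
        return self.score(pairs).squeeze(-1)  # one logit per candidate
```

In the GAP setting, the two logits (for candidates A and B), together with a NEITHER option, would plausibly be trained with a cross-entropy loss; see the repository for the actual training setup.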