@inproceedings{wang-2019-msnet,
    title = "{MS}net: A {BERT}-based Network for Gendered Pronoun Resolution",
    author = "Wang, Zili",
    editor = "Costa-juss{\`a}, Marta R.  and
      Hardmeier, Christian  and
      Radford, Will  and
      Webster, Kellie",
    booktitle = "Proceedings of the First Workshop on Gender Bias in Natural Language Processing",
    month = aug,
    year = "2019",
    address = "Florence, Italy",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/W19-3813/",
    doi = "10.18653/v1/W19-3813",
    pages = "89--95",
    abstract = "The pre-trained BERT model achieves remarkable state-of-the-art results across a wide range of natural language processing tasks. To address gender bias in the gendered pronoun resolution task, I propose a novel neural network model based on pre-trained BERT. The model is a mention-score classifier that uses a parameter-free attention mechanism to compute the contextual representation of each entity span, together with a vector representing the triple-wise semantic similarity among the pronoun and the entities. In stage 1 of the gendered pronoun resolution task, a variant of this model trained with the fine-tuning approach reduced the multi-class logarithmic loss to 0.3033 in 5-fold cross-validation on the training set and to 0.2795 on the test set. This variant also won 2nd place in stage 2 of the task with a score of 0.17289. The code for this paper is available at: \url{https://github.com/ziliwang/MSnet-for-Gendered-Pronoun-Resolution}"
}