Zero Pronoun Resolution with Attention-based Neural Network

Qingyu Yin, Yu Zhang, Weinan Zhang, Ting Liu, William Yang Wang


Abstract
Recent neural network methods for zero pronoun resolution explore multiple models for generating representation vectors for zero pronouns and their candidate antecedents. Typically, contextual information is used to encode zero pronouns, since they are simply gaps that carry no actual content. To make better use of the contexts of zero pronouns, we introduce a self-attention mechanism for encoding them. With multiple hops of attention, our model is able to focus on informative parts of the associated texts, yielding an effective way of encoding zero pronouns. In addition, we propose an attention-based recurrent neural network for encoding candidate antecedents by their contents. Experimental results are encouraging: our attention-based model achieves the best performance on the Chinese portion of the OntoNotes corpus, substantially surpassing existing Chinese zero pronoun resolution baseline systems.
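To make the two attention components concrete, below is a minimal PyTorch sketch of the general idea: a multi-hop self-attention encoder that summarizes the context around a zero-pronoun gap, and a candidate-antecedent encoder that attends over the candidate's words with the zero-pronoun vector as the query. This is an illustration of the technique, not the authors' exact architecture; all class names, dimensions, and the mean-pooling over hops are assumptions (the actual implementation is in the qyyin/AttentionZP repository).

import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiHopZPEncoder(nn.Module):
    """Encode a zero pronoun from its surrounding context using multiple
    hops of self-attention over BiLSTM states. Illustrative sketch only:
    hyperparameters and pooling scheme are assumptions, not the paper's."""
    def __init__(self, emb_dim=100, hidden_dim=128, num_hops=4, attn_dim=64):
        super().__init__()
        self.bilstm = nn.LSTM(emb_dim, hidden_dim,
                              bidirectional=True, batch_first=True)
        self.attn_proj = nn.Linear(2 * hidden_dim, attn_dim)
        # Each "hop" is a separate learned scoring vector over context positions.
        self.hop_scores = nn.Linear(attn_dim, num_hops, bias=False)

    def forward(self, context_embs):            # (batch, seq_len, emb_dim)
        states, _ = self.bilstm(context_embs)   # (batch, seq_len, 2*hidden_dim)
        scores = self.hop_scores(torch.tanh(self.attn_proj(states)))
        weights = F.softmax(scores, dim=1)      # per-hop distribution over positions
        # One weighted context summary per hop; average them into the ZP vector.
        hops = torch.einsum('bsh,bsd->bhd', weights, states)
        return hops.mean(dim=1)                 # (batch, 2*hidden_dim)

class AttnAntecedentEncoder(nn.Module):
    """Encode a candidate antecedent from its content words, attending over
    them with the zero-pronoun representation as the query (again a sketch)."""
    def __init__(self, emb_dim=100, hidden_dim=128, zp_dim=256):
        super().__init__()
        self.bilstm = nn.LSTM(emb_dim, hidden_dim,
                              bidirectional=True, batch_first=True)
        self.proj = nn.Linear(2 * hidden_dim, zp_dim)

    def forward(self, cand_embs, zp_repr):      # (batch, len, emb_dim), (batch, zp_dim)
        states, _ = self.bilstm(cand_embs)      # (batch, len, 2*hidden_dim)
        scores = torch.einsum('bd,bld->bl', zp_repr, self.proj(states))
        weights = F.softmax(scores, dim=1).unsqueeze(-1)
        return (weights * states).sum(dim=1)    # (batch, 2*hidden_dim)

# Toy usage: score one candidate against one zero pronoun.
zp_enc, cand_enc = MultiHopZPEncoder(), AttnAntecedentEncoder()
ctx = torch.randn(2, 20, 100)                   # context window around the gap
cand = torch.randn(2, 5, 100)                   # candidate antecedent words
zp = zp_enc(ctx)                                # (2, 256)
ant = cand_enc(cand, zp)                        # (2, 256)
score = F.cosine_similarity(zp, ant, dim=-1)    # one plausible pairing score

The multiple hops matter because a zero pronoun has no content of its own: each hop can attend to a different informative region of the context (e.g., the preceding predicate vs. an earlier mention), whereas a single attention distribution must compromise among them.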
Anthology ID:
C18-1002
Volume:
Proceedings of the 27th International Conference on Computational Linguistics
Month:
August
Year:
2018
Address:
Santa Fe, New Mexico, USA
Editors:
Emily M. Bender, Leon Derczynski, Pierre Isabelle
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
13–23
URL:
https://aclanthology.org/C18-1002
Cite (ACL):
Qingyu Yin, Yu Zhang, Weinan Zhang, Ting Liu, and William Yang Wang. 2018. Zero Pronoun Resolution with Attention-based Neural Network. In Proceedings of the 27th International Conference on Computational Linguistics, pages 13–23, Santa Fe, New Mexico, USA. Association for Computational Linguistics.
Cite (Informal):
Zero Pronoun Resolution with Attention-based Neural Network (Yin et al., COLING 2018)
PDF:
https://aclanthology.org/C18-1002.pdf
Code
 qyyin/AttentionZP