Abstract
We present a new architecture for storing and accessing entity mentions during online text processing. While reading the text, entity references are identified and may be stored by either updating or overwriting a cell in a fixed-length memory. The update operation implies coreference with the other mentions that are stored in the same cell; the overwrite operation causes these mentions to be forgotten. Because the memory operations are encoded as differentiable gates, the model can be trained end-to-end, using both a supervised anaphora resolution objective and a supplementary language modeling objective. Evaluation on a dataset of pronoun-name anaphora demonstrates strong performance with purely incremental text processing.
- Anthology ID: P19-1593
- Volume: Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
- Month: July
- Year: 2019
- Address: Florence, Italy
- Editors: Anna Korhonen, David Traum, Lluís Màrquez
- Venue: ACL
- Publisher: Association for Computational Linguistics
- Pages: 5918–5925
- URL: https://aclanthology.org/P19-1593
- DOI: 10.18653/v1/P19-1593
- Cite (ACL): Fei Liu, Luke Zettlemoyer, and Jacob Eisenstein. 2019. The Referential Reader: A Recurrent Entity Network for Anaphora Resolution. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 5918–5925, Florence, Italy. Association for Computational Linguistics.
- Cite (Informal): The Referential Reader: A Recurrent Entity Network for Anaphora Resolution (Liu et al., ACL 2019)
- PDF: https://preview.aclanthology.org/nschneid-patch-2/P19-1593.pdf
- Code: liufly/refreader
- Data: GAP Coreference Dataset