Variational Graph Autoencoding as Cheap Supervision for AMR Coreference Resolution

Irene Li, Linfeng Song, Kun Xu, Dong Yu


Abstract
Coreference resolution over semantic graphs like AMRs aims to group the graph nodes that represent the same entity. This is a crucial step for building document-level formal semantic representations. With annotated data on AMR coreference resolution, deep learning approaches have recently shown great potential for this task, yet they are usually data-hungry and annotations are costly. We propose a general pretraining method using a variational graph autoencoder (VGAE) for AMR coreference resolution, which can leverage any general AMR corpus and even automatically parsed AMR data. Experiments on benchmarks show that the pretraining approach achieves performance gains of up to 6% absolute F1 points. Moreover, our model significantly improves on the previous state-of-the-art model by up to 11% F1.
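For readers unfamiliar with the VGAE objective the abstract refers to, the standard formulation is a GCN encoder that outputs a Gaussian posterior per node, a reparameterized sample, and an inner-product decoder that reconstructs the adjacency matrix; training minimizes reconstruction loss plus a KL term. The NumPy sketch below illustrates that general recipe only — the graph, feature sizes, and random weights are illustrative placeholders, not the paper's architecture or data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy AMR-like graph: 4 nodes, symmetric adjacency with self-loops.
A = np.array([[1, 1, 0, 0],
              [1, 1, 1, 0],
              [0, 1, 1, 1],
              [0, 0, 1, 1]], dtype=float)
X = rng.normal(size=(4, 8))  # node features (e.g. concept embeddings)

# Symmetrically normalized adjacency D^{-1/2} A D^{-1/2} for GCN message passing.
d = A.sum(axis=1)
A_hat = A / np.sqrt(np.outer(d, d))

# Encoder: one shared GCN layer, then separate heads for mu and log-variance.
# (Weight shapes are arbitrary choices for this sketch.)
W0 = rng.normal(scale=0.1, size=(8, 16))
W_mu = rng.normal(scale=0.1, size=(16, 4))
W_logvar = rng.normal(scale=0.1, size=(16, 4))

H = np.maximum(A_hat @ X @ W0, 0.0)   # ReLU GCN layer
mu = A_hat @ H @ W_mu
logvar = A_hat @ H @ W_logvar

# Reparameterization trick: z = mu + sigma * eps.
eps = rng.normal(size=mu.shape)
Z = mu + np.exp(0.5 * logvar) * eps

# Inner-product decoder: predicted edge probabilities sigmoid(Z Z^T).
A_rec = 1.0 / (1.0 + np.exp(-(Z @ Z.T)))

# Negative ELBO = binary cross-entropy reconstruction + KL(q(z|X,A) || N(0, I)).
recon = -np.mean(A * np.log(A_rec + 1e-9)
                 + (1 - A) * np.log(1 - A_rec + 1e-9))
kl = -0.5 * np.mean(1 + logvar - mu**2 - np.exp(logvar))
loss = recon + kl
```

In practice the loss would be minimized with gradient descent over the encoder weights; the paper's contribution is using this self-supervised objective on unannotated (even automatically parsed) AMR graphs as cheap pretraining before fine-tuning on coreference annotations.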
Anthology ID:
2022.acl-long.199
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
2790–2800
URL:
https://aclanthology.org/2022.acl-long.199
DOI:
10.18653/v1/2022.acl-long.199
Bibkey:
Cite (ACL):
Irene Li, Linfeng Song, Kun Xu, and Dong Yu. 2022. Variational Graph Autoencoding as Cheap Supervision for AMR Coreference Resolution. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 2790–2800, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Variational Graph Autoencoding as Cheap Supervision for AMR Coreference Resolution (Li et al., ACL 2022)
PDF:
https://preview.aclanthology.org/ingest-2024-clasp/2022.acl-long.199.pdf
Data
AMR Bank