Abstract
We propose an attention-based model that treats AMR parsing as sequence-to-graph transduction. Unlike most AMR parsers, which rely on pre-trained aligners, external semantic resources, or data augmentation, our parser is aligner-free and can be trained effectively with limited amounts of labeled AMR data. It surpasses all previously reported SMATCH scores on both AMR 2.0 (76.3% on LDC2017T10) and AMR 1.0 (70.2% on LDC2014T12).
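For readers unfamiliar with the sequence-to-graph formulation, the sketch below illustrates the general shape of such a decoder in PyTorch. It is a minimal, hypothetical illustration, not the authors' implementation (see sheng-z/stog for that): the class name, tensor sizes, greedy decoding loop, and single bilinear edge scorer are all simplifying assumptions chosen for brevity.

```python
# Hypothetical sequence-to-graph sketch (not the stog code).
# An encoder embeds the sentence; a decoder emits one graph node per step,
# attending over the source (alignment stays implicit in attention weights,
# i.e. "aligner-free"); a bilinear scorer rates edges among emitted nodes.
import torch
import torch.nn as nn

class Seq2GraphSketch(nn.Module):
    def __init__(self, vocab_size, node_vocab_size, hidden=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.encoder = nn.LSTM(hidden, hidden // 2, batch_first=True,
                               bidirectional=True)
        self.node_embed = nn.Embedding(node_vocab_size, hidden)
        self.decoder = nn.LSTMCell(hidden, hidden)
        self.attn = nn.MultiheadAttention(hidden, num_heads=4,
                                          batch_first=True)
        self.node_out = nn.Linear(hidden, node_vocab_size)
        # Bilinear edge scorer: score(i, j) = node_i^T W node_j.
        self.edge_W = nn.Parameter(torch.randn(hidden, hidden) * 0.01)

    def forward(self, tokens, max_nodes=10):
        memory, _ = self.encoder(self.embed(tokens))   # (B, T, H)
        B, H = memory.size(0), memory.size(2)
        h = memory.new_zeros(B, H)
        c = memory.new_zeros(B, H)
        prev = tokens.new_zeros(B)                     # BOS node id 0
        node_states, node_logits = [], []
        for _ in range(max_nodes):
            h, c = self.decoder(self.node_embed(prev), (h, c))
            # Source-side attention over the encoded sentence.
            ctx, _ = self.attn(h.unsqueeze(1), memory, memory)
            state = h + ctx.squeeze(1)
            logits = self.node_out(state)
            prev = logits.argmax(-1)                   # greedy node choice
            node_states.append(state)
            node_logits.append(logits)
        nodes = torch.stack(node_states, dim=1)        # (B, N, H)
        # Pairwise edge scores among all generated nodes: (B, N, N).
        edges = torch.einsum('bih,hk,bjk->bij', nodes, self.edge_W, nodes)
        return torch.stack(node_logits, dim=1), edges
```

In the paper itself, node prediction uses an extended pointer-generator with source- and target-side copying, and edges are scored with a deep biaffine classifier; the loop above only conveys the overall structure of the transduction.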
- Anthology ID: P19-1009
- Volume: Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
- Month: July
- Year: 2019
- Address: Florence, Italy
- Editors: Anna Korhonen, David Traum, Lluís Màrquez
- Venue: ACL
- Publisher: Association for Computational Linguistics
- Pages: 80–94
- URL: https://preview.aclanthology.org/build-pipeline-with-new-library/P19-1009/
- DOI: 10.18653/v1/P19-1009
- Cite (ACL): Sheng Zhang, Xutai Ma, Kevin Duh, and Benjamin Van Durme. 2019. AMR Parsing as Sequence-to-Graph Transduction. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 80–94, Florence, Italy. Association for Computational Linguistics.
- Cite (Informal): AMR Parsing as Sequence-to-Graph Transduction (Zhang et al., ACL 2019)
- PDF: https://preview.aclanthology.org/build-pipeline-with-new-library/P19-1009.pdf
- Code: sheng-z/stog
- Data: LDC2017T10