Abstract
We present a system that parses sentences into Abstract Meaning Representations, improving state-of-the-art results for this task by more than 5%. AMR graphs represent semantic content using linguistic properties such as semantic roles, coreference, negation, and more. The AMR parser does not rely on a syntactic pre-parse or heavily engineered features, and uses five recurrent neural networks as the key architectural components for inferring AMR graphs.
- Anthology ID:
- P17-1043
- Volume:
- Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
- Month:
- July
- Year:
- 2017
- Address:
- Vancouver, Canada
- Editors:
- Regina Barzilay, Min-Yen Kan
- Venue:
- ACL
- Publisher:
- Association for Computational Linguistics
- Pages:
- 463–472
- URL:
- https://aclanthology.org/P17-1043
- DOI:
- 10.18653/v1/P17-1043
- Cite (ACL):
- William Foland and James H. Martin. 2017. Abstract Meaning Representation Parsing using LSTM Recurrent Neural Networks. In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 463–472, Vancouver, Canada. Association for Computational Linguistics.
- Cite (Informal):
- Abstract Meaning Representation Parsing using LSTM Recurrent Neural Networks (Foland & Martin, ACL 2017)
- PDF:
- https://aclanthology.org/P17-1043.pdf
- Code
- BillFoland/daisyluAMR
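
The abstract describes five recurrent neural networks as the parser's key architectural components. As a rough, hypothetical sketch (not the authors' implementation; see the repository linked above for the actual code), the snippet below shows a bidirectional LSTM tagger of the kind such a component might use, e.g. to label each token of a sentence with an AMR concept category. The module names, dimensions, and the choice of PyTorch are all illustrative assumptions.

```python
# Hypothetical sketch of a BiLSTM sequence-labeling component; all names and
# sizes are assumptions, not the paper's configuration.
import torch
import torch.nn as nn

class BiLSTMTagger(nn.Module):
    def __init__(self, vocab_size, num_labels, embed_dim=100, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                            bidirectional=True)
        self.out = nn.Linear(2 * hidden_dim, num_labels)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) -> per-token label scores
        states, _ = self.lstm(self.embed(token_ids))
        return self.out(states)

# Toy usage: score 2 candidate labels for each token of a 5-token sentence.
model = BiLSTMTagger(vocab_size=1000, num_labels=2)
tokens = torch.randint(0, 1000, (1, 5))
print(model(tokens).shape)  # torch.Size([1, 5, 2])
```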