Abstract
We present an encoder-decoder model for UCCA semantic parsing at SemEval 2019 Task 1. The encoder is a Bi-LSTM and the decoder uses recursive self-attention. The proposed model alleviates the challenges of feature engineering in traditional transition-based and graph-based parsers. The resulting parser is simple and proved effective on the semantic parsing task.
- Anthology ID:
- S19-2017
- Volume:
- Proceedings of the 13th International Workshop on Semantic Evaluation
- Month:
- June
- Year:
- 2019
- Address:
- Minneapolis, Minnesota, USA
- Editors:
- Jonathan May, Ekaterina Shutova, Aurelie Herbelot, Xiaodan Zhu, Marianna Apidianaki, Saif M. Mohammad
- Venue:
- SemEval
- SIG:
- SIGLEX
- Publisher:
- Association for Computational Linguistics
- Pages:
- 119–124
- URL:
- https://aclanthology.org/S19-2017
- DOI:
- 10.18653/v1/S19-2017
- Cite (ACL):
- Dian Yu and Kenji Sagae. 2019. UC Davis at SemEval-2019 Task 1: DAG Semantic Parsing with Attention-based Decoder. In Proceedings of the 13th International Workshop on Semantic Evaluation, pages 119–124, Minneapolis, Minnesota, USA. Association for Computational Linguistics.
- Cite (Informal):
- UC Davis at SemEval-2019 Task 1: DAG Semantic Parsing with Attention-based Decoder (Yu & Sagae, SemEval 2019)
- PDF:
- https://preview.aclanthology.org/corrections-2024-05/S19-2017.pdf
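The abstract above mentions a decoder built on self-attention over encoder states. As a rough illustration only (a generic scaled dot-product self-attention sketch in NumPy, not the authors' implementation; the matrix `H` of Bi-LSTM hidden states is assumed for demonstration):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(H):
    """Scaled dot-product self-attention over states H of shape (n, d).

    Each output row is a convex combination of all rows of H,
    weighted by pairwise similarity.
    """
    d = H.shape[-1]
    scores = H @ H.T / np.sqrt(d)       # (n, n) pairwise similarities
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ H                  # (n, d) context vectors

# Hypothetical encoder output: 5 token states of dimension 8.
H = np.random.randn(5, 8)
C = self_attention(H)
print(C.shape)  # (5, 8)
```

In the paper's setting the attention weights over previously built nodes are what let the decoder attach new graph edges, but the sketch above only shows the attention computation itself.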