UC Davis at SemEval-2019 Task 1: DAG Semantic Parsing with Attention-based Decoder

Dian Yu, Kenji Sagae


Abstract
We present an encoder-decoder model for UCCA semantic parsing for SemEval 2019 Task 1. The encoder is a Bi-LSTM, and the decoder uses recursive self-attention. The proposed model alleviates the challenges and feature engineering involved in traditional transition-based and graph-based parsers. The resulting parser is simple and proved to be effective on the semantic parsing task.
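The abstract mentions a decoder built on self-attention over encoder states. As a rough illustration only (the paper's actual architecture, dimensions, and recursion scheme are not specified here), the following NumPy sketch shows the basic scaled dot-product self-attention operation such a decoder could apply; all names and sizes are hypothetical:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(H, Wq, Wk, Wv):
    # H: (n, d) matrix of encoder hidden states (e.g. Bi-LSTM outputs).
    # Project states into queries, keys, and values.
    Q, K, V = H @ Wq, H @ Wk, H @ Wv
    # Pairwise compatibility scores, scaled by sqrt of key dimension.
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    # Each output row is an attention-weighted mixture of the values.
    return softmax(scores, axis=-1) @ V

# Toy example: 5 tokens, hidden size 8 (arbitrary choices).
rng = np.random.default_rng(0)
n, d = 5, 8
H = rng.standard_normal((n, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
out = self_attention(H, Wq, Wk, Wv)
print(out.shape)  # one contextualized vector per token
```

This is only the attention primitive; the paper's decoder applies such attention recursively to predict DAG structure, which this sketch does not attempt to reproduce.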
Anthology ID:
S19-2017
Volume:
Proceedings of the 13th International Workshop on Semantic Evaluation
Month:
June
Year:
2019
Address:
Minneapolis, Minnesota, USA
Venue:
SemEval
SIG:
SIGLEX
Publisher:
Association for Computational Linguistics
Pages:
119–124
URL:
https://aclanthology.org/S19-2017
DOI:
10.18653/v1/S19-2017
Cite (ACL):
Dian Yu and Kenji Sagae. 2019. UC Davis at SemEval-2019 Task 1: DAG Semantic Parsing with Attention-based Decoder. In Proceedings of the 13th International Workshop on Semantic Evaluation, pages 119–124, Minneapolis, Minnesota, USA. Association for Computational Linguistics.
Cite (Informal):
UC Davis at SemEval-2019 Task 1: DAG Semantic Parsing with Attention-based Decoder (Yu & Sagae, SemEval 2019)
PDF:
https://preview.aclanthology.org/ingestion-script-update/S19-2017.pdf