Abstract
In neural semantic parsing, sentences are mapped to meaning representations using encoder-decoder frameworks. In this paper, we propose to apply the Transformer architecture, instead of recurrent neural networks, to this task. Experiments on two datasets from different domains and with different levels of difficulty show that our model achieves better results than strong baselines in certain settings and competitive results across all our experiments.

- Anthology ID:
- 2020.alta-1.16
- Volume:
- Proceedings of the 18th Annual Workshop of the Australasian Language Technology Association
- Month:
- December
- Year:
- 2020
- Address:
- Virtual Workshop
- Editors:
- Maria Kim, Daniel Beck, Meladel Mistica
- Venue:
- ALTA
- Publisher:
- Australasian Language Technology Association
- Pages:
- 121–126
- URL:
- https://aclanthology.org/2020.alta-1.16
- Cite (ACL):
- Gabriela Ferraro and Hanna Suominen. 2020. Transformer Semantic Parsing. In Proceedings of the 18th Annual Workshop of the Australasian Language Technology Association, pages 121–126, Virtual Workshop. Australasian Language Technology Association.
- Cite (Informal):
- Transformer Semantic Parsing (Ferraro & Suominen, ALTA 2020)
- PDF:
- https://preview.aclanthology.org/ingest-2024-clasp/2020.alta-1.16.pdf
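The abstract contrasts the Transformer with recurrent encoders; the mechanism the Transformer substitutes for recurrence is self-attention, which lets every token attend to every other token in one step instead of processing the sentence sequentially. As an illustration only (not the paper's implementation, which is not shown here), below is a minimal pure-Python sketch of scaled dot-product attention over toy vectors; all numbers are made up for demonstration.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def attention(queries, keys, values):
    """Scaled dot-product attention: each query attends over ALL keys
    at once, unlike an RNN, which consumes tokens one at a time."""
    d_k = len(keys[0])
    out = []
    for q in queries:
        scores = [dot(q, k) / math.sqrt(d_k) for k in keys]
        weights = softmax(scores)
        # Each output row is a convex combination of the value rows.
        out.append([sum(w * v[i] for w, v in zip(weights, values))
                    for i in range(len(values[0]))])
    return out

# Toy example: 2 tokens, hidden dimension 2 (illustrative numbers only).
Q = [[1.0, 0.0], [0.0, 1.0]]
K = [[1.0, 0.0], [0.0, 1.0]]
V = [[1.0, 2.0], [3.0, 4.0]]
print(attention(Q, K, V))
```

In a full encoder-decoder parser of the kind the abstract describes, stacks of such attention layers (plus feed-forward layers and positional encodings) would encode the sentence and decode the meaning representation token by token.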