Abstract
We present a memory-based model for context-dependent semantic parsing. Previous approaches focus on enabling the decoder to copy or modify the parse from the previous utterance, assuming there is a dependency between the current and previous parses. In this work, we propose to represent contextual information using an external memory. We learn a context memory controller that manages the memory by maintaining the cumulative meaning of sequential user utterances. We evaluate our approach on three semantic parsing benchmarks. Experimental results show that our model can better process context-dependent information and demonstrates improved performance without using task-specific decoders.
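The abstract describes the approach only at a high level. As a rough illustration of what an external context memory with a gated write controller might look like, the following minimal PyTorch sketch maintains a set of memory slots that are updated after each utterance and read via attention; the class name `ContextMemory`, the slot layout, and the gating scheme are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ContextMemory(nn.Module):
    """Illustrative external memory with a gated write controller.

    After each user utterance, the controller decides how much of each
    memory slot to overwrite with the new utterance representation, so
    the memory accumulates the meaning of the interaction so far.
    (Hypothetical sketch; not the model described in the paper.)
    """

    def __init__(self, num_slots: int, dim: int):
        super().__init__()
        self.num_slots = num_slots
        self.dim = dim
        # Controller networks: per-slot write gate and candidate content.
        self.gate = nn.Linear(2 * dim, 1)
        self.write = nn.Linear(2 * dim, dim)

    def init_memory(self, batch_size: int) -> torch.Tensor:
        # Start with an empty (zero) memory for a new interaction.
        return torch.zeros(batch_size, self.num_slots, self.dim)

    def update(self, memory: torch.Tensor, utterance: torch.Tensor) -> torch.Tensor:
        """Write the current utterance encoding into memory.

        memory:    (batch, num_slots, dim)
        utterance: (batch, dim) encoding of the latest utterance
        """
        u = utterance.unsqueeze(1).expand(-1, self.num_slots, -1)
        joint = torch.cat([memory, u], dim=-1)
        g = torch.sigmoid(self.gate(joint))        # how much of each slot to rewrite
        candidate = torch.tanh(self.write(joint))  # proposed new slot content
        return (1.0 - g) * memory + g * candidate

    def read(self, memory: torch.Tensor, query: torch.Tensor) -> torch.Tensor:
        """Attention read over memory slots given a decoder query."""
        scores = torch.einsum("bd,bsd->bs", query, memory) / self.dim ** 0.5
        weights = F.softmax(scores, dim=-1)
        return torch.einsum("bs,bsd->bd", weights, memory)


if __name__ == "__main__":
    mem_module = ContextMemory(num_slots=4, dim=8)
    memory = mem_module.init_memory(batch_size=2)
    for _ in range(3):                  # three consecutive utterances
        utt = torch.randn(2, 8)         # stand-in for an utterance encoder
        memory = mem_module.update(memory, utt)
    context = mem_module.read(memory, torch.randn(2, 8))
    print(context.shape)                # torch.Size([2, 8])
```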
- Anthology ID: 2021.tacl-1.71
- Volume: Transactions of the Association for Computational Linguistics, Volume 9
- Year: 2021
- Address: Cambridge, MA
- Editors: Brian Roark, Ani Nenkova
- Venue: TACL
- Publisher: MIT Press
- Pages: 1197–1212
- URL: https://aclanthology.org/2021.tacl-1.71
- DOI: 10.1162/tacl_a_00422
- Cite (ACL): Parag Jain and Mirella Lapata. 2021. Memory-Based Semantic Parsing. Transactions of the Association for Computational Linguistics, 9:1197–1212.
- Cite (Informal): Memory-Based Semantic Parsing (Jain & Lapata, TACL 2021)
- PDF: https://preview.aclanthology.org/nschneid-patch-4/2021.tacl-1.71.pdf