Abstract
We explore the use of large language models (LLMs) for zero-shot semantic parsing. Semantic parsing involves mapping natural language utterances to task-specific meaning representations. LLMs are generally trained on publicly available text and code and cannot be expected to directly generalize to domain-specific parsing tasks in a zero-shot setting. In this work, we propose ZEROTOP, a zero-shot task-oriented parsing method that decomposes the semantic parsing problem into a set of abstractive and extractive question-answering (QA) problems. For each utterance, we prompt the LLM with questions corresponding to its top-level intent and a set of slots, and use the LLM generations to construct the target meaning representation. We observe that current LLMs fail to detect unanswerable questions and, as a result, cannot handle questions corresponding to missing slots. We address this by fine-tuning a language model on public QA datasets using synthetic negative samples. Experimental results show that our QA-based decomposition paired with the fine-tuned LLM can zero-shot parse ≈ 16% of utterances in the MTOP dataset.
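To make the QA-style decomposition concrete, here is a minimal sketch of the pipeline the abstract describes: one question per candidate intent, one question per slot, and answers assembled into a TOP-style parse. The function name `ask_llm`, the example intent and slot questions, and the bracketed output format are illustrative assumptions, not the paper's actual prompts or code.

```python
# Minimal sketch of ZEROTOP-style QA decomposition (hypothetical names).
# All questions and the intent/slot inventory below are made up for
# illustration; the paper's real prompts and schemas may differ.

INTENT_QUESTIONS = {
    "CREATE_REMINDER": "Is the user asking to set a reminder?",
}

SLOT_QUESTIONS = {
    "CREATE_REMINDER": {
        "PERSON_REMINDED": "Who should be reminded?",
        "TODO": "What should they be reminded to do?",
        "DATE_TIME": "When should the reminder happen?",
    },
}

def ask_llm(question: str, utterance: str) -> str:
    """Placeholder for an LLM QA call. Per the abstract, the model is
    fine-tuned with synthetic negatives so that unanswerable questions
    (missing slots) come back empty rather than hallucinated."""
    raise NotImplementedError

def parse(utterance: str) -> str:
    # 1. Detect the top-level intent, e.g. by asking one question per
    #    candidate intent and keeping the one answered affirmatively.
    intent = next(
        i for i, q in INTENT_QUESTIONS.items()
        if ask_llm(q, utterance).lower().startswith("yes")
    )
    # 2. Ask one extractive question per slot; empty answers mean the
    #    slot is absent and is simply dropped from the parse.
    slots = []
    for slot, question in SLOT_QUESTIONS[intent].items():
        answer = ask_llm(question, utterance)
        if answer:
            slots.append(f"[SL:{slot} {answer}]")
    # 3. Assemble a TOP-style meaning representation.
    return f"[IN:{intent} {' '.join(slots)}]"
```

The design hinge this sketch highlights is step 2: off-the-shelf LLMs tend to answer every question, so without the fine-tuning on synthetic negative samples the parser would invent values for slots the utterance never mentions.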
- Anthology ID: 2023.emnlp-main.354
- Volume: Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
- Month: December
- Year: 2023
- Address: Singapore
- Editors: Houda Bouamor, Juan Pino, Kalika Bali
- Venue: EMNLP
- Publisher: Association for Computational Linguistics
- Pages: 5792–5799
- URL: https://aclanthology.org/2023.emnlp-main.354
- DOI: 10.18653/v1/2023.emnlp-main.354
- Cite (ACL): Dheeraj Mekala, Jason Wolfe, and Subhro Roy. 2023. ZEROTOP: Zero-Shot Task-Oriented Semantic Parsing using Large Language Models. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 5792–5799, Singapore. Association for Computational Linguistics.
- Cite (Informal): ZEROTOP: Zero-Shot Task-Oriented Semantic Parsing using Large Language Models (Mekala et al., EMNLP 2023)
- PDF: https://preview.aclanthology.org/ingest-acl-2023-videos/2023.emnlp-main.354.pdf