Compositional Generalization for Neural Semantic Parsing via Span-level Supervised Attention

Pengcheng Yin, Hao Fang, Graham Neubig, Adam Pauls, Emmanouil Antonios Platanios, Yu Su, Sam Thomson, Jacob Andreas


Abstract
We describe a span-level supervised attention loss that improves compositional generalization in semantic parsers. Our approach builds on existing losses that encourage attention maps in neural sequence-to-sequence models to imitate the output of classical word alignment algorithms. Where past work has used word-level alignments, we focus on spans; borrowing ideas from phrase-based machine translation, we align subtrees in semantic parses to spans of input sentences, and encourage neural attention mechanisms to mimic these alignments. This method improves the performance of transformers, RNNs, and structured decoders on three benchmarks of compositional generalization.
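The abstract describes the method only at a high level; the paper gives the exact formulation. As a rough, illustrative sketch only, the PyTorch snippet below shows one common way such a span-level supervised attention loss can be implemented: for each decoder step aligned to a source span, penalize the negative log of the attention mass that falls inside that span. The tensor names, the alignment format, and the log-mass penalty are assumptions made here for illustration, not the authors' implementation.

import torch

def span_attention_loss(attn, span_alignments, eps=1e-8):
    """Span-level supervised attention loss (illustrative sketch).

    attn: [batch, tgt_len, src_len] decoder attention weights
          (each row is a distribution over source positions).
    span_alignments: per-example lists of (tgt_pos, src_start, src_end)
          triples, meaning the output symbol at tgt_pos is aligned to the
          source span [src_start, src_end), e.g. a parse subtree aligned
          to a phrase of the input sentence.
    Returns the mean negative log attention mass on the aligned spans.
    """
    losses = []
    for b, triples in enumerate(span_alignments):
        for tgt_pos, src_start, src_end in triples:
            # Attention mass the decoder places on the aligned source span.
            mass = attn[b, tgt_pos, src_start:src_end].sum()
            losses.append(-torch.log(mass + eps))
    if not losses:
        return attn.new_zeros(())
    return torch.stack(losses).mean()

# Usage (hypothetical weighting term lambda_attn):
# total_loss = nll_loss + lambda_attn * span_attention_loss(attn, alignments)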
Anthology ID:
2021.naacl-main.225
Volume:
Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
June
Year:
2021
Address:
Online
Editors:
Kristina Toutanova, Anna Rumshisky, Luke Zettlemoyer, Dilek Hakkani-Tur, Iz Beltagy, Steven Bethard, Ryan Cotterell, Tanmoy Chakraborty, Yichao Zhou
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
2810–2823
URL:
https://aclanthology.org/2021.naacl-main.225
DOI:
10.18653/v1/2021.naacl-main.225
Bibkey:
Cite (ACL):
Pengcheng Yin, Hao Fang, Graham Neubig, Adam Pauls, Emmanouil Antonios Platanios, Yu Su, Sam Thomson, and Jacob Andreas. 2021. Compositional Generalization for Neural Semantic Parsing via Span-level Supervised Attention. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 2810–2823, Online. Association for Computational Linguistics.
Cite (Informal):
Compositional Generalization for Neural Semantic Parsing via Span-level Supervised Attention (Yin et al., NAACL 2021)
PDF:
https://preview.aclanthology.org/add_acl24_videos/2021.naacl-main.225.pdf
Video:
https://preview.aclanthology.org/add_acl24_videos/2021.naacl-main.225.mp4
Data
CFQ