LAGr: Label Aligned Graphs for Better Systematic Generalization in Semantic Parsing

Dora Jambor, Dzmitry Bahdanau


Abstract
Semantic parsing is the task of producing structured meaning representations for natural language sentences. Recent research has pointed out that the commonly used sequence-to-sequence (seq2seq) semantic parsers struggle to generalize systematically, i.e., to handle examples that require recombining known knowledge in novel settings. In this work, we show that better systematic generalization can be achieved by producing the meaning representation directly as a graph rather than as a sequence. To this end, we propose LAGr (Label Aligned Graphs), a general framework that produces semantic parses by independently predicting node and edge labels for a complete multi-layer input-aligned graph. The strongly-supervised LAGr algorithm requires aligned graphs as inputs, whereas weakly-supervised LAGr infers alignments for originally unaligned target graphs using approximate maximum-a-posteriori inference. Experiments demonstrate that LAGr achieves significant improvements in systematic generalization over baseline seq2seq parsers in both strongly- and weakly-supervised settings.
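As a rough illustration of the graph-prediction idea summarized above, the sketch below shows an output head that assigns a node label to every (graph layer, token) slot and an edge label to every ordered pair of slots. This is not taken from the paper or the elementai/lagr code base; the class name, the bilinear edge scorer, and all dimensions are illustrative assumptions.

import torch
import torch.nn as nn

class LabelAlignedGraphHead(nn.Module):
    """Hypothetical head: node labels per (layer, token) slot, edge labels per slot pair."""

    def __init__(self, hidden_dim, num_graph_layers, num_node_labels, num_edge_labels):
        super().__init__()
        self.num_graph_layers = num_graph_layers
        # One node-label classifier per graph layer, applied independently to each token.
        self.node_classifiers = nn.ModuleList(
            [nn.Linear(hidden_dim, num_node_labels) for _ in range(num_graph_layers)]
        )
        # Bilinear scorer assigning an edge label (a "null" label can mean no edge)
        # to every ordered pair of graph slots.
        self.edge_scorer = nn.Bilinear(hidden_dim, hidden_dim, num_edge_labels)

    def forward(self, states):
        # states: (seq_len, hidden_dim) encoder outputs for one sentence.
        seq_len, hidden_dim = states.shape
        # Node label logits: (num_graph_layers, seq_len, num_node_labels).
        node_logits = torch.stack([clf(states) for clf in self.node_classifiers])
        # Each of the num_graph_layers * seq_len slots reuses its token's encoding.
        slots = states.unsqueeze(0).expand(self.num_graph_layers, -1, -1)
        slots = slots.reshape(-1, hidden_dim)
        n = slots.size(0)
        src = slots.unsqueeze(1).expand(n, n, hidden_dim).reshape(-1, hidden_dim)
        tgt = slots.unsqueeze(0).expand(n, n, hidden_dim).reshape(-1, hidden_dim)
        # Edge label logits: (n, n, num_edge_labels), predicted independently per pair.
        edge_logits = self.edge_scorer(src, tgt).reshape(n, n, -1)
        return node_logits, edge_logits

# Example: a 5-token sentence with 2 graph layers gives 10 slots, so the edge
# logits have shape (10, 10, num_edge_labels) and node logits (2, 5, num_node_labels).
head = LabelAlignedGraphHead(hidden_dim=64, num_graph_layers=2,
                             num_node_labels=50, num_edge_labels=10)
node_logits, edge_logits = head(torch.randn(5, 64))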
Anthology ID:
2022.acl-long.233
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
3295–3308
URL:
https://aclanthology.org/2022.acl-long.233
DOI:
10.18653/v1/2022.acl-long.233
Cite (ACL):
Dora Jambor and Dzmitry Bahdanau. 2022. LAGr: Label Aligned Graphs for Better Systematic Generalization in Semantic Parsing. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 3295–3308, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
LAGr: Label Aligned Graphs for Better Systematic Generalization in Semantic Parsing (Jambor & Bahdanau, ACL 2022)
PDF:
https://preview.aclanthology.org/ingestion-script-update/2022.acl-long.233.pdf
Video:
https://preview.aclanthology.org/ingestion-script-update/2022.acl-long.233.mp4
Code:
elementai/lagr
Data:
CFQ