Multi-level Alignment Pretraining for Multi-lingual Semantic Parsing

Bo Shao, Yeyun Gong, Weizhen Qi, Nan Duan, Xiaola Lin


Abstract
In this paper, we present a multi-level alignment pretraining method in a unified architecture for multi-lingual semantic parsing. In this architecture, we use an adversarial training method to align the representation spaces of different languages, and use sentence-level and word-level parallel corpora as supervision to align the semantics of different languages. Finally, we jointly train the multi-level alignment and semantic parsing tasks. We conduct experiments on the publicly available multi-lingual semantic parsing dataset ATIS and a newly constructed dataset. Experimental results show that our model outperforms state-of-the-art methods on both datasets.
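The abstract only sketches the architecture, so below is a minimal, hypothetical illustration of the two alignment signals it names: an adversarial loss that pushes the encoder to make representations language-indistinguishable, and a sentence-level loss that pulls parallel sentence pairs together. This is a PyTorch-style sketch under our own assumptions, not the authors' code; `LanguageDiscriminator`, `adversarial_alignment_losses`, and `sentence_alignment_loss` are illustrative names, and the paper's actual model and dimensions are not reproduced here.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LanguageDiscriminator(nn.Module):
    """Hypothetical discriminator: classifies which language a
    fixed-size sentence representation came from."""
    def __init__(self, hidden_dim: int = 256, num_languages: int = 2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, num_languages),
        )

    def forward(self, sent_repr: torch.Tensor) -> torch.Tensor:
        return self.net(sent_repr)

def adversarial_alignment_losses(disc, src_repr, tgt_repr):
    """Discriminator loss and the encoder's adversarial (negated) loss:
    the encoder is trained so the two languages become indistinguishable."""
    logits = disc(torch.cat([src_repr, tgt_repr], dim=0))
    labels = torch.cat([
        torch.zeros(src_repr.size(0), dtype=torch.long),  # source language
        torch.ones(tgt_repr.size(0), dtype=torch.long),   # target language
    ])
    d_loss = F.cross_entropy(logits, labels)
    return d_loss, -d_loss

def sentence_alignment_loss(src_repr, tgt_repr):
    """Sentence-level supervision from a parallel corpus: push the
    cosine similarity of each parallel pair toward 1."""
    target = torch.ones(src_repr.size(0), device=src_repr.device)
    return F.cosine_embedding_loss(src_repr, tgt_repr, target)

# Toy usage: random vectors stand in for real encoder outputs.
src = torch.randn(8, 256)
tgt = torch.randn(8, 256)
disc = LanguageDiscriminator()
d_loss, enc_adv_loss = adversarial_alignment_losses(disc, src, tgt)
align_loss = sentence_alignment_loss(src, tgt)
total_encoder_loss = enc_adv_loss + align_loss  # combined with the parsing loss in joint training
```

In the paper's framing, losses of this kind would first serve as pretraining and then be jointly optimized with the semantic parsing objective; the word-level alignment term the abstract mentions would be analogous but operate on token-level representations.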
Anthology ID: 2020.coling-main.289
Volume: Proceedings of the 28th International Conference on Computational Linguistics
Month: December
Year: 2020
Address: Barcelona, Spain (Online)
Editors: Donia Scott, Nuria Bel, Chengqing Zong
Venue: COLING
Publisher: International Committee on Computational Linguistics
Pages: 3246–3256
URL: https://aclanthology.org/2020.coling-main.289
DOI: 10.18653/v1/2020.coling-main.289
Cite (ACL): Bo Shao, Yeyun Gong, Weizhen Qi, Nan Duan, and Xiaola Lin. 2020. Multi-level Alignment Pretraining for Multi-lingual Semantic Parsing. In Proceedings of the 28th International Conference on Computational Linguistics, pages 3246–3256, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal): Multi-level Alignment Pretraining for Multi-lingual Semantic Parsing (Shao et al., COLING 2020)
PDF: https://preview.aclanthology.org/nschneid-patch-4/2020.coling-main.289.pdf
Data: ATIS