DSM: Question Generation over Knowledge Base via Modeling Diverse Subgraphs with Meta-learner

Shasha Guo, Jing Zhang, Yanling Wang, Qianyi Zhang, Cuiping Li, Hong Chen


Abstract
Existing methods for knowledge base question generation (KBQG) learn a one-size-fits-all model by training on all subgraphs together, without distinguishing the diverse semantics of subgraphs. In this work, we show that leveraging past experience on semantically similar subgraphs can reduce the learning difficulty and improve the performance of KBQG models. To achieve this, we propose a novel approach that models Diverse Subgraphs with a Meta-learner (DSM). Specifically, we devise a graph contrastive learning-based retriever to identify semantically similar subgraphs, so that we can construct semantics-aware learning tasks for the meta-learner to acquire semantics-specific and semantics-agnostic knowledge on and across these tasks. Extensive experiments on two widely adopted KBQG benchmarks show that DSM achieves new state-of-the-art performance and benefits question answering as a means of data augmentation.
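To make the retrieval step in the abstract concrete, below is a minimal, hedged sketch (not the authors' implementation): assuming a graph contrastive encoder has already produced one embedding per subgraph, each subgraph's semantics-aware task is formed by retrieving its most similar subgraphs via cosine similarity. The encoder, embedding dimensions, and task size are placeholders for illustration only.

```python
# Illustrative sketch only (not the authors' code): given subgraph embeddings
# assumed to come from a graph contrastive encoder, build a semantics-aware
# "task" for each query subgraph by retrieving its nearest neighbors.
import numpy as np

def build_tasks(embeddings: np.ndarray, k: int = 4) -> np.ndarray:
    """Return, for each subgraph i, the indices of its k most similar subgraphs."""
    # L2-normalize so dot products equal cosine similarities.
    normed = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sims = normed @ normed.T
    np.fill_diagonal(sims, -np.inf)          # exclude the query subgraph itself
    return np.argsort(-sims, axis=1)[:, :k]  # k nearest neighbors per subgraph

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fake_embeddings = rng.normal(size=(10, 32))  # placeholder for encoder output
    tasks = build_tasks(fake_embeddings, k=3)
    print(tasks)  # each row: a semantics-aware support set for one subgraph
```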
Anthology ID:
2022.emnlp-main.281
Volume:
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
4194–4207
URL:
https://aclanthology.org/2022.emnlp-main.281
DOI:
10.18653/v1/2022.emnlp-main.281
Cite (ACL):
Shasha Guo, Jing Zhang, Yanling Wang, Qianyi Zhang, Cuiping Li, and Hong Chen. 2022. DSM: Question Generation over Knowledge Base via Modeling Diverse Subgraphs with Meta-learner. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 4194–4207, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
DSM: Question Generation over Knowledge Base via Modeling Diverse Subgraphs with Meta-learner (Guo et al., EMNLP 2022)
PDF:
https://aclanthology.org/2022.emnlp-main.281.pdf