DSM: Question Generation over Knowledge Base via Modeling Diverse Subgraphs with Meta-learner
Shasha Guo | Jing Zhang | Yanling Wang | Qianyi Zhang | Cuiping Li | Hong Chen
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Existing methods for knowledge base question generation (KBQG) learn a one-size-fits-all model by training on all subgraphs together, without distinguishing the diverse semantics of subgraphs. In this work, we show that drawing on past experience with semantically similar subgraphs can reduce the learning difficulty and improve the performance of KBQG models. To this end, we propose a novel approach that models diverse subgraphs with a meta-learner (DSM). Specifically, we devise a graph contrastive learning-based retriever to identify semantically similar subgraphs, which allows us to construct semantics-aware learning tasks for the meta-learner to acquire semantics-specific and semantics-agnostic knowledge on and across these tasks. Extensive experiments on two widely adopted KBQG benchmarks show that DSM achieves new state-of-the-art performance and, used as a means of data augmentation, also benefits question answering tasks.
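As a rough illustration only (not the authors' released implementation), the sketch below shows how the retrieval step described in the abstract might group subgraphs into semantics-aware tasks: each subgraph is represented by an embedding (assumed here to come from a contrastive encoder), and its nearest neighbors under cosine similarity form the support set for one meta-learning task. All names and parameters (`build_tasks`, `k`, the random embeddings) are hypothetical.

```python
# Minimal sketch, assuming subgraph embeddings from a contrastive encoder are
# already available. Each task pairs a query subgraph with its k most similar
# subgraphs, so the meta-learner trains on semantically coherent groups.
import numpy as np

def cosine_sim(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Pairwise cosine similarity between rows of a and rows of b."""
    a = a / np.linalg.norm(a, axis=1, keepdims=True)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    return a @ b.T

def build_tasks(embeddings: np.ndarray, k: int = 4):
    """For each subgraph, retrieve its k nearest neighbors to form the
    support set of one semantics-aware task (indices only)."""
    sims = cosine_sim(embeddings, embeddings)
    np.fill_diagonal(sims, -np.inf)            # exclude the subgraph itself
    neighbors = np.argsort(-sims, axis=1)[:, :k]
    return [{"query": i, "support": nbrs.tolist()}
            for i, nbrs in enumerate(neighbors)]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    subgraph_embeddings = rng.normal(size=(10, 64))  # stand-in for contrastive embeddings
    tasks = build_tasks(subgraph_embeddings, k=3)
    print(tasks[0])                                   # e.g. {'query': 0, 'support': [...]}
```

The meta-learner would then run its inner-loop adaptation on each task's support subgraphs before evaluating on the query subgraph, which is where the semantics-specific versus semantics-agnostic split described in the abstract comes into play.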