A Pretraining Numerical Reasoning Model for Ordinal Constrained Question Answering on Knowledge Base
Yu Feng, Jing Zhang, Gaole He, Wayne Xin Zhao, Lemao Liu, Quan Liu, Cuiping Li, Hong Chen
Findings of the Association for Computational Linguistics: EMNLP 2021
Knowledge Base Question Answering (KBQA) aims to answer natural language questions posed over knowledge bases (KBs). This paper aims to empower IR-based KBQA models with the ability to perform numerical reasoning for answering ordinal constrained questions. A major challenge is the lack of explicit annotations of numerical properties. To address this challenge, we propose a pretrained numerical reasoning model consisting of NumGNN and NumTransformer, guided by explicit self-supervision signals. The two modules are pretrained to encode the magnitude and ordinal properties of numbers, respectively, and can serve as model-agnostic plugins for any IR-based KBQA model to enhance its numerical reasoning ability. Extensive experiments on two KBQA benchmarks verify the effectiveness of our method in enhancing the numerical reasoning ability of IR-based KBQA models.
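The self-supervision signals mentioned in the abstract can be illustrated with a minimal sketch. The helper names below (`magnitude_pairs`, `ordinal_labels`) are hypothetical and not from the paper; they only show one plausible way to derive magnitude-comparison and ordinal-rank labels directly from raw numbers, without any human annotation, as pretraining targets for modules like NumGNN and NumTransformer.

```python
import random

def magnitude_pairs(numbers, n_pairs=4, seed=0):
    """Hypothetical magnitude signal: sample pairs of numbers and
    label which one is larger, so an encoder can be pretrained to
    compare magnitudes without manual annotation."""
    rng = random.Random(seed)
    pairs = []
    for _ in range(n_pairs):
        a, b = rng.sample(numbers, 2)
        pairs.append((a, b, int(a > b)))  # label 1 iff a > b
    return pairs

def ordinal_labels(numbers):
    """Hypothetical ordinal signal: the rank of each number within
    the list, a target an ordinal-reasoning module could predict."""
    order = sorted(numbers)
    return [order.index(x) for x in numbers]

print(ordinal_labels([30.0, 10.0, 20.0]))  # ranks of each value
print(magnitude_pairs([1.0, 2.0, 3.0, 4.0]))
```

Both label sets are computed purely from the numbers themselves, which is what makes this kind of pretraining self-supervised.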