UniK-QA: Unified Representations of Structured and Unstructured Knowledge for Open-Domain Question Answering
Barlas Oguz | Xilun Chen | Vladimir Karpukhin | Stan Peshterliev | Dmytro Okhonko | Michael Schlichtkrull | Sonal Gupta | Yashar Mehdad | Scott Yih
Findings of the Association for Computational Linguistics: NAACL 2022
We study open-domain question answering with structured, unstructured, and semi-structured knowledge sources, including text, tables, lists, and knowledge bases. Departing from prior work, we propose a unifying approach that homogenizes all sources by reducing them to text, and then applies the retriever-reader model, which has so far been limited to text sources only. Our approach greatly improves results on knowledge-base QA tasks, by 11 points over the latest graph-based methods. More importantly, we demonstrate that our unified knowledge (UniK-QA) model is a simple yet effective way to combine heterogeneous sources of knowledge, advancing the state-of-the-art results on two popular question answering benchmarks, NaturalQuestions and WebQuestions, by 3.5 and 2.6 points, respectively. The code of UniK-QA is available at: https://github.com/facebookresearch/UniK-QA.
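The homogenization step the abstract describes — reducing tables and knowledge-base triples to plain text so a standard retriever-reader can index them alongside ordinary passages — can be illustrated with a minimal sketch. The function names and the exact verbalization templates below are illustrative assumptions, not the paper's actual implementation:

```python
def linearize_triple(subj, rel, obj):
    # Hypothetical verbalization: render a KB triple as a short sentence
    # so it can be indexed like any other text passage.
    return f"{subj} {rel.replace('_', ' ')} {obj}."

def linearize_table(title, header, rows):
    # Hypothetical flattening: one sentence per row, pairing each
    # header cell with the corresponding value.
    lines = [title + "."]
    for row in rows:
        lines.append(", ".join(f"{h} is {v}" for h, v in zip(header, row)) + ".")
    return " ".join(lines)

# Example: both structured sources become plain-text passages that a
# dense retriever (e.g. DPR) can score against a question.
kb_passage = linearize_triple("Paris", "capital_of", "France")
table_passage = linearize_table(
    "Tallest buildings",
    ["Name", "Height"],
    [["Burj Khalifa", "828 m"], ["Merdeka 118", "679 m"]],
)
```

Once every source is text, retrieval and reading need no structure-specific machinery; the heterogeneous corpus is just a larger pool of passages.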