Open-Domain Question Answering

Danqi Chen, Wen-tau Yih


Abstract
This tutorial provides a comprehensive and coherent overview of cutting-edge research in open-domain question answering (QA), the task of answering questions using a large collection of documents on diverse topics. We will first give a brief historical background, discussing the basic setup and core technical challenges of the research problem, and then describe modern datasets along with common evaluation metrics and benchmarks. The focus will then shift to cutting-edge models proposed for open-domain QA, including two-stage retriever-reader approaches, dense retrieval and end-to-end training, and retriever-free methods. Finally, we will cover hybrid approaches that use both text and large knowledge bases, and conclude the tutorial with important open questions. We hope that the tutorial will not only help the audience acquire up-to-date knowledge but also provide new perspectives that stimulate advances in open-domain QA research in the next phase.
Anthology ID:
2020.acl-tutorials.8
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics: Tutorial Abstracts
Month:
July
Year:
2020
Address:
Online
Editors:
Agata Savary, Yue Zhang
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
34–37
URL:
https://aclanthology.org/2020.acl-tutorials.8
DOI:
10.18653/v1/2020.acl-tutorials.8
Cite (ACL):
Danqi Chen and Wen-tau Yih. 2020. Open-Domain Question Answering. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics: Tutorial Abstracts, pages 34–37, Online. Association for Computational Linguistics.
Cite (Informal):
Open-Domain Question Answering (Chen & Yih, ACL 2020)
PDF:
https://preview.aclanthology.org/ingest-bitext-workshop/2020.acl-tutorials.8.pdf