FQuAD: French Question Answering Dataset
Martin d’Hoffschmidt, Wacim Belblidia, Quentin Heinrich, Tom Brendlé, Maxime Vidal
Abstract
Recent advances in the field of language modeling have improved state-of-the-art results on many Natural Language Processing tasks. Among them, Reading Comprehension has made significant progress over the past few years. However, most results are reported in English since labeled resources available in other languages, such as French, remain scarce. In the present work, we introduce the French Question Answering Dataset (FQuAD). FQuAD is a French Native Reading Comprehension dataset of questions and answers on a set of Wikipedia articles that consists of 25,000+ samples for the 1.0 version and 60,000+ samples for the 1.1 version. We train a baseline model which achieves an F1 score of 92.2 and an exact match ratio of 82.1 on the test set. In an effort to track the progress of French Question Answering models, we propose a leaderboard and we have made the 1.0 version of our dataset freely available at https://illuin-tech.github.io/FQuAD-explorer/.
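The reported scores follow the SQuAD-style evaluation protocol: exact match compares the normalized predicted span to the normalized gold answer, and F1 measures token overlap between the two. Below is a minimal sketch of these two metrics, assuming a simplified normalization step; the official FQuAD evaluation script may differ in details such as how French articles and accents are handled, and the example prediction/answer pair is purely illustrative.

```python
# Sketch of SQuAD-style exact match and token-level F1, the metrics reported
# for the FQuAD baseline. Normalization is simplified (lowercasing, punctuation
# and whitespace stripping) and may differ from the official evaluation script.
import re
import string
from collections import Counter


def normalize(text: str) -> str:
    """Lowercase, drop punctuation, and collapse whitespace (simplified)."""
    text = text.lower()
    text = "".join(ch for ch in text if ch not in set(string.punctuation))
    return re.sub(r"\s+", " ", text).strip()


def exact_match(prediction: str, ground_truth: str) -> float:
    """1.0 if the normalized strings are identical, else 0.0."""
    return float(normalize(prediction) == normalize(ground_truth))


def f1_score(prediction: str, ground_truth: str) -> float:
    """Token-level F1 between the normalized prediction and gold answer."""
    pred_tokens = normalize(prediction).split()
    gold_tokens = normalize(ground_truth).split()
    common = Counter(pred_tokens) & Counter(gold_tokens)
    num_same = sum(common.values())
    if num_same == 0:
        return 0.0
    precision = num_same / len(pred_tokens)
    recall = num_same / len(gold_tokens)
    return 2 * precision * recall / (precision + recall)


if __name__ == "__main__":
    # Hypothetical prediction/answer pair, for illustration only.
    print(exact_match("la Tour Eiffel", "La tour Eiffel"))  # 1.0
    print(f1_score("la Tour Eiffel", "tour Eiffel"))        # 0.8
```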
- Anthology ID: 2020.findings-emnlp.107
- Volume: Findings of the Association for Computational Linguistics: EMNLP 2020
- Month: November
- Year: 2020
- Address: Online
- Editors: Trevor Cohn, Yulan He, Yang Liu
- Venue: Findings
- Publisher: Association for Computational Linguistics
- Pages: 1193–1208
- URL: https://aclanthology.org/2020.findings-emnlp.107
- DOI: 10.18653/v1/2020.findings-emnlp.107
- Cite (ACL): Martin d’Hoffschmidt, Wacim Belblidia, Quentin Heinrich, Tom Brendlé, and Maxime Vidal. 2020. FQuAD: French Question Answering Dataset. In Findings of the Association for Computational Linguistics: EMNLP 2020, pages 1193–1208, Online. Association for Computational Linguistics.
- Cite (Informal): FQuAD: French Question Answering Dataset (d’Hoffschmidt et al., Findings 2020)
- PDF: https://aclanthology.org/2020.findings-emnlp.107.pdf
- Data: FQuAD, CoQA, HotpotQA, KorQuAD, MLQA, NewsQA, QuAC, SQuAD, SberQuAD, XQuAD