Mintaka: A Complex, Natural, and Multilingual Dataset for End-to-End Question Answering

Priyanka Sen, Alham Fikri Aji, Amir Saffari


Abstract
We introduce Mintaka, a complex, natural, and multilingual dataset designed for experimenting with end-to-end question-answering models. Mintaka is composed of 20,000 question-answer pairs collected in English, annotated with Wikidata entities, and translated into Arabic, French, German, Hindi, Italian, Japanese, Portuguese, and Spanish for a total of 180,000 samples. Mintaka includes 8 types of complex questions, including superlative, intersection, and multi-hop questions, which were naturally elicited from crowd workers. We run baselines over Mintaka, the best of which achieves 38% hits@1 in English and 31% hits@1 multilingually, showing that existing models have room for improvement. We release Mintaka at https://github.com/amazon-research/mintaka.
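For reference, a minimal sketch of the hits@1 metric reported in the abstract, i.e. the fraction of questions for which a model's top-ranked answer matches a gold answer. The function name and toy inputs are illustrative and not taken from the paper's released evaluation code.

```python
from typing import List, Set


def hits_at_1(ranked_predictions: List[List[str]], gold_answers: List[Set[str]]) -> float:
    """hits@1 = (# questions whose top-ranked prediction is a gold answer) / (# questions)."""
    assert len(ranked_predictions) == len(gold_answers)
    correct = sum(
        1
        for preds, gold in zip(ranked_predictions, gold_answers)
        if preds and preds[0] in gold
    )
    return correct / len(gold_answers)


if __name__ == "__main__":
    # Two toy questions: the model answers the first correctly and the second incorrectly.
    preds = [["Neil Armstrong", "Buzz Aldrin"], ["Mars"]]
    gold = [{"Neil Armstrong"}, {"Jupiter"}]
    print(f"hits@1: {hits_at_1(preds, gold):.2f}")  # 0.50
```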
Anthology ID:
2022.coling-1.138
Volume:
Proceedings of the 29th International Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Editors:
Nicoletta Calzolari, Chu-Ren Huang, Hansaem Kim, James Pustejovsky, Leo Wanner, Key-Sun Choi, Pum-Mo Ryu, Hsin-Hsi Chen, Lucia Donatelli, Heng Ji, Sadao Kurohashi, Patrizia Paggio, Nianwen Xue, Seokhwan Kim, Younggyun Hahm, Zhong He, Tony Kyungil Lee, Enrico Santus, Francis Bond, Seung-Hoon Na
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
1604–1619
URL:
https://aclanthology.org/2022.coling-1.138
Cite (ACL):
Priyanka Sen, Alham Fikri Aji, and Amir Saffari. 2022. Mintaka: A Complex, Natural, and Multilingual Dataset for End-to-End Question Answering. In Proceedings of the 29th International Conference on Computational Linguistics, pages 1604–1619, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
Mintaka: A Complex, Natural, and Multilingual Dataset for End-to-End Question Answering (Sen et al., COLING 2022)
PDF:
https://preview.aclanthology.org/naacl24-info/2022.coling-1.138.pdf
Code
amazon-research/mintaka
Data
Mintaka, ComplexWebQuestions, DROP, HotpotQA, Natural Questions, SQuAD, SimpleQuestions, WebQuestions, WebQuestionsSP, XQuAD