Evaluating Natural Language Understanding Services for Conversational Question Answering Systems
Daniel Braun, Adrian Hernandez Mendez, Florian Matthes, Manfred Langen
Abstract
Conversational interfaces have recently gained a lot of attention. One of the reasons for the current hype is the fact that chatbots (one particularly popular form of conversational interfaces) can nowadays be created without any programming knowledge, thanks to different toolkits and so-called Natural Language Understanding (NLU) services. While these NLU services are already widely used in both industry and science, they have so far not been analysed systematically. In this paper, we present a method to evaluate the classification performance of NLU services. Moreover, we present two new corpora, one consisting of annotated questions and one consisting of annotated questions with the corresponding answers. Based on these corpora, we conduct an evaluation of some of the most popular NLU services. Thereby, we want to enable both researchers and companies to make more informed decisions about which service they should use.
- Anthology ID: W17-5522
- Volume: Proceedings of the 18th Annual SIGdial Meeting on Discourse and Dialogue
- Month: August
- Year: 2017
- Address: Saarbrücken, Germany
- Editors: Kristiina Jokinen, Manfred Stede, David DeVault, Annie Louis
- Venue: SIGDIAL
- SIG: SIGDIAL
- Publisher: Association for Computational Linguistics
- Pages: 174–185
- URL: https://aclanthology.org/W17-5522
- DOI: 10.18653/v1/W17-5522
- Cite (ACL): Daniel Braun, Adrian Hernandez Mendez, Florian Matthes, and Manfred Langen. 2017. Evaluating Natural Language Understanding Services for Conversational Question Answering Systems. In Proceedings of the 18th Annual SIGdial Meeting on Discourse and Dialogue, pages 174–185, Saarbrücken, Germany. Association for Computational Linguistics.
- Cite (Informal): Evaluating Natural Language Understanding Services for Conversational Question Answering Systems (Braun et al., SIGDIAL 2017)
- PDF: https://preview.aclanthology.org/naacl-24-ws-corrections/W17-5522.pdf
- Code: sebischair/NLU-Evaluation-Scripts
- Data: NLU Evaluation Corpora
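
The linked repository contains the authors' actual evaluation scripts. As a rough, self-contained illustration of the kind of evaluation the abstract describes, the Python sketch below scores an NLU service's intent predictions against a corpus of annotated questions using per-intent precision, recall, and F1. The corpus entries, intent labels, and the `query_nlu_service` placeholder are hypothetical, not taken from the paper or the repository.

```python
# Minimal sketch of an intent-classification evaluation for an NLU service.
# Corpus format, intent labels, and the query function are hypothetical
# placeholders; the published scripts (sebischair/NLU-Evaluation-Scripts)
# differ in detail.
from collections import Counter

# Hypothetical annotated corpus: (question, gold intent) pairs.
corpus = [
    ("When is the next train to the airport?", "DepartureTime"),
    ("Which platform does the S1 leave from?", "FindConnection"),
    ("How do I get from Garching to Marienplatz?", "FindConnection"),
]

def query_nlu_service(question):
    """Placeholder for a call to an NLU service (e.g. its HTTP API).
    A real implementation would send the question to the service and
    parse the predicted intent out of the JSON response."""
    return "FindConnection"  # dummy prediction for illustration

def evaluate(corpus):
    """Compute per-intent precision, recall, and F1 from gold/predicted pairs."""
    tp, fp, fn = Counter(), Counter(), Counter()
    for question, gold in corpus:
        pred = query_nlu_service(question)
        if pred == gold:
            tp[gold] += 1
        else:
            fp[pred] += 1  # predicted intent gets a false positive
            fn[gold] += 1  # gold intent gets a false negative
    scores = {}
    for intent in set(tp) | set(fp) | set(fn):
        p = tp[intent] / (tp[intent] + fp[intent]) if tp[intent] + fp[intent] else 0.0
        r = tp[intent] / (tp[intent] + fn[intent]) if tp[intent] + fn[intent] else 0.0
        f1 = 2 * p * r / (p + r) if p + r else 0.0
        scores[intent] = {"precision": p, "recall": r, "f1": f1}
    return scores

if __name__ == "__main__":
    for intent, s in evaluate(corpus).items():
        print(f"{intent}: P={s['precision']:.2f} R={s['recall']:.2f} F1={s['f1']:.2f}")
```

Running the same loop once per NLU service (swapping only the query function) yields directly comparable per-intent scores, which is the kind of side-by-side comparison the paper reports for the services it evaluates.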