Abstract
Off-topic spoken response detection, the task of predicting whether a response is off-topic for the corresponding prompt, is important for an automated speaking assessment system. In many real-world educational applications, off-topic spoken response detectors are required to achieve high recall for off-topic responses not only on prompts seen during training but also on unseen prompts. In this paper, we propose a novel approach to off-topic spoken response detection with high off-topic recall on both seen and unseen prompts. We introduce a new model, the Gated Convolutional Bidirectional Attention-based Model (GCBiA), which applies a bi-attention mechanism and convolutions to extract topic words of prompts and key phrases of responses, and introduces gated units and residual connections between major layers to better represent the relevance of responses and prompts. Moreover, a new negative sampling method is proposed to augment the training data. Experimental results demonstrate that our novel approach achieves significant improvements in detecting off-topic responses with extremely high on-topic recall, for both seen and unseen prompts.
- Anthology ID: 2020.acl-main.56
- Volume: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
- Month: July
- Year: 2020
- Address: Online
- Venue: ACL
- Publisher: Association for Computational Linguistics
- Pages: 600–608
- URL: https://aclanthology.org/2020.acl-main.56
- DOI: 10.18653/v1/2020.acl-main.56
- Cite (ACL): Yefei Zha, Ruobing Li, and Hui Lin. 2020. Gated Convolutional Bidirectional Attention-based Model for Off-topic Spoken Response Detection. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 600–608, Online. Association for Computational Linguistics.
- Cite (Informal): Gated Convolutional Bidirectional Attention-based Model for Off-topic Spoken Response Detection (Zha et al., ACL 2020)
- PDF: https://preview.aclanthology.org/ingestion-script-update/2020.acl-main.56.pdf
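The abstract describes a bi-attention mechanism that relates prompt words to response words. The following is only a rough, hypothetical sketch of what a generic BiDAF-style bi-attention step over prompt and response embeddings looks like; the function name, shapes, and use of dot-product similarity are illustrative assumptions, not the authors' actual GCBiA implementation (which the paper itself specifies, including convolutions, gated units, and residual connections omitted here):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def bi_attention(prompt, response):
    """Illustrative bi-attention (not the paper's exact formulation).

    prompt:   (m, d) array of prompt word embeddings
    response: (n, d) array of response word embeddings
    Returns prompt-aware response vectors and response-aware prompt vectors.
    """
    # Similarity between every response word and every prompt word.
    sim = response @ prompt.T                 # (n, m)
    # Response-to-prompt attention: each response word attends
    # over the prompt words, pulling in relevant topic words.
    r2p = softmax(sim, axis=1) @ prompt       # (n, d)
    # Prompt-to-response attention: each prompt word attends
    # over the response words, pulling in relevant key phrases.
    p2r = softmax(sim.T, axis=1) @ response   # (m, d)
    return r2p, p2r

# Example with random embeddings: 4 prompt words, 6 response words, dim 8.
prompt = np.random.rand(4, 8)
response = np.random.rand(6, 8)
r2p, p2r = bi_attention(prompt, response)
```

The two attended representations can then be combined with the original embeddings and fed to downstream layers that score prompt–response relevance; how GCBiA does this in detail is described in the paper.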