When and Why a Model Fails? A Human-in-the-loop Error Detection Framework for Sentiment Analysis

Zhe Liu, Yufan Guo, Jalal Mahmud


Abstract
Although deep neural networks have been widely employed and proven effective in sentiment analysis tasks, it remains challenging for model developers to assess their models for erroneous predictions that might exist prior to deployment. Once a model is deployed, emergent errors can be hard to identify at prediction time and impossible to trace back to their sources. To address these gaps, in this paper we propose an error detection framework for sentiment analysis based on explainable features. We perform global-level feature validation with human-in-the-loop assessment, followed by an integration of global- and local-level feature contribution analysis. Experimental results show that, given limited human-in-the-loop intervention, our method identifies erroneous model predictions on unseen data with high precision.
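To make the combination of global and local feature signals concrete, below is a minimal, hypothetical sketch of how such an error flag might be computed: per-prediction (local) feature attributions are checked against a human-validated global lexicon of feature polarities, and a prediction whose most influential features contradict the validated polarities is flagged as suspicious. All function names, thresholds, and data structures here are illustrative assumptions, not the authors' actual implementation.

```python
# Hypothetical sketch (not the paper's code): flag a sentiment prediction when
# its most influential local features conflict with human-validated global
# feature polarities.
from typing import Dict, List, Tuple

def flag_prediction(
    local_attributions: List[Tuple[str, float]],  # (token, contribution) for one input
    validated_polarity: Dict[str, int],           # human-validated: +1 positive, -1 negative
    predicted_label: int,                         # model output: +1 positive, -1 negative
    top_k: int = 5,
    conflict_threshold: float = 0.5,
) -> bool:
    """Return True if the prediction looks potentially erroneous."""
    # Consider only the k features that most influenced this prediction.
    top = sorted(local_attributions, key=lambda kv: abs(kv[1]), reverse=True)[:top_k]
    conflicts, checked = 0, 0
    for token, contrib in top:
        if token not in validated_polarity:
            continue  # no human judgment available for this feature
        checked += 1
        # A positive contribution pushes toward the predicted label,
        # a negative one pushes away from it.
        local_direction = predicted_label if contrib > 0 else -predicted_label
        if local_direction != validated_polarity[token]:
            conflicts += 1
    return checked > 0 and conflicts / checked >= conflict_threshold

# Example: the model predicts "positive" (+1), but its dominant feature is a
# word humans validated as negative, so the prediction is flagged.
attrs = [("terrible", 0.9), ("service", 0.1), ("food", 0.05)]
lexicon = {"terrible": -1, "great": +1}
print(flag_prediction(attrs, lexicon, predicted_label=+1))  # True
```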
Anthology ID: 2021.naacl-industry.22
Volume: Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies: Industry Papers
Month: June
Year: 2021
Address: Online
Editors: Young-bum Kim, Yunyao Li, Owen Rambow
Venue: NAACL
Publisher: Association for Computational Linguistics
Pages: 170–177
URL: https://aclanthology.org/2021.naacl-industry.22
DOI: 10.18653/v1/2021.naacl-industry.22
Cite (ACL): Zhe Liu, Yufan Guo, and Jalal Mahmud. 2021. When and Why a Model Fails? A Human-in-the-loop Error Detection Framework for Sentiment Analysis. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies: Industry Papers, pages 170–177, Online. Association for Computational Linguistics.
Cite (Informal): When and Why a Model Fails? A Human-in-the-loop Error Detection Framework for Sentiment Analysis (Liu et al., NAACL 2021)
PDF: https://preview.aclanthology.org/naacl-24-ws-corrections/2021.naacl-industry.22.pdf
Video: https://preview.aclanthology.org/naacl-24-ws-corrections/2021.naacl-industry.22.mp4