Abstract
Most existing approaches to disfluency detection rely heavily on human-annotated corpora, which are expensive to obtain in practice. There have been several proposals to alleviate this issue with, for instance, self-supervised learning techniques, but they still require human-annotated corpora. In this work, we explore the unsupervised learning paradigm, which can potentially work with unlabeled text corpora that are cheaper and easier to obtain. Our model builds upon the recent work on Noisy Student Training, a semi-supervised learning approach that extends the idea of self-training. Experimental results on the commonly used English Switchboard test set show that our approach achieves competitive performance compared to the previous state-of-the-art supervised systems using contextualized word embeddings (e.g., BERT and ELECTRA).
- Anthology ID:
- 2020.emnlp-main.142
- Volume:
- Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
- Month:
- November
- Year:
- 2020
- Address:
- Online
- Editors:
- Bonnie Webber, Trevor Cohn, Yulan He, Yang Liu
- Venue:
- EMNLP
- Publisher:
- Association for Computational Linguistics
- Pages:
- 1813–1822
- URL:
- https://aclanthology.org/2020.emnlp-main.142
- DOI:
- 10.18653/v1/2020.emnlp-main.142
- Cite (ACL):
- Shaolei Wang, Zhongyuan Wang, Wanxiang Che, and Ting Liu. 2020. Combining Self-Training and Self-Supervised Learning for Unsupervised Disfluency Detection. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 1813–1822, Online. Association for Computational Linguistics.
- Cite (Informal):
- Combining Self-Training and Self-Supervised Learning for Unsupervised Disfluency Detection (Wang et al., EMNLP 2020)
- PDF:
- https://preview.aclanthology.org/nschneid-patch-2/2020.emnlp-main.142.pdf
- Code
- scir-zywang/self-training-self-supervised-disfluency
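The abstract describes a Noisy Student-style self-training loop: a teacher produces pseudo-labels on unlabeled text, and a student is trained on a noised version of that input with the pseudo-labels held fixed. The following is a minimal toy sketch of that loop for token-level disfluency tagging; it is not the paper's implementation (which uses contextualized encoders such as BERT/ELECTRA), and all names here (`teacher_label`, `add_input_noise`, `CountStudent`) are invented for illustration.

```python
import random
from collections import Counter

def teacher_label(tokens):
    """Toy 'teacher': flag a token as disfluent (1) when it repeats its
    predecessor — a crude stand-in for a real teacher model's pseudo-labels."""
    return [1 if i > 0 and t == tokens[i - 1] else 0 for i, t in enumerate(tokens)]

def add_input_noise(tokens, p, rng):
    """Noisy Student idea: corrupt the student's INPUT (here, token masking)
    while the teacher's pseudo-labels stay fixed."""
    return [t if rng.random() > p else "<unk>" for t in tokens]

class CountStudent:
    """Trivial 'student': learns P(label | token repeats its predecessor)
    by counting — a placeholder for a real sequence-labeling model."""
    def __init__(self):
        self.counts = Counter()

    def train(self, tokens, labels):
        for i, (tok, y) in enumerate(zip(tokens, labels)):
            feat = i > 0 and tok == tokens[i - 1]
            self.counts[(feat, y)] += 1

    def predict(self, tokens):
        preds = []
        for i, tok in enumerate(tokens):
            feat = i > 0 and tok == tokens[i - 1]
            # Majority vote per feature value; default to fluent (0) on ties.
            preds.append(1 if self.counts[(feat, 1)] > self.counts[(feat, 0)] else 0)
        return preds

# Unlabeled "speech transcripts" — no human annotation involved.
unlabeled = [
    "i i want to go".split(),
    "the the meeting is is today".split(),
    "we should uh leave now".split(),
]

rng = random.Random(0)
student = CountStudent()
for sent in unlabeled:
    pseudo = teacher_label(sent)                 # 1) teacher pseudo-labels unlabeled text
    noisy = add_input_noise(sent, 0.1, rng)      # 2) noise the student's input
    student.train(noisy, pseudo)                 # 3) train student on (noisy input, pseudo-labels)

# In the full recipe the trained student becomes the next teacher and the
# loop repeats; here we just run one round and query the student.
print(student.predict("no no that is fine".split()))
```

In the real Noisy Student recipe this teacher/student handoff is iterated, with the student growing at least as large as the teacher; this sketch runs a single round to show the data flow only.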