Abstract
Recently, with the help of deep learning models, significant advances have been made in many Natural Language Processing (NLP) tasks. Unfortunately, state-of-the-art models are vulnerable to noisy text. We propose a new contextual text denoising algorithm based on a ready-to-use masked language model. The proposed algorithm requires no retraining of the model and can be integrated into any NLP system without additional training on paired cleaning data. We evaluate our method under both synthetic and natural noise and show that the proposed algorithm can use context information to correct noisy text and improve the performance of several downstream tasks on noisy inputs.

- Anthology ID: D19-5537
- Volume: Proceedings of the 5th Workshop on Noisy User-generated Text (W-NUT 2019)
- Month: November
- Year: 2019
- Address: Hong Kong, China
- Venue: WNUT
- Publisher: Association for Computational Linguistics
- Pages: 286–290
- URL: https://aclanthology.org/D19-5537
- DOI: 10.18653/v1/D19-5537
- Cite (ACL): Yifu Sun and Haoming Jiang. 2019. Contextual Text Denoising with Masked Language Model. In Proceedings of the 5th Workshop on Noisy User-generated Text (W-NUT 2019), pages 286–290, Hong Kong, China. Association for Computational Linguistics.
- Cite (Informal): Contextual Text Denoising with Masked Language Model (Sun & Jiang, WNUT 2019)
- PDF: https://preview.aclanthology.org/paclic-22-ingestion/D19-5537.pdf
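The denoising idea sketched in the abstract — mask a suspect token, ask a masked language model for in-context candidates, and keep the candidate whose spelling is closest to the noisy surface form — can be illustrated with a minimal toy example. Note the assumptions: the paper uses a real pretrained masked LM (e.g. BERT), whereas here a tiny hand-built lookup table (`TOY_MLM`) stands in for the LM, and `denoise_token`, `max_dist`, and the edit-distance tie-breaking are hypothetical illustration choices, not the authors' exact procedure.

```python
def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance between two strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,       # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

# Toy stand-in for a masked LM: maps (left word, right word) context to
# candidate fillers with pseudo-probabilities. This is an assumption for
# illustration only; the paper queries a pretrained masked LM instead.
TOY_MLM = {
    ("a", "day"): {"nice": 0.6, "rainy": 0.3, "long": 0.1},
}

def denoise_token(left: str, noisy: str, right: str,
                  topk: int = 3, max_dist: int = 2) -> str:
    """Replace `noisy` with the LM candidate closest in edit distance."""
    candidates = TOY_MLM.get((left, right), {})
    ranked = sorted(candidates.items(), key=lambda kv: -kv[1])[:topk]
    best = None
    for word, _prob in ranked:
        d = levenshtein(word, noisy)
        # Keep the closest spelling; ties go to the higher-probability
        # candidate because `ranked` is probability-sorted.
        if d <= max_dist and (best is None or d < best[0]):
            best = (d, word)
    return best[1] if best else noisy

print(denoise_token("a", "nicce", "day"))  # -> nice
```

Here "nicce" is corrected to "nice": the context ("a", "day") rules in several fluent candidates, and the edit-distance filter picks the one consistent with the observed noisy spelling.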