Abstract
Text preprocessing is often the first step in the pipeline of a Natural Language Processing (NLP) system, with a potential impact on its final performance. Despite its importance, text preprocessing has not received much attention in the deep learning literature. In this paper we investigate the impact of simple text preprocessing decisions (particularly tokenizing, lemmatizing, lowercasing and multiword grouping) on the performance of a standard neural text classifier. We perform an extensive evaluation on standard benchmarks from text categorization and sentiment analysis. While our experiments show that a simple tokenization of input text is generally adequate, they also highlight significant degrees of variability across preprocessing techniques. This reveals the importance of paying attention to this usually overlooked step in the pipeline, particularly when comparing different models. Finally, our evaluation provides insights into the best preprocessing practices for training word embeddings.
- Anthology ID:
- W18-5406
- Volume:
- Proceedings of the 2018 EMNLP Workshop BlackboxNLP: Analyzing and Interpreting Neural Networks for NLP
- Month:
- November
- Year:
- 2018
- Address:
- Brussels, Belgium
- Editors:
- Tal Linzen, Grzegorz Chrupała, Afra Alishahi
- Venue:
- EMNLP
- Publisher:
- Association for Computational Linguistics
- Pages:
- 40–46
- URL:
- https://aclanthology.org/W18-5406
- DOI:
- 10.18653/v1/W18-5406
- Cite (ACL):
- Jose Camacho-Collados and Mohammad Taher Pilehvar. 2018. On the Role of Text Preprocessing in Neural Network Architectures: An Evaluation Study on Text Categorization and Sentiment Analysis. In Proceedings of the 2018 EMNLP Workshop BlackboxNLP: Analyzing and Interpreting Neural Networks for NLP, pages 40–46, Brussels, Belgium. Association for Computational Linguistics.
- Cite (Informal):
- On the Role of Text Preprocessing in Neural Network Architectures: An Evaluation Study on Text Categorization and Sentiment Analysis (Camacho-Collados & Pilehvar, EMNLP 2018)
- PDF:
- https://aclanthology.org/W18-5406.pdf
- Code:
- pedrada88/preproc-textclassification + additional community code
- Data:
- IMDb Movie Reviews, Ohsumed, SST, SST-2
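As a rough illustration of the four preprocessing variants compared in the abstract (tokenizing, lowercasing, lemmatizing, and multiword grouping), the sketch below implements each one with spaCy. The library choice, the `en_core_web_sm` model, and the noun-chunk-based grouping are illustrative assumptions rather than the authors' actual pipeline; the linked repository contains the original code.

```python
# Minimal sketch of four text preprocessing variants (assumed tooling: spaCy).
# The noun-chunk merging is only a crude stand-in for multiword grouping.
import spacy

nlp = spacy.load("en_core_web_sm")  # small English model; assumed to be installed


def tokenize(text):
    """Plain tokenization: return surface tokens as-is."""
    return [tok.text for tok in nlp(text)]


def lowercase(text):
    """Tokenization + lowercasing."""
    return [tok.lower_ for tok in nlp(text)]


def lemmatize(text):
    """Tokenization + lemmatization."""
    return [tok.lemma_ for tok in nlp(text)]


def group_multiwords(text):
    """Tokenization + crude multiword grouping: merge noun chunks into single
    tokens and join their words with underscores."""
    doc = nlp(text)
    with doc.retokenize() as retok:
        for chunk in doc.noun_chunks:
            retok.merge(chunk)
    return [tok.text.replace(" ", "_") for tok in doc]


if __name__ == "__main__":
    sample = "The Dark Knight is one of the best superhero movies ever made."
    for variant in (tokenize, lowercase, lemmatize, group_multiwords):
        print(variant.__name__, "->", variant(sample))
```

Each variant produces a different token vocabulary for the same input, which is precisely the kind of design choice whose downstream effect on a neural text classifier the paper evaluates.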