Abstract
Short text classification (STC) is challenging because short texts lack contextual information and labeled data are scarce. Graph neural networks achieve state-of-the-art performance on STC since they can merge various kinds of auxiliary information via the message-passing framework. However, existing works conduct transductive learning, which requires retraining to accommodate new samples and consumes large amounts of memory. In this paper, we present SimpleSTC, which handles the inductive STC problem while leveraging only words. We construct a word graph from a large external corpus to compensate for the lack of semantic information, and learn a text graph to handle the lack of labeled data. Results show that SimpleSTC achieves state-of-the-art performance with lower memory consumption and faster inference speed.
- Anthology ID: 2022.emnlp-main.735
- Volume: Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
- Month: December
- Year: 2022
- Address: Abu Dhabi, United Arab Emirates
- Editors: Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
- Venue: EMNLP
- Publisher: Association for Computational Linguistics
- Pages: 10717–10724
- URL: https://aclanthology.org/2022.emnlp-main.735
- DOI: 10.18653/v1/2022.emnlp-main.735
- Cite (ACL): Kaixin Zheng, Yaqing Wang, Quanming Yao, and Dejing Dou. 2022. Simplified Graph Learning for Inductive Short Text Classification. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 10717–10724, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
- Cite (Informal): Simplified Graph Learning for Inductive Short Text Classification (Zheng et al., EMNLP 2022)
- PDF: https://preview.aclanthology.org/bionlp-24-ingestion/2022.emnlp-main.735.pdf