Comparing a BERT Classifier and a GPT classifier for Detecting Connective Language Across Multiple Social Media
Josephine Lukito, Bin Chen, Gina M. Masullo, Natalie Jomini Stroud
Abstract
This study presents an approach for detecting connective language—defined as language that facilitates engagement, understanding, and conversation—from social media discussions. We developed and evaluated two types of classifiers: BERT and GPT-3.5 turbo. Our results demonstrate that the BERT classifier significantly outperforms GPT-3.5 turbo in detecting connective language. Furthermore, our analysis confirms that connective language is distinct from related concepts measuring discourse qualities, such as politeness and toxicity. We also explore the potential of BERT-based classifiers for platform-agnostic tools. This research advances our understanding of the linguistic dimensions of online communication and proposes practical tools for detecting connective language across diverse digital environments.
- Anthology ID: 2024.emnlp-main.1067
- Volume: Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
- Month: November
- Year: 2024
- Address: Miami, Florida, USA
- Editors: Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
- Venue: EMNLP
- Publisher: Association for Computational Linguistics
- Pages: 19140–19153
- URL: https://preview.aclanthology.org/fix-sig-urls/2024.emnlp-main.1067/
- DOI: 10.18653/v1/2024.emnlp-main.1067
- Cite (ACL): Josephine Lukito, Bin Chen, Gina M. Masullo, and Natalie Jomini Stroud. 2024. Comparing a BERT Classifier and a GPT classifier for Detecting Connective Language Across Multiple Social Media. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 19140–19153, Miami, Florida, USA. Association for Computational Linguistics.
- Cite (Informal): Comparing a BERT Classifier and a GPT classifier for Detecting Connective Language Across Multiple Social Media (Lukito et al., EMNLP 2024)
- PDF: https://preview.aclanthology.org/fix-sig-urls/2024.emnlp-main.1067.pdf