DeepBlues@LT-EDI-ACL2022: Depression level detection modelling through domain specific BERT and short text Depression classifiers

Nawshad Farruque, Osmar Zaiane, Randy Goebel, Sudhakar Sivapalan


Abstract
We discuss a variety of approaches to building a robust Depression level detection model from longer social media posts (i.e., Reddit Depression forum posts) using a BERT model pre-trained on mental health text. We report experimental results for a strategy that selects excerpts from long texts and then fine-tunes the BERT model, in order to address the memory constraints involved in processing such texts. We show that, with domain-specific BERT, we can achieve reasonable accuracy with a fixed text size (in this case, 200 tokens) for this task. In addition, we can use short text classifiers to extract relevant excerpts from the long text and achieve slightly better accuracy, albeit at the cost of additional processing time for extracting those excerpts.
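The excerpt-selection strategy described in the abstract can be sketched as follows. This is an illustrative toy, not the authors' code: `relevance_score` is a hypothetical stand-in for the short text Depression classifier, and the 200-token budget mirrors the fixed input size mentioned above.

```python
# Toy sketch (not the authors' implementation) of the excerpt-selection idea:
# score each sentence of a long post with a short-text relevance classifier,
# keep the highest-scoring sentences until a fixed token budget is reached,
# then pass the resulting excerpt to a downstream BERT classifier.

import re

TOKEN_BUDGET = 200  # fixed input size reported in the paper

def relevance_score(sentence: str) -> float:
    """Hypothetical short-text scorer; here, a simple keyword heuristic."""
    keywords = {"sad", "hopeless", "tired", "alone", "depressed"}
    tokens = sentence.lower().split()
    return sum(t.strip(".,!?") in keywords for t in tokens) / max(len(tokens), 1)

def select_excerpt(post: str, budget: int = TOKEN_BUDGET) -> str:
    """Pick the most relevant sentences that fit within the token budget."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", post) if s.strip()]
    ranked = sorted(sentences, key=relevance_score, reverse=True)
    chosen, used = [], 0
    for sent in ranked:
        n = len(sent.split())
        if used + n <= budget:
            chosen.append(sent)
            used += n
    # restore the original sentence order so the excerpt stays readable
    chosen.sort(key=sentences.index)
    return " ".join(chosen)

post = ("I went to the store today. I feel so hopeless and tired lately. "
        "The weather was nice. Sometimes I just feel alone.")
excerpt = select_excerpt(post, budget=12)
```

In a real pipeline, `relevance_score` would be a trained short-text classifier's confidence, and the selected excerpt would be tokenized by the domain-specific BERT tokenizer rather than whitespace-split.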
Anthology ID:
2022.ltedi-1.21
Volume:
Proceedings of the Second Workshop on Language Technology for Equality, Diversity and Inclusion
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Bharathi Raja Chakravarthi, B Bharathi, John P McCrae, Manel Zarrouk, Kalika Bali, Paul Buitelaar
Venue:
LTEDI
Publisher:
Association for Computational Linguistics
Pages:
167–171
URL:
https://aclanthology.org/2022.ltedi-1.21
DOI:
10.18653/v1/2022.ltedi-1.21
Cite (ACL):
Nawshad Farruque, Osmar Zaiane, Randy Goebel, and Sudhakar Sivapalan. 2022. DeepBlues@LT-EDI-ACL2022: Depression level detection modelling through domain specific BERT and short text Depression classifiers. In Proceedings of the Second Workshop on Language Technology for Equality, Diversity and Inclusion, pages 167–171, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
DeepBlues@LT-EDI-ACL2022: Depression level detection modelling through domain specific BERT and short text Depression classifiers (Farruque et al., LTEDI 2022)
PDF:
https://preview.aclanthology.org/nschneid-patch-2/2022.ltedi-1.21.pdf
Video:
https://preview.aclanthology.org/nschneid-patch-2/2022.ltedi-1.21.mp4