Exploring Large Language Models for Detecting Mental Disorders

Gleb Kuzmin, Petr Strepetov, Maksim Stankevich, Natalia Chudova, Artem Shelmanov, Ivan Smirnov
Abstract
This paper compares the effectiveness of traditional machine learning methods, encoder-based models, and large language models (LLMs) on the task of detecting depression and anxiety. Five Russian-language datasets were considered, each differing in format and in the method used to define the target pathology class. We tested AutoML models based on linguistic features, several variations of encoder-based Transformers such as BERT, and state-of-the-art LLMs as pathology classification models. The results demonstrated that LLMs outperform traditional methods, particularly on noisy and small datasets where training examples vary significantly in text length and genre. However, psycholinguistic features and encoder-based models can achieve performance comparable to language models when trained on texts from individuals with clinically confirmed depression, highlighting their potential effectiveness in targeted clinical applications.
Anthology ID:
2025.emnlp-main.1752
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
34523–34547
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1752/
Cite (ACL):
Gleb Kuzmin, Petr Strepetov, Maksim Stankevich, Natalia Chudova, Artem Shelmanov, and Ivan Smirnov. 2025. Exploring Large Language Models for Detecting Mental Disorders. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 34523–34547, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Exploring Large Language Models for Detecting Mental Disorders (Kuzmin et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1752.pdf
Checklist:
2025.emnlp-main.1752.checklist.pdf