Team ISM at CLPsych 2025: Capturing Mental Health Dynamics from Social Media Timelines using A Pretrained Large Language Model with In-Context Learning

Vu Tran, Tomoko Matsui

Abstract
We tackle the task using a pretrained large language model (LLM) with in-context learning, guiding the LLM through template-based instructions. To improve generation quality, we employ a two-step procedure: sampling and selection. In the sampling step, we randomly sample a subset of the provided training data to serve as the context for LLM prompting. In the selection step, we map the LLM-generated outputs into a vector space and apply Gaussian kernel density estimation to select the most likely output. The results show that the approach achieves a certain degree of performance and that there is still room for improvement.
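As a concrete illustration, the sketch below reconstructs the two-step procedure described in the abstract. It is a minimal interpretation under stated assumptions, not the authors' implementation: the sentence-transformers encoder, the number of in-context examples (k = 8), and the KDE bandwidth are illustrative choices.

# Minimal sketch of the sampling-and-selection procedure from the abstract.
# Assumptions (not from the paper): the choice of embedder, k = 8
# in-context examples, and the Gaussian KDE bandwidth.
import random

import numpy as np
from sentence_transformers import SentenceTransformer
from sklearn.neighbors import KernelDensity


def sample_context(train_data: list[str], k: int = 8) -> list[str]:
    """Sampling step: draw a random subset of the training data
    to fill the in-context portion of the LLM prompt."""
    return random.sample(train_data, k)


def select_most_likely(outputs: list[str]) -> str:
    """Selection step: embed the candidate LLM outputs, fit a Gaussian
    kernel density estimate over the embeddings, and return the
    candidate with the highest estimated density."""
    embedder = SentenceTransformer("all-MiniLM-L6-v2")  # assumed embedder
    vectors = embedder.encode(outputs)  # shape: (n_candidates, dim)

    kde = KernelDensity(kernel="gaussian", bandwidth=1.0).fit(vectors)
    log_density = kde.score_samples(vectors)  # log-density per candidate
    return outputs[int(np.argmax(log_density))]

In use, one would call sample_context once per prompt to assemble the in-context examples, generate several candidate outputs from the LLM, and pass them to select_most_likely; the KDE favors the candidate lying in the densest region of the embedding space, i.e. the one most consistent with the other generations.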
Anthology ID:
2025.clpsych-1.25
Volume:
Proceedings of the 10th Workshop on Computational Linguistics and Clinical Psychology (CLPsych 2025)
Month:
May
Year:
2025
Address:
Albuquerque, New Mexico
Editors:
Ayah Zirikly, Andrew Yates, Bart Desmet, Molly Ireland, Steven Bedrick, Sean MacAvaney, Kfir Bar, Yaakov Ophir
Venues:
CLPsych | WS
Publisher:
Association for Computational Linguistics
Pages:
287–291
URL:
https://preview.aclanthology.org/corrections-2025-06/2025.clpsych-1.25/
DOI:
10.18653/v1/2025.clpsych-1.25
Cite (ACL):
Vu Tran and Tomoko Matsui. 2025. Team ISM at CLPsych 2025: Capturing Mental Health Dynamics from Social Media Timelines using A Pretrained Large Language Model with In-Context Learning. In Proceedings of the 10th Workshop on Computational Linguistics and Clinical Psychology (CLPsych 2025), pages 287–291, Albuquerque, New Mexico. Association for Computational Linguistics.
Cite (Informal):
Team ISM at CLPsych 2025: Capturing Mental Health Dynamics from Social Media Timelines using A Pretrained Large Language Model with In-Context Learning (Tran & Matsui, CLPsych 2025)
PDF:
https://preview.aclanthology.org/corrections-2025-06/2025.clpsych-1.25.pdf