SOI Matters: Analyzing Multi-Setting Training Dynamics in Pretrained Language Models via Subsets of Interest
Shayan Vassef, Amirhossein Dabiriaghdam, Mohammadreza Bakhtiari, Yadollah Yaghoobzadeh
Abstract
This work investigates the impact of multi-task, multi-lingual, and multi-source learning approaches on the robustness and performance of pretrained language models. To enhance this analysis, we introduce Subsets of Interest (SOI), a novel categorization framework that identifies six distinct learning behavior patterns during training, including forgettable examples, unlearned examples, and always correct examples. Through SOI transition heatmaps and dataset cartography visualization, we analyze how examples shift between these categories when transitioning from single-setting to multi-setting configurations. We perform comprehensive experiments across three parallel comparisons: multi-task vs. single-task learning using English tasks (entailment, paraphrase, sentiment), multi-source vs. single-source learning using sentiment analysis datasets, and multi-lingual vs. single-lingual learning using intent classification in French, English, and Persian. Our results demonstrate that multi-source learning consistently improves out-of-distribution performance by up to 7%, while multi-task learning shows mixed results with notable gains in similar task combinations. We further introduce a two-stage fine-tuning approach where the second stage leverages SOI-based subset selection to achieve additional performance improvements. These findings provide new insights into training dynamics and offer practical approaches for optimizing multi-setting language model performance.
- Anthology ID:
- 2025.mrl-main.21
- Volume:
- Proceedings of the 5th Workshop on Multilingual Representation Learning (MRL 2025)
- Month:
- November
- Year:
- 2025
- Address:
- Suzhou, China
- Editors:
- David Ifeoluwa Adelani, Catherine Arnett, Duygu Ataman, Tyler A. Chang, Hila Gonen, Rahul Raja, Fabian Schmidt, David Stap, Jiayi Wang
- Venues:
- MRL | WS
- Publisher:
- Association for Computational Linguistics
- Pages:
- 322–335
- URL:
- https://preview.aclanthology.org/ingest-emnlp/2025.mrl-main.21/
- Cite (ACL):
- Shayan Vassef, Amirhossein Dabiriaghdam, Mohammadreza Bakhtiari, and Yadollah Yaghoobzadeh. 2025. SOI Matters: Analyzing Multi-Setting Training Dynamics in Pretrained Language Models via Subsets of Interest. In Proceedings of the 5th Workshop on Multilingual Representation Learning (MRL 2025), pages 322–335, Suzhou, China. Association for Computational Linguistics.
- Cite (Informal):
- SOI Matters: Analyzing Multi-Setting Training Dynamics in Pretrained Language Models via Subsets of Interest (Vassef et al., MRL 2025)
- PDF:
- https://preview.aclanthology.org/ingest-emnlp/2025.mrl-main.21.pdf
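The SOI categorization described in the abstract labels each training example by its learning behavior across epochs. As a rough illustration only: the paper defines six categories, but the abstract names three (forgettable, unlearned, always correct), so the sketch below assigns those three plus a generic "learned" label from per-epoch correctness records. The function name `soi_label` and the exact category definitions are assumptions for illustration, not the authors' implementation.

```python
def soi_label(correct_per_epoch):
    """Assign a coarse SOI-style label from a list of booleans,
    one per training epoch (True = model predicted this example correctly).

    Category definitions here are illustrative guesses, not the paper's.
    """
    if all(correct_per_epoch):
        return "always_correct"          # correct at every epoch
    if not any(correct_per_epoch):
        return "unlearned"               # never predicted correctly
    first_correct = correct_per_epoch.index(True)
    if not all(correct_per_epoch[first_correct:]):
        return "forgettable"             # correct at some epoch, wrong later
    return "learned"                     # wrong early, then retained

# Example: learned in the first epochs, then forgotten
print(soi_label([True, True, False]))    # → forgettable
```

Per-example labels like these could then drive the kind of SOI-based subset selection the abstract mentions for the second fine-tuning stage, e.g. by oversampling forgettable examples.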