On the Transformation of Latent Space in Fine-Tuned NLP Models

Nadir Durrani, Hassan Sajjad, Fahim Dalvi, Firoj Alam


Abstract
We study the evolution of latent space in fine-tuned NLP models. Unlike the commonly used probing framework, we opt for an unsupervised method to analyze representations. More specifically, we discover latent concepts in the representational space using hierarchical clustering. We then use an alignment function to gauge the similarity between the latent space of a pre-trained model and that of its fine-tuned version. We use traditional linguistic concepts to facilitate our understanding and also study how the model's space transforms towards task-specific information. We perform a thorough analysis, comparing pre-trained and fine-tuned versions of three models across three downstream tasks. The notable findings of our work are: i) the latent space of the higher layers evolves towards task-specific concepts, ii) the lower layers retain the generic concepts acquired in the pre-trained model, iii) some concepts in the higher layers acquire polarity towards the output class, and iv) these concepts can be used to generate adversarial triggers.
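The two analysis steps described in the abstract (discovering latent concepts by hierarchically clustering token representations, then aligning concepts between a pre-trained model and its fine-tuned version) can be sketched as follows. This is an illustrative outline only: the function names, the Ward-linkage choice, and the centroid-distance alignment are assumptions for the sketch, not the authors' exact implementation.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import cdist

def discover_concepts(reps: np.ndarray, n_concepts: int) -> np.ndarray:
    """Group token representations (one row per token) into latent
    concepts using agglomerative hierarchical clustering (Ward linkage).
    Returns a cluster label in 1..n_concepts for each token."""
    tree = linkage(reps, method="ward")
    return fcluster(tree, t=n_concepts, criterion="maxclust")

def concept_centroids(reps: np.ndarray, labels: np.ndarray) -> np.ndarray:
    """Mean representation of each concept cluster."""
    return np.array([reps[labels == c].mean(axis=0)
                     for c in sorted(set(labels))])

def align_concepts(centroids_a: np.ndarray,
                   centroids_b: np.ndarray) -> np.ndarray:
    """Match each concept in space A to its nearest concept in space B
    (Euclidean distance between centroids). A simple stand-in for the
    paper's alignment function."""
    return cdist(centroids_a, centroids_b).argmin(axis=1)

if __name__ == "__main__":
    # Synthetic "representations": two well-separated groups of tokens.
    rng = np.random.default_rng(0)
    reps = np.vstack([rng.normal(0.0, 0.1, (20, 8)),
                      rng.normal(5.0, 0.1, (20, 8))])
    labels = discover_concepts(reps, n_concepts=2)
    cents = concept_centroids(reps, labels)
    # Aligning a space with itself should give the identity matching.
    print(align_concepts(cents, cents))
```

In the paper's setting the rows of `reps` would be contextual embeddings of tokens from a given layer, extracted once from the pre-trained model and once from the fine-tuned model; the alignment then reveals which concepts survive fine-tuning and which drift towards task-specific structure.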
Anthology ID:
2022.emnlp-main.97
Volume:
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
1495–1516
URL:
https://preview.aclanthology.org/build-pipeline-with-new-library/2022.emnlp-main.97/
DOI:
10.18653/v1/2022.emnlp-main.97
Cite (ACL):
Nadir Durrani, Hassan Sajjad, Fahim Dalvi, and Firoj Alam. 2022. On the Transformation of Latent Space in Fine-Tuned NLP Models. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 1495–1516, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
On the Transformation of Latent Space in Fine-Tuned NLP Models (Durrani et al., EMNLP 2022)
PDF:
https://preview.aclanthology.org/build-pipeline-with-new-library/2022.emnlp-main.97.pdf