On Training Data Influence of GPT Models

Yekun Chai, Qingyi Liu, Shuohuan Wang, Yu Sun, Qiwei Peng, Hua Wu


Abstract
Amid the rapid advances in generative language models, how training data shapes the performance of GPT models remains underexplored. This paper presents GPTfluence, a novel approach that uses a featurized simulation to assess the impact of training examples on the training dynamics of GPT models. Our approach traces the influence of individual training instances on performance trajectories, such as loss and other key metrics, at targeted test points, and it enables a comprehensive comparison with existing methods across training scenarios for GPT models ranging from 14 million to 2.8 billion parameters on a range of downstream tasks. Unlike earlier methods that struggle to generalize to new data, GPTfluence introduces a parameterized simulation of training dynamics and demonstrates robust generalization to unseen training data. This adaptability holds in both fine-tuning and instruction-tuning scenarios, spanning tasks in natural language understanding and generation. We make our code and data publicly available at https://github.com/ernie-research/gptfluence.
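
To make the idea of a "featurized simulation of training dynamics" concrete, the sketch below illustrates one plausible reading of such a simulator: the loss of a target test point is rolled forward step by step, with multiplicative and additive update factors predicted from features of the training example consumed at each step and the test point. This is a minimal, hypothetical sketch, not the paper's actual implementation; the function name, bilinear parameterization, and feature shapes are assumptions made for illustration.

```python
import numpy as np

# Hypothetical Simfluence-style loss-trajectory simulator with a featurized
# parameterization (illustrative only; not GPTfluence's actual code).
#
# Modeled update at step t for a test point z':
#     L_t(z') = alpha(c_t, z') * L_{t-1}(z') + beta(c_t, z')
# where alpha and beta are predicted from features of the consumed training
# example c_t and the test point, so the simulator can score training
# examples it never saw while being fitted.

def simulate_loss_trajectory(train_feats, test_feat, init_loss, w_alpha, w_beta):
    """Roll the simulated test-point loss forward over a training-example sequence.

    train_feats: (T, d) feature vectors of examples consumed at steps 1..T
    test_feat:   (d,)   feature vector of the target test point
    init_loss:   float  loss of the test point before training
    w_alpha, w_beta: (d, d) learned bilinear weights (assumed parameterization)
    """
    loss = init_loss
    trajectory = [loss]
    for x in train_feats:
        # Featurized multiplicative and additive factors for this step.
        alpha = 1.0 + np.tanh(x @ w_alpha @ test_feat)  # near 1 => small change
        beta = x @ w_beta @ test_feat
        loss = alpha * loss + beta
        trajectory.append(loss)
    return np.array(trajectory)

# Toy usage: random vectors stand in for encoder features of examples.
rng = np.random.default_rng(0)
d, T = 8, 5
traj = simulate_loss_trajectory(
    train_feats=rng.normal(size=(T, d)) * 0.1,
    test_feat=rng.normal(size=d) * 0.1,
    init_loss=2.3,
    w_alpha=rng.normal(size=(d, d)) * 0.1,
    w_beta=rng.normal(size=(d, d)) * 0.1,
)
print(traj)  # simulated per-step loss of the test point
```

The influence of a single training example can then be read off as the change in the simulated trajectory when that example is included in, or removed from, the training sequence.
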
Anthology ID:
2024.emnlp-main.183
Volume:
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
3126–3150
URL:
https://aclanthology.org/2024.emnlp-main.183
DOI:
10.18653/v1/2024.emnlp-main.183
Cite (ACL):
Yekun Chai, Qingyi Liu, Shuohuan Wang, Yu Sun, Qiwei Peng, and Hua Wu. 2024. On Training Data Influence of GPT Models. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 3126–3150, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
On Training Data Influence of GPT Models (Chai et al., EMNLP 2024)
PDF:
https://preview.aclanthology.org/landing_page/2024.emnlp-main.183.pdf