Fill In The Gaps: Model Calibration and Generalization with Synthetic Data

Yang Ba, Michelle V Mancenido, Rong Pan


Abstract
As machine learning models continue to advance rapidly, calibrating their performance has become a major concern prior to practical and widespread deployment. Existing calibration methods often degrade model accuracy because validation data lacks diversity, which reduces generalizability. To address this, we propose a calibration method that incorporates synthetic data without compromising accuracy. We derive an expected calibration error (ECE) bound under the Probably Approximately Correct (PAC) learning framework. Large language models (LLMs), known for their ability to mimic real data and generate text with mixed class labels, are used as a synthetic data generation strategy to lower the ECE bound and improve model accuracy on real test data. We also propose data generation mechanisms for efficient calibration. Testing our method on four natural language processing tasks, we observed an average increase of up to 34% in accuracy and a 33% decrease in ECE.
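The abstract's central metric, expected calibration error (ECE), is not defined on this page; as background, a minimal sketch of the standard binned ECE estimator (this is the common formulation, not code from the paper) partitions predictions into confidence bins and averages the gap between accuracy and confidence, weighted by bin size:

```python
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=10):
    """Binned ECE: sum over bins of (bin weight) * |bin accuracy - bin confidence|.

    confidences: predicted probability of the predicted class, in (0, 1].
    correct: 1 if the prediction was right, else 0.
    """
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    n = len(confidences)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        # Half-open bins (lo, hi]; a confidence of exactly 0 falls in no bin.
        mask = (confidences > lo) & (confidences <= hi)
        if not mask.any():
            continue
        acc = correct[mask].mean()   # empirical accuracy in this bin
        conf = confidences[mask].mean()  # mean stated confidence in this bin
        ece += (mask.sum() / n) * abs(acc - conf)
    return ece
```

A perfectly calibrated model (e.g. every prediction made at confidence 1.0 is correct) yields an ECE of 0; a model that is 95% confident but only 50% accurate contributes a large gap. Lowering this quantity on held-out real data, without hurting accuracy, is the goal the abstract describes.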
Anthology ID:
2024.emnlp-main.955
Volume:
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
17211–17225
URL:
https://preview.aclanthology.org/fix-sig-urls/2024.emnlp-main.955/
DOI:
10.18653/v1/2024.emnlp-main.955
Cite (ACL):
Yang Ba, Michelle V Mancenido, and Rong Pan. 2024. Fill In The Gaps: Model Calibration and Generalization with Synthetic Data. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 17211–17225, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
Fill In The Gaps: Model Calibration and Generalization with Synthetic Data (Ba et al., EMNLP 2024)
PDF:
https://preview.aclanthology.org/fix-sig-urls/2024.emnlp-main.955.pdf