Conformal Nucleus Sampling

Shauli Ravfogel, Yoav Goldberg, Jacob Goldberger


Abstract
Language models generate text by successively sampling the next word. A decoding procedure based on nucleus (top-p) sampling chooses from the smallest possible set of words whose cumulative probability exceeds the probability p. In this work, we assess whether a top-p set is indeed aligned with its probabilistic meaning in various linguistic contexts. We employ conformal prediction, a calibration procedure that focuses on the construction of minimal prediction sets according to a desired confidence level, to calibrate the parameter p as a function of the entropy of the next-word distribution. We find that OPT models are overconfident, and that calibration shows a moderate inverse scaling with model size.
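
To make the two ideas in the abstract concrete, the following Python sketch (not the authors' released code; the function names and the single-bin calibration are illustrative assumptions) constructs a top-p set and picks p via a split-conformal quantile over a held-out calibration set. The paper performs this calibration separately per entropy bin of the next-word distribution; the sketch shows the single-bin case.

import numpy as np

def top_p_set(probs, p):
    # Smallest set of words whose cumulative probability exceeds p.
    order = np.argsort(probs)[::-1]          # words sorted by descending probability
    cum = np.cumsum(probs[order])
    k = int(np.searchsorted(cum, p)) + 1     # shortest prefix with cumulative mass >= p
    return order[:k]

def calibrate_p(cal_probs, cal_labels, confidence=0.9):
    # Split-conformal calibration (our sketch): choose p so that, on held-out
    # pairs (next-word distribution, true next word), the top-p set covers the
    # true word with the desired confidence.
    scores = []
    for probs, y in zip(cal_probs, cal_labels):
        order = np.argsort(probs)[::-1]
        rank = int(np.where(order == y)[0][0])
        scores.append(np.cumsum(probs[order])[rank])  # mass needed to cover y
    n = len(scores)
    level = min(1.0, np.ceil((n + 1) * confidence) / n)  # finite-sample correction
    return float(np.quantile(scores, level))

With a calibrated p in hand, decoding proceeds as in standard nucleus sampling: renormalize the probabilities over top_p_set(probs, p) and sample from that set.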
Anthology ID:
2023.findings-acl.3
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
27–34
URL:
https://aclanthology.org/2023.findings-acl.3
DOI:
10.18653/v1/2023.findings-acl.3
Cite (ACL):
Shauli Ravfogel, Yoav Goldberg, and Jacob Goldberger. 2023. Conformal Nucleus Sampling. In Findings of the Association for Computational Linguistics: ACL 2023, pages 27–34, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Conformal Nucleus Sampling (Ravfogel et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.3.pdf