In-Context Learning of Soft Nearest Neighbor Classifiers for Intelligible Tabular Machine Learning

Mykhailo Koshil, Matthias Feurer, Katharina Eggensperger


Abstract
With in-context learning foundation models like TabPFN excelling on small supervised tabular learning tasks, it has been argued that “boosted trees are not the best default choice when working with data in tables”. However, such foundation models are inherently black-box models that do not provide interpretable predictions. We introduce a novel learning task to train ICL models to act as a nearest neighbor algorithm, enabling intelligible inference without an empirical drop in performance.
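For readers unfamiliar with the term, a soft nearest neighbor classifier typically forms class probabilities as a softmax-weighted vote over the labeled context points, with closer points receiving larger weights. The NumPy sketch below illustrates that general idea only; it is not the formulation or the in-context-learning training setup used in the paper, and the function name soft_nn_predict_proba and the temperature parameter are purely illustrative assumptions.

import numpy as np

def soft_nn_predict_proba(X_context, y_context, X_query, n_classes, temperature=1.0):
    """Illustrative soft nearest neighbor classifier (not the paper's method):
    class probabilities are a softmax over negative distances to the context points."""
    # Pairwise Euclidean distances between query and context points: (n_query, n_context)
    dists = np.linalg.norm(X_query[:, None, :] - X_context[None, :, :], axis=-1)
    # Softmax over negative distances; closer context points get larger weights
    logits = -dists / temperature
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    weights = np.exp(logits)
    weights /= weights.sum(axis=1, keepdims=True)
    # Accumulate the weights per class to obtain predicted probabilities
    onehot = np.eye(n_classes)[y_context]        # (n_context, n_classes)
    return weights @ onehot                      # (n_query, n_classes)

# Toy usage with random data (hypothetical, for illustration only)
rng = np.random.default_rng(0)
X_ctx, y_ctx = rng.normal(size=(20, 4)), rng.integers(0, 2, size=20)
X_qry = rng.normal(size=(5, 4))
proba = soft_nn_predict_proba(X_ctx, y_ctx, X_qry, n_classes=2)
print(proba.argmax(axis=1))

Because each predicted probability is an explicit weighted vote over identifiable context rows, the contribution of every neighbor to a prediction can be inspected directly, which is the sense in which such inference is intelligible.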
Anthology ID: 2025.trl-workshop.15
Volume: Proceedings of the 4th Table Representation Learning Workshop
Month: July
Year: 2025
Address: Vienna, Austria
Editors: Shuaichen Chang, Madelon Hulsebos, Qian Liu, Wenhu Chen, Huan Sun
Venues: TRL | WS
Publisher: Association for Computational Linguistics
Pages: 182–191
URL: https://preview.aclanthology.org/acl25-workshop-ingestion/2025.trl-workshop.15/
Cite (ACL): Mykhailo Koshil, Matthias Feurer, and Katharina Eggensperger. 2025. In-Context Learning of Soft Nearest Neighbor Classifiers for Intelligible Tabular Machine Learning. In Proceedings of the 4th Table Representation Learning Workshop, pages 182–191, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal): In-Context Learning of Soft Nearest Neighbor Classifiers for Intelligible Tabular Machine Learning (Koshil et al., TRL 2025)
PDF: https://preview.aclanthology.org/acl25-workshop-ingestion/2025.trl-workshop.15.pdf