Language Models, Graph Searching, and Supervision Adulteration: When More Supervision is Less and How to Make More More

Arvid Frydenlund


Abstract
This work concerns the path-star task, a minimal example of searching over a graph. The graph, G, is star-shaped with D arms radiating from a start node, s. A language model (LM) is given G, s, and a target node, t, which ends one of the arms, and is tasked with generating the arm containing t. The minimal nature of this task means only a single choice needs to be made: which of the D arms contains t? Decoder-only LMs fail to solve this elementary task above 1/D chance due to a learned shortcut that absorbs training supervision. We show how this pathology is caused by excess supervision and present a series of solutions demonstrating that the task is solvable via decoder-only LMs. We find that the task’s minimal nature causes its difficulty, as it prevents task decomposition. Our solutions provide insight into the pathology and its implications for LMs trained via next-token prediction.
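As a concrete illustration of the task described above, the following is a minimal sketch of how one path-star instance might be constructed. The function name, edge-list encoding, and parameters are illustrative assumptions, not the paper's actual data format.

```python
import random

def make_path_star(num_arms, arm_length, vocab_size, seed=None):
    """Build one illustrative path-star instance: num_arms (D) arms of
    fixed length radiating from a shared start node s, with a target
    node t ending one of the arms."""
    rng = random.Random(seed)
    start = 0  # node 0 is reserved as the start node s
    # Sample distinct labels for all non-start nodes.
    nodes = rng.sample(range(1, vocab_size), num_arms * arm_length)
    arms = [nodes[i * arm_length:(i + 1) * arm_length]
            for i in range(num_arms)]
    # Encode G as an edge list: s -> first node of each arm,
    # then consecutive nodes along each arm.
    edges = []
    for arm in arms:
        edges.append((start, arm[0]))
        edges.extend(zip(arm, arm[1:]))
    rng.shuffle(edges)  # present the graph as an unordered edge list
    target_arm = rng.randrange(num_arms)
    target = arms[target_arm][-1]  # t is the leaf ending one arm
    # The gold output is the full path from s to t along that arm.
    gold_path = [start] + arms[target_arm]
    return edges, start, target, gold_path
```

The single choice the LM must make is visible here: once the first node of the correct arm is chosen, the rest of `gold_path` is determined by following edges.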
Anthology ID:
2025.acl-long.1409
Volume:
Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
29011–29059
URL:
https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.1409/
Cite (ACL):
Arvid Frydenlund. 2025. Language Models, Graph Searching, and Supervision Adulteration: When More Supervision is Less and How to Make More More. In Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 29011–29059, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
Language Models, Graph Searching, and Supervision Adulteration: When More Supervision is Less and How to Make More More (Frydenlund, ACL 2025)
PDF:
https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.1409.pdf