2025
ReGraph: Learning to Reformulate Graph Encodings with Large Language Models
Amir Hadifar | Christopher Ochs | Arjan Van Ewijk
Proceedings of the 14th International Joint Conference on Natural Language Processing and the 4th Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics
Large language models can rephrase and restructure natural language effectively, but their potential for reformulating graph encodings remains underexplored despite the significant impact of encoding choices on performance. In this work, we introduce ReGraph, a reinforcement learning-based approach that guides language models to reformulate graph encodings for improved task alignment. We demonstrate that reformulating graph encodings enhances reasoning and yields consistent performance gains on graph question answering tasks. Our results show that language models often prefer specific graph encodings, even if they are suboptimal for the task at hand.
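To make the notion of a "graph encoding" concrete, the sketch below (not from the paper; all names and formats are illustrative assumptions) verbalizes the same small graph in two different textual formats that could be fed to a language model. The reformulation step that ReGraph learns can be thought of as choosing or rewriting toward the encoding that best aligns with the question.

```python
# Minimal sketch of textual graph encodings for an LLM prompt.
# NOTE: the formats and function names here are assumptions for illustration,
# not ReGraph's actual API or the encodings used in the paper.

edges = [("A", "B"), ("B", "C"), ("A", "C"), ("C", "D")]

def encode_edge_list(edges):
    """Verbalize the graph as a flat edge list."""
    return "Edges: " + ", ".join(f"{u} -> {v}" for u, v in edges)

def encode_adjacency(edges):
    """Verbalize the graph as an adjacency list, one node per line."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, []).append(v)
    lines = [f"{u} is connected to {', '.join(vs)}." for u, vs in sorted(adj.items())]
    return "\n".join(lines)

question = "Is there a path from A to D?"

# Two candidate prompts for the same graph question; a learned policy could
# select or rewrite between such encodings depending on the task.
prompt_edge_list = encode_edge_list(edges) + "\n" + question
prompt_adjacency = encode_adjacency(edges) + "\n" + question
print(prompt_edge_list, prompt_adjacency, sep="\n---\n")
```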