JsonTuning: Towards Generalizable, Robust, and Controllable Instruction Tuning

Chang Gao, Wenxuan Zhang, Guizhen Chen, Wai Lam


Abstract
Instruction tuning is vital for enhancing the performance of large language models (LLMs), but existing text-to-text methods, referred to as TextTuning, struggle with issues such as generalization, robustness, and controllability due to their lack of explicit task structures. We introduce JsonTuning, a structure-to-structure approach that uses JSON structures to represent tasks. This method improves generalization by clarifying task elements and their relations, boosts robustness by minimizing ambiguity, and enhances controllability by allowing precise control over outputs. We conduct an extensive comparative analysis between JsonTuning and TextTuning using various language models and benchmarks. Our findings reveal that JsonTuning consistently surpasses TextTuning in terms of performance, robustness, and controllability across different scenarios. By overcoming the limitations of TextTuning, JsonTuning demonstrates significant potential for developing more effective and reliable LLMs capable of handling diverse scenarios.
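To make the structure-to-structure idea concrete, the following minimal sketch shows how a single task instance (here, named entity recognition) might be cast as paired JSON input and output structures. The field names (instruction, input, output_control, etc.) and the label set are illustrative assumptions for exposition, not the paper's exact schema.

import json

# Hypothetical JSON-structured task instance (field names are
# illustrative, not necessarily the schema used in the paper).
json_input = {
    "instruction": "Extract all named entities from the text.",
    "input": {
        "text": "Barack Obama was born in Hawaii."
    },
    # Explicit output control: states which labels are allowed and
    # what structure the output must follow, which is one way the
    # approach gains controllability over free-form text outputs.
    "output_control": {
        "entities": "a list of {mention, type} objects",
        "types": ["PER", "LOC", "ORG"]
    }
}

# The model is trained to emit a JSON structure rather than free text,
# so task elements and their relations stay explicit and unambiguous.
expected_output = {
    "entities": [
        {"mention": "Barack Obama", "type": "PER"},
        {"mention": "Hawaii", "type": "LOC"}
    ]
}

# Serialize for use as prompt and training target.
prompt = json.dumps(json_input, ensure_ascii=False)
target = json.dumps(expected_output, ensure_ascii=False)
print(prompt)
print(target)

Under this framing, a plain text-to-text instance would flatten the same information into an instruction string and a free-form answer, leaving the label space and output format implicit; the JSON encoding makes both explicit.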
Anthology ID: 2025.findings-acl.1232
Volume: Findings of the Association for Computational Linguistics: ACL 2025
Month: July
Year: 2025
Address: Vienna, Austria
Editors: Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 24029–24055
URL: https://preview.aclanthology.org/mtsummit-25-ingestion/2025.findings-acl.1232/
DOI: 10.18653/v1/2025.findings-acl.1232
Cite (ACL):
Chang Gao, Wenxuan Zhang, Guizhen Chen, and Wai Lam. 2025. JsonTuning: Towards Generalizable, Robust, and Controllable Instruction Tuning. In Findings of the Association for Computational Linguistics: ACL 2025, pages 24029–24055, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
JsonTuning: Towards Generalizable, Robust, and Controllable Instruction Tuning (Gao et al., Findings 2025)
PDF: https://preview.aclanthology.org/mtsummit-25-ingestion/2025.findings-acl.1232.pdf