Instruction Tuning with Human Curriculum

Bruce W Lee, Hyunsoo Cho, Kang Min Yoo


Abstract
In this work, we (1) introduce Curriculum Instruction Tuning, (2) explore the potential advantages of employing diverse curriculum strategies, and (3) delineate a synthetic instruction-response generation framework that complements our theoretical approach. Distinct from existing instruction tuning datasets, our generation pipeline is systematically structured to emulate the sequential and orderly characteristics of human learning. Additionally, we describe a methodology for generating instruction-response datasets that extensively span the various stages of human education, from middle school through the graduate level, utilizing educational subject catalogs. Before training, we meticulously organize the instruction data so that questions escalate in difficulty with respect to (A) the subject matter and (B) the intricacy of the instructions. Our findings reveal that substantial performance improvements can be achieved through the mere application of curriculum ordering to instruction data, with gains of +4.76 on TruthfulQA, +2.98 on MMLU, +2.8 on OpenbookQA, and +1.28 on ARC-hard compared to random shuffling. This enhancement incurs no additional computational expense. Through comprehensive experimentation, we observe that the advantages of our proposed method are consistently evident across nine benchmarks.
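The curriculum ordering the abstract describes amounts to sorting the instruction data from easy to hard along two axes before training. A minimal sketch, assuming hypothetical `stage` and `complexity` fields as stand-ins for the paper's two difficulty criteria (the actual data schema is not specified here):

```python
# Minimal sketch of curriculum ordering for instruction tuning.
# The "stage" and "complexity" fields are hypothetical stand-ins for
# the two ordering criteria: (A) subject-matter level and (B) instruction intricacy.

STAGE_RANK = {"middle_school": 0, "high_school": 1, "undergraduate": 2, "graduate": 3}

def curriculum_order(examples):
    """Sort instruction-response pairs from easy to hard:
    first by educational stage, then by instruction complexity."""
    return sorted(
        examples,
        key=lambda ex: (STAGE_RANK[ex["stage"]], ex["complexity"]),
    )

data = [
    {"stage": "graduate", "complexity": 2, "instruction": "Derive the dual of ..."},
    {"stage": "middle_school", "complexity": 3, "instruction": "Explain why ..."},
    {"stage": "middle_school", "complexity": 1, "instruction": "Add the fractions ..."},
]

ordered = curriculum_order(data)
print([(ex["stage"], ex["complexity"]) for ex in ordered])
```

In contrast to `random.shuffle(data)`, this deterministic ordering is the only change to the training setup, which is why it adds no computational cost.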
Anthology ID:
2024.findings-naacl.82
Volume:
Findings of the Association for Computational Linguistics: NAACL 2024
Month:
June
Year:
2024
Address:
Mexico City, Mexico
Editors:
Kevin Duh, Helena Gomez, Steven Bethard
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1281–1309
URL:
https://aclanthology.org/2024.findings-naacl.82
DOI:
10.18653/v1/2024.findings-naacl.82
Cite (ACL):
Bruce W Lee, Hyunsoo Cho, and Kang Min Yoo. 2024. Instruction Tuning with Human Curriculum. In Findings of the Association for Computational Linguistics: NAACL 2024, pages 1281–1309, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
Instruction Tuning with Human Curriculum (Lee et al., Findings 2024)
PDF:
https://preview.aclanthology.org/landing_page/2024.findings-naacl.82.pdf