Comprehensive Study of Bilingual and Multi-category Instruction Pre-training

Takashi Kodama, Yusuke Oda


Abstract
Instruction pre-training (IPT) has recently emerged as an effective intermediate stage between vanilla pre-training (VPT) and post-training for large language models (LLMs). However, the optimal design of IPT corpora—such as the balance between raw and instruction-response data, languages, and task categories—remains unclear. We systematically study IPT corpus composition using a bilingual (English and Japanese) and multi-category (coding, general, math, and reasoning) instruction-response dataset. Through extensive IPT experiments across four base models, including both English-centric and bilingual LLMs, we find that: (1) more instruction-response data generally enhances model performance, particularly for models with large VPT budgets; (2) Japanese instruction data can improve English performance through cross-lingual transfer; and (3) the effectiveness of post-training varies across categories: coding performance is largely determined during IPT, while math and reasoning continue to improve during post-training.
Anthology ID:
2026.findings-eacl.68
Volume:
Findings of the Association for Computational Linguistics: EACL 2026
Month:
March
Year:
2026
Address:
Rabat, Morocco
Editors:
Vera Demberg, Kentaro Inui, Lluís Marquez
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1323–1340
URL:
https://preview.aclanthology.org/ingest-eacl/2026.findings-eacl.68/
Cite (ACL):
Takashi Kodama and Yusuke Oda. 2026. Comprehensive Study of Bilingual and Multi-category Instruction Pre-training. In Findings of the Association for Computational Linguistics: EACL 2026, pages 1323–1340, Rabat, Morocco. Association for Computational Linguistics.
Cite (Informal):
Comprehensive Study of Bilingual and Multi-category Instruction Pre-training (Kodama & Oda, Findings 2026)
PDF:
https://preview.aclanthology.org/ingest-eacl/2026.findings-eacl.68.pdf
Checklist:
2026.findings-eacl.68.checklist.pdf