Teaching Large Language Models Number-Focused Headline Generation With Key Element Rationales

Zhen Qian, Xiuzhen Zhang, Xiaofei Xu, Feng Xia


Abstract
Number-focused headline generation is a summarization task that requires both high textual quality and precise numerical accuracy, posing a unique challenge for Large Language Models (LLMs). Existing studies focus on either textual quality or numerical reasoning alone and are thus inadequate for this challenge. In this paper, we propose a novel chain-of-thought framework that uses rationales comprising the key elements of Topic, Entities, and Numerical reasoning (TEN) in news articles to enhance the ability of LLMs to generate topic-aligned, high-quality headlines with precise numerical accuracy. Specifically, a teacher LLM generates TEN rationales as supervision data, which are then used to fine-tune a student LLM. Our approach teaches the student LLM to automatically generate rationales, strengthening its numerical reasoning and topic-aligned numerical headline generation. Experiments show that our approach achieves superior performance in both textual quality and numerical accuracy.
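The abstract describes a teacher-student rationale-distillation pipeline: a teacher LLM produces TEN (Topic, Entities, Numerical reasoning) rationales, which are paired with gold headlines to fine-tune a student. Below is a minimal Python sketch of how such a pipeline might be wired up. The prompt wording, the call_teacher_llm stub, the example article, and the JSONL fine-tuning format are illustrative assumptions, not the paper's exact setup.

```python
# Sketch of a TEN rationale distillation loop (assumed design, not the
# paper's released code). A teacher model is prompted to extract Topic,
# Entities, and Numerical reasoning; the student is then fine-tuned to
# emit the rationale before the headline, chain-of-thought style.
import json

TEN_PROMPT = (
    "Read the news article and extract a rationale with key elements:\n"
    "Topic: <main topic>\n"
    "Entities: <salient entities>\n"
    "Numerical reasoning: <how the headline number is derived>\n\n"
    "Article: {article}"
)

def call_teacher_llm(prompt: str) -> str:
    # Placeholder: in practice this would call a hosted teacher model via
    # its API client. A canned rationale keeps the sketch runnable.
    return (
        "Topic: quarterly earnings\n"
        "Entities: Acme Corp\n"
        "Numerical reasoning: revenue rose from $2.0B to $2.5B, a 25% increase"
    )

def build_supervision_example(article: str, headline: str) -> dict:
    """Pair a teacher-generated TEN rationale with the gold headline."""
    rationale = call_teacher_llm(TEN_PROMPT.format(article=article))
    return {
        "input": f"Article: {article}\nGenerate a TEN rationale, then a headline.",
        "output": f"{rationale}\nHeadline: {headline}",
    }

def write_finetune_file(pairs, path="ten_sft.jsonl"):
    # One JSON object per line: the usual input format for SFT toolkits.
    with open(path, "w") as f:
        for article, headline in pairs:
            f.write(json.dumps(build_supervision_example(article, headline)) + "\n")

if __name__ == "__main__":
    pairs = [(
        "Acme Corp reported revenue of $2.5B, up from $2.0B a year earlier.",
        "Acme revenue jumps 25% to $2.5B",
    )]
    write_finetune_file(pairs)
```

Training the student on rationale-then-headline outputs is what lets it generate its own rationales at inference time, rather than depending on the teacher.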
Anthology ID: 2025.findings-naacl.33
Volume: Findings of the Association for Computational Linguistics: NAACL 2025
Month: April
Year: 2025
Address: Albuquerque, New Mexico
Editors: Luis Chiruzzo, Alan Ritter, Lu Wang
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 533–550
URL: https://preview.aclanthology.org/fix-sig-urls/2025.findings-naacl.33/
Cite (ACL): Zhen Qian, Xiuzhen Zhang, Xiaofei Xu, and Feng Xia. 2025. Teaching Large Language Models Number-Focused Headline Generation With Key Element Rationales. In Findings of the Association for Computational Linguistics: NAACL 2025, pages 533–550, Albuquerque, New Mexico. Association for Computational Linguistics.
Cite (Informal): Teaching Large Language Models Number-Focused Headline Generation With Key Element Rationales (Qian et al., Findings 2025)
PDF: https://preview.aclanthology.org/fix-sig-urls/2025.findings-naacl.33.pdf