SubmissionNumber#=%=#41
FinalPaperTitle#=%=#NCL_NLP at SemEval-2024 Task 7: CoT-NumHG: A CoT-Based SFT Training Strategy with Large Language Models for Number-Focused Headline Generation
ShortPaperTitle#=%=#
NumberOfPages#=%=#9
CopyrightSigned#=%=#Junzhe Zhao
JobTitle#==#
Organization#==#
Abstract#==#Headline generation is an essential task in Natural Language Processing (NLP), yet models often exhibit a limited ability to interpret numerals accurately, leading to errors in generated headlines. This paper introduces CoT-NumHG, a training strategy that applies the Chain of Thought (CoT) paradigm to the Supervised Fine-Tuning (SFT) of large language models, aiming to enhance numeral perception, interpretability, accuracy, and the generation of structured outputs. SemEval-2024 Task 7 (task 3): Numeral-Aware Headline Generation (English) is divided into two subtasks: the first focuses on numerical reasoning, requiring models to precisely calculate and fill in missing numbers in news headlines, while the second targets the generation of complete headlines. We apply the same training strategy to both subtasks and primarily explore the first as a demonstration of our approach. In this competition, our CoT-NumHG-Mistral-7B model attained an accuracy of 94%, underscoring the effectiveness of the proposed strategy.
Author{1}{Firstname}#=%=#Junzhe
Author{1}{Lastname}#=%=#Zhao
Author{1}{Email}#=%=#zhaojunzhe_bit@163.com
Author{1}{Affiliation}#=%=#Hangzhou Zero Matrix Intelligence Co., Ltd, China
Author{2}{Firstname}#=%=#Yingxi
Author{2}{Lastname}#=%=#Wang
Author{2}{Email}#=%=#wangyingxiclaire@163.com
Author{2}{Affiliation}#=%=#Huawei Technologies Co., Ltd., China
Author{3}{Firstname}#=%=#Huizhi
Author{3}{Lastname}#=%=#Liang
Author{3}{Email}#=%=#huizhi.liang@newcastle.ac.uk
Author{3}{Affiliation}#=%=#Newcastle University
Author{4}{Firstname}#=%=#Nicolay
Author{4}{Lastname}#=%=#Rusnachenko
Author{4}{Email}#=%=#rusnicolay@gmail.com
Author{4}{Affiliation}#=%=#Newcastle University

==========