Prompting for Numerical Sequences: A Case Study on Market Comment Generation

Masayuki Kawarada, Tatsuya Ishigaki, Hiroya Takamura


Abstract
Large language models (LLMs) have been applied to a wide range of data-to-text generation tasks, including those whose inputs are tables, graphs, and time-series numerical data. While research on prompting for structured data such as tables and graphs is gaining momentum, in-depth investigations into prompting for time-series numerical data are lacking. This study therefore explores various input representations, including sequences of tokens and structured formats such as HTML, LaTeX, and Python-style code. In our experiments, we focus on the task of Market Comment Generation, which takes a numerical sequence of stock prices as input and generates a corresponding market comment. Contrary to our expectations, the results show that prompts resembling programming languages yield better outcomes, whereas those resembling natural language and longer formats, such as HTML and LaTeX, are less effective. Our findings offer insights into creating effective prompts for tasks that generate text from numerical sequences.
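The abstract does not give the paper's actual prompt templates; as an illustrative sketch (the serialization formats below are assumptions, not the authors' exact templates), a stock-price sequence could be rendered in the representation styles the abstract mentions:

```python
# Illustrative sketch: possible ways to serialize a numerical sequence of
# stock prices into a prompt. These templates are assumptions for
# illustration; the paper's actual formats may differ.

def as_tokens(prices):
    """Plain sequence of tokens separated by spaces (natural-language-like)."""
    return " ".join(str(p) for p in prices)

def as_python(prices):
    """Python-style code: a list literal assigned to a variable."""
    return f"prices = {prices!r}"

def as_html(prices):
    """HTML: a single-row table, one cell per price (a longer format)."""
    cells = "".join(f"<td>{p}</td>" for p in prices)
    return f"<table><tr>{cells}</tr></table>"

def as_latex(prices):
    """LaTeX: a one-row tabular environment (also a longer format)."""
    row = " & ".join(str(p) for p in prices)
    cols = "c" * len(prices)
    return f"\\begin{{tabular}}{{{cols}}} {row} \\end{{tabular}}"

if __name__ == "__main__":
    seq = [38500, 38620, 38710]
    for fmt in (as_tokens, as_python, as_html, as_latex):
        print(fmt(seq))
```

Note how the HTML and LaTeX renderings consume many more tokens than the Python-style rendering for the same data, which is consistent with the abstract's finding that the longer markup formats are less effective.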
Anthology ID:
2024.lrec-main.1155
Volume:
Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)
Month:
May
Year:
2024
Address:
Torino, Italia
Editors:
Nicoletta Calzolari, Min-Yen Kan, Veronique Hoste, Alessandro Lenci, Sakriani Sakti, Nianwen Xue
Venues:
LREC | COLING
Publisher:
ELRA and ICCL
Note:
Pages:
13190–13200
URL:
https://aclanthology.org/2024.lrec-main.1155
Cite (ACL):
Masayuki Kawarada, Tatsuya Ishigaki, and Hiroya Takamura. 2024. Prompting for Numerical Sequences: A Case Study on Market Comment Generation. In Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024), pages 13190–13200, Torino, Italia. ELRA and ICCL.
Cite (Informal):
Prompting for Numerical Sequences: A Case Study on Market Comment Generation (Kawarada et al., LREC-COLING 2024)
PDF:
https://preview.aclanthology.org/nschneid-patch-5/2024.lrec-main.1155.pdf