Time-aware Prompting for Text Generation

Shuyang Cao, Lu Wang


Abstract
In this paper, we study the effects of incorporating timestamps, such as document creation dates, into generation systems. Two types of time-aware prompts are investigated: (1) textual prompts that encode document timestamps in natural language sentences; and (2) linear prompts that convert timestamps into continuous vectors. To explore extrapolation to future data points, we further introduce a new data-to-text generation dataset, TempWikiBio, containing more than 4 million chronologically ordered revisions of biographical articles from English Wikipedia, each paired with structured personal profiles. Through data-to-text generation on TempWikiBio, text-to-text generation on the content transfer dataset, and summarization on XSum, we show that linear prompts on the encoder and textual prompts improve generation quality on all datasets. Despite suffering a smaller performance drop when tested on data drawn from a later time, linear prompts focus more on non-temporal information and are less sensitive to the given timestamps, according to human evaluations and sensitivity analyses. Meanwhile, textual prompts establish the association between the given timestamps and the output dates, yielding more factual temporal information in the output.
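
The abstract describes two ways of conditioning a generator on a timestamp. The following is a minimal PyTorch sketch of how each could be realized; the names (textual_prompt, LinearPrompt), the single-layer projection, and the timestamp normalization are illustrative assumptions, not the authors' released implementation.

import torch
import torch.nn as nn

def textual_prompt(document: str, timestamp: str) -> str:
    """Textual prompt: encode the document timestamp as a natural-language
    sentence prepended to the input text. The exact wording here is a
    hypothetical template."""
    return f"This article was written on {timestamp}. {document}"

class LinearPrompt(nn.Module):
    """Linear prompt: map a scalar timestamp to continuous vectors that are
    prepended to the encoder's input embeddings."""

    def __init__(self, hidden_size: int, prompt_length: int = 1):
        super().__init__()
        self.prompt_length = prompt_length
        self.hidden_size = hidden_size
        # A linear layer projects the timestamp into prompt vectors.
        self.proj = nn.Linear(1, prompt_length * hidden_size)

    def forward(self, timestamps: torch.Tensor, input_embeds: torch.Tensor):
        # timestamps: (batch,) scalars, e.g. years normalized to [0, 1]
        # input_embeds: (batch, seq_len, hidden_size) token embeddings
        prompts = self.proj(timestamps.unsqueeze(-1))
        prompts = prompts.view(-1, self.prompt_length, self.hidden_size)
        # Prepend the timestamp-derived vectors to the encoder input,
        # matching the paper's "linear prompts on the encoder" setting.
        return torch.cat([prompts, input_embeds], dim=1)

The textual variant changes only the input string, so the model can tie surface dates in the prompt to dates in the output; the linear variant injects the timestamp as dense vectors, which (per the abstract's findings) generalizes better to later time periods but is less sensitive to the specific date given.
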
Anthology ID:
2022.findings-emnlp.535
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2022
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
7231–7246
URL:
https://aclanthology.org/2022.findings-emnlp.535
Cite (ACL):
Shuyang Cao and Lu Wang. 2022. Time-aware Prompting for Text Generation. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 7231–7246, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Time-aware Prompting for Text Generation (Cao & Wang, Findings 2022)
PDF:
https://preview.aclanthology.org/ingestion-script-update/2022.findings-emnlp.535.pdf