Abstract
In this paper, we explore the task of automatically generating natural language descriptions of salient patterns in a time series, such as the stock prices of a company over a week. A model for this task should be able to extract high-level patterns such as the presence of a peak or a dip. While typical contemporary neural models with attention mechanisms can generate fluent output descriptions for this task, they often generate factually incorrect descriptions. We propose a computational model with a truth-conditional architecture which first runs small learned programs on the input time series, then identifies the programs/patterns which hold true for the given input, and finally conditions on *only* the chosen valid program (rather than the input time series) to generate the output text description. A program in our model is constructed from modules, which are small neural networks that are designed to capture numerical patterns and temporal information. The modules are shared across multiple programs, enabling compositionality as well as efficient learning of module parameters. The modules, as well as the composition of the modules, are unobserved in data, and we learn them in an end-to-end fashion with the only training signal coming from the accompanying natural language text descriptions. We find that the proposed model is able to generate high-precision captions even though we consider a small and simple space of module types.

- Anthology ID:
- 2021.emnlp-main.55
- Volume:
- Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
- Month:
- November
- Year:
- 2021
- Address:
- Online and Punta Cana, Dominican Republic
- Editors:
- Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
- Venue:
- EMNLP
- Publisher:
- Association for Computational Linguistics
- Pages:
- 719–733
- URL:
- https://aclanthology.org/2021.emnlp-main.55
- DOI:
- 10.18653/v1/2021.emnlp-main.55
- Cite (ACL):
- Harsh Jhamtani and Taylor Berg-Kirkpatrick. 2021. Truth-Conditional Captions for Time Series Data. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 719–733, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
- Cite (Informal):
- Truth-Conditional Captions for Time Series Data (Jhamtani & Berg-Kirkpatrick, EMNLP 2021)
- PDF:
- https://preview.aclanthology.org/naacl24-info/2021.emnlp-main.55.pdf
- Code
- harsh19/truce
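To make the truth-conditional idea in the abstract concrete, here is a minimal, hypothetical sketch of the pipeline it describes: run small "programs" over the input time series, keep only a program that evaluates to true, and generate the caption from that program alone rather than from the raw series. In the actual model the modules are small learned neural networks composed into programs and trained end-to-end from captions; the hand-written pattern detectors and templates below (all names are illustrative, not from the paper's code) merely stand in for those learned components.

```python
# Illustrative stand-ins for learned pattern modules: each takes a time
# series (list of numbers) and returns a truth value.

def has_peak(series):
    # True if some interior point rises clearly above both neighbors.
    return any(series[i] > series[i - 1] + 1 and series[i] > series[i + 1] + 1
               for i in range(1, len(series) - 1))

def has_dip(series):
    # Mirror image of has_peak: an interior point clearly below both neighbors.
    return any(series[i] < series[i - 1] - 1 and series[i] < series[i + 1] - 1
               for i in range(1, len(series) - 1))

def rises(series):
    # Overall upward trend from the first point to the last.
    return series[-1] > series[0]

# Each "program" pairs a pattern module with a text template.
PROGRAMS = [
    (has_peak, "the price shows a peak during the period"),
    (has_dip,  "the price dips during the period"),
    (rises,    "the price rises over the period"),
]

def caption(series):
    # Truth-conditional step: condition the output on *only* the chosen
    # valid program, never on the raw input series itself.
    for module, template in PROGRAMS:
        if module(series):
            return template
    return "no salient pattern detected"

print(caption([10, 11, 15, 11, 10]))  # → "the price shows a peak during the period"
print(caption([1, 2, 3, 4]))          # → "the price rises over the period"
```

Because the caption is generated only from the program that actually held true on the input, the text cannot describe a pattern the series does not contain, which is the factual-precision property the paper's architecture is designed for.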