A Sequence-to-Sequence&Set Model for Text-to-Table Generation

Tong Li, Zhihao Wang, Liangying Shao, Xuling Zheng, Xiaoli Wang, Jinsong Su


Abstract
Recently, the text-to-table generation task has attracted increasing attention due to its wide range of applications. The dominant model formalizes this task as sequence-to-sequence generation and, during training, serializes each table into a token sequence by concatenating all rows in top-down order. However, this approach suffers from two serious defects: 1) the predefined order introduces an incorrect bias during training, heavily penalizing outputs that merely reorder rows; 2) the error propagation problem becomes serious when the model outputs a long token sequence. In this paper, we first conduct a preliminary study to demonstrate that the generation of most rows is order-insensitive. We then propose a novel sequence-to-sequence&set text-to-table generation model. Specifically, in addition to a text encoder that encodes the input text, our model is equipped with a table header generator that first outputs the table header, i.e., the first row of the table, in the manner of sequence generation. A table body generator with learnable row embeddings and column embeddings then generates the set of table body rows in parallel. In particular, to handle the fact that there is no fixed correspondence between generated table body rows and target rows during training, we propose a target assignment strategy based on bipartite matching between the first cells of the generated and target table body rows. Experimental results show that our model significantly surpasses the baselines, achieving state-of-the-art performance on commonly used datasets.
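For illustration, the bipartite-matching target assignment described in the abstract could look roughly like the sketch below. This is a minimal sketch, not the authors' released code: the function name, the layout of the prediction tensor, and the use of negative log-likelihood as the matching cost are all assumptions; the Hungarian algorithm (scipy.optimize.linear_sum_assignment) is one standard way to solve the underlying bipartite matching.

import numpy as np
from scipy.optimize import linear_sum_assignment

def assign_targets(pred_logprobs, target_first_cells):
    # pred_logprobs: (num_pred_rows, max_len, vocab_size) array of
    # log-probabilities over the first cell of each generated body row
    # (hypothetical layout; assumes max_len >= every target cell length).
    # target_first_cells: list of token-id sequences, one per target row.
    num_pred, num_tgt = pred_logprobs.shape[0], len(target_first_cells)
    cost = np.zeros((num_pred, num_tgt))
    for i in range(num_pred):
        for j, cell in enumerate(target_first_cells):
            # One plausible matching cost: the negative log-likelihood of
            # the target row's first cell under the i-th generated row.
            cost[i, j] = -sum(pred_logprobs[i, t, tok]
                              for t, tok in enumerate(cell))
    # Hungarian algorithm: minimum-cost one-to-one assignment.
    pred_idx, tgt_idx = linear_sum_assignment(cost)
    return list(zip(pred_idx.tolist(), tgt_idx.tolist()))

With such an assignment in hand, the training loss for each generated row would be computed against its matched target row, so the model is not penalized for producing correct rows in a different order.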
Anthology ID:
2023.findings-acl.330
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
5358–5370
URL:
https://aclanthology.org/2023.findings-acl.330
DOI:
10.18653/v1/2023.findings-acl.330
Cite (ACL):
Tong Li, Zhihao Wang, Liangying Shao, Xuling Zheng, Xiaoli Wang, and Jinsong Su. 2023. A Sequence-to-Sequence&Set Model for Text-to-Table Generation. In Findings of the Association for Computational Linguistics: ACL 2023, pages 5358–5370, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
A Sequence-to-Sequence&Set Model for Text-to-Table Generation (Li et al., Findings 2023)
PDF:
https://preview.aclanthology.org/improve-issue-templates/2023.findings-acl.330.pdf