An Inner Table Retriever for Robust Table Question Answering

Weizhe Lin, Rexhina Blloshmi, Bill Byrne, Adria de Gispert, Gonzalo Iglesias


Abstract
Recent years have witnessed the thriving of pretrained Transformer-based language models for understanding semi-structured tables, with several applications, such as Table Question Answering (TableQA). These models are typically trained on joint tables and surrounding natural language text, by linearizing table content into sequences comprising special tokens and cell information. This yields very long sequences that reduce system efficiency, and moreover, simply truncating long sequences results in information loss for downstream tasks. We propose Inner Table Retriever (ITR), a general-purpose approach for handling long tables in TableQA that extracts sub-tables to preserve the most relevant information for a question. We show that ITR can be easily integrated into existing systems to improve their accuracy by 1.3-4.8% and achieve state-of-the-art results in two benchmarks, i.e., 63.4% in WikiTableQuestions and 92.1% in WikiSQL. Additionally, we show that ITR makes TableQA systems more robust to reduced model capacity and to different ordering of columns and rows. We make our code available at: https://github.com/amazon-science/robust-tableqa.
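To make the sub-table idea in the abstract concrete, below is a minimal, illustrative sketch of question-conditioned row selection under a token budget. It is not the authors' ITR implementation (see the linked repository for that); scoring here uses simple lexical overlap rather than a learned retriever, and all function and parameter names (tokenize, select_sub_table, max_tokens) are hypothetical.

```python
import re
from typing import List


def tokenize(text: str) -> set:
    """Lowercase and split into word tokens (illustrative, not the paper's tokenizer)."""
    return set(re.findall(r"\w+", text.lower()))


def select_sub_table(question: str,
                     header: List[str],
                     rows: List[List[str]],
                     max_tokens: int = 128) -> List[List[str]]:
    """Keep the rows most relevant to the question that fit in a token budget,
    preserving the table's original row order."""
    q_tokens = tokenize(question)

    # Score each row by word overlap with the question.
    scored = []
    for idx, row in enumerate(rows):
        row_tokens = tokenize(" ".join(str(c) for c in row))
        scored.append((len(q_tokens & row_tokens), idx))
    # Highest-overlap rows first; ties broken by original position.
    scored.sort(key=lambda x: (-x[0], x[1]))

    # Reserve room for the header, then greedily add top-ranked rows.
    budget = max_tokens - len(" ".join(header).split())
    kept = []
    for _, idx in scored:
        cost = len(" ".join(str(c) for c in rows[idx]).split())
        if cost <= budget:
            kept.append(idx)
            budget -= cost
    kept.sort()  # restore original table order before linearization
    return [rows[i] for i in kept]


if __name__ == "__main__":
    header = ["Year", "Host city", "Country"]
    rows = [
        ["2008", "Beijing", "China"],
        ["2012", "London", "United Kingdom"],
        ["2016", "Rio de Janeiro", "Brazil"],
    ]
    question = "Which country hosted the games in 2016?"
    # With a tight budget only the 2016 row is retained.
    print(select_sub_table(question, header, rows, max_tokens=10))
```

The resulting sub-table can then be linearized and fed to a TableQA model in place of the full (possibly truncated) table; the paper's actual ITR ranks table slices with a trained dense retriever rather than lexical overlap.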
Anthology ID:
2023.acl-long.551
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
9909–9926
URL:
https://preview.aclanthology.org/build-pipeline-with-new-library/2023.acl-long.551/
DOI:
10.18653/v1/2023.acl-long.551
Cite (ACL):
Weizhe Lin, Rexhina Blloshmi, Bill Byrne, Adria de Gispert, and Gonzalo Iglesias. 2023. An Inner Table Retriever for Robust Table Question Answering. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 9909–9926, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
An Inner Table Retriever for Robust Table Question Answering (Lin et al., ACL 2023)
PDF:
https://preview.aclanthology.org/build-pipeline-with-new-library/2023.acl-long.551.pdf
Video:
https://preview.aclanthology.org/build-pipeline-with-new-library/2023.acl-long.551.mp4