Abstract
A long-term ambition of information-seeking QA systems is to reason over multi-modal contexts and generate natural answers to user queries. Today, memory-intensive pre-trained language models are adapted to downstream tasks such as QA by fine-tuning the model on QA data in a specific modality like unstructured text or structured tables. To avoid training such memory-hungry models while utilizing a uniform architecture for each modality, parameter-efficient adapters add and train small task-specific bottleneck layers between transformer layers. In this work, we study parameter-efficient abstractive QA in encoder-decoder models over structured tabular data and unstructured textual data using only 1.5% additional parameters for each modality. We also ablate over adapter layers in both encoder and decoder modules to study the efficiency-performance trade-off and demonstrate that reducing additional trainable parameters down to 0.7%-1.0% leads to comparable results. Our models outperform current state-of-the-art models on tabular QA datasets such as Tablesum and FeTaQA, and achieve comparable performance on a textual QA dataset such as NarrativeQA using significantly fewer trainable parameters than fine-tuning.
- Anthology ID: 2022.dialdoc-1.5
- Volume: Proceedings of the Second DialDoc Workshop on Document-grounded Dialogue and Conversational Question Answering
- Month: May
- Year: 2022
- Address: Dublin, Ireland
- Editors: Song Feng, Hui Wan, Caixia Yuan, Han Yu
- Venue: dialdoc
- Publisher: Association for Computational Linguistics
- Pages: 41–53
- URL: https://aclanthology.org/2022.dialdoc-1.5
- DOI: 10.18653/v1/2022.dialdoc-1.5
- Cite (ACL): Vaishali Pal, Evangelos Kanoulas, and Maarten de Rijke. 2022. Parameter-Efficient Abstractive Question Answering over Tables or Text. In Proceedings of the Second DialDoc Workshop on Document-grounded Dialogue and Conversational Question Answering, pages 41–53, Dublin, Ireland. Association for Computational Linguistics.
- Cite (Informal): Parameter-Efficient Abstractive Question Answering over Tables or Text (Pal et al., dialdoc 2022)
- PDF: https://preview.aclanthology.org/nschneid-patch-2/2022.dialdoc-1.5.pdf
- Code: kolk/pea-qa
- Data: NarrativeQA
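The abstract describes adapters as small bottleneck layers inserted between transformer layers, adding roughly 1.5% extra parameters per modality. The following is a minimal illustrative sketch of that idea, not the paper's implementation (which lives at kolk/pea-qa); the sizes used (d_model=1024, bottleneck=64, 24 layers, ~406M total parameters, roughly BART-large scale) are assumptions chosen only to show how a ~1.5% parameter budget can arise.

```python
import numpy as np

def adapter(h, W_down, b_down, W_up, b_up):
    """Bottleneck adapter: down-project, ReLU, up-project, residual add."""
    z = np.maximum(h @ W_down + b_down, 0.0)  # (batch, bottleneck)
    return h + z @ W_up + b_up                # residual connection

def adapter_params(d_model, bottleneck):
    """Trainable parameters of one adapter: two linear maps with biases."""
    return d_model * bottleneck + bottleneck + bottleneck * d_model + d_model

d_model, bottleneck = 1024, 64  # assumed sizes for illustration
rng = np.random.default_rng(0)
h = rng.standard_normal((2, d_model))
W_down = rng.standard_normal((d_model, bottleneck)) * 0.01
W_up = np.zeros((bottleneck, d_model))  # zero init: adapter starts as identity
out = adapter(h, W_down, np.zeros(bottleneck), W_up, np.zeros(d_model))

# Assuming two adapters per transformer layer (after attention and after the
# feed-forward block) and 24 layers in the encoder-decoder model:
added = 2 * 24 * adapter_params(d_model, bottleneck)
total = 406_000_000  # assumed full-model parameter count
print(f"added params: {added:,} ({100 * added / total:.2f}% of the model)")
```

With these assumed sizes the adapters contribute about 6.3M trainable parameters, i.e. on the order of the 1.5% overhead the abstract reports; shrinking the bottleneck dimension is the knob that trades parameters against performance in the 0.7%-1.0% ablations.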