MAFiD: Moving Average Equipped Fusion-in-Decoder for Question Answering over Tabular and Textual Data
Sung-Min Lee, Eunhwan Park, Daeryong Seo, Donghyeon Jeon, Inho Kang, Seung-Hoon Na
Abstract
Transformer-based models for question answering (QA) over tables and texts must handle a "long" hybrid sequence of tabular and textual elements, which poses long-range reasoning problems. To address long-range reasoning, we extensively employ a fusion-in-decoder (FiD) architecture and an exponential moving average (EMA), proposing the Moving Average Equipped Fusion-in-Decoder (MAFiD). With FiD as the backbone, MAFiD combines various levels of reasoning: independent encoding of homogeneous data, single-row heterogeneous reasoning, and multi-row heterogeneous reasoning, using a gated cross-attention layer to effectively aggregate the three resulting types of representations. Experimental results on HybridQA indicate that MAFiD achieves state-of-the-art performance, improving exact match (EM) and F1 by 1.1 and 1.7, respectively, on the blind test set.
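The abstract describes a gated cross-attention layer that aggregates three kinds of encoder representations before FiD-style decoding. The paper's exact formulation is not reproduced on this page, so the following is only a minimal PyTorch sketch of that general idea under assumed shapes; all names (`GatedCrossAttentionFusion`, `fuse`, `homo`, `single_row`, `multi_row`) are hypothetical, and the EMA component is omitted.

```python
# Hypothetical sketch (not the authors' released code): a gated cross-attention
# block that fuses three encoder outputs -- homogeneous, single-row, and
# multi-row representations -- before a FiD-style decoder consumes them.
import torch
import torch.nn as nn


class GatedCrossAttentionFusion(nn.Module):
    def __init__(self, d_model: int, n_heads: int = 8):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Scalar gate computed from the base stream and the attended stream.
        self.gate = nn.Linear(2 * d_model, 1)

    def fuse(self, query: torch.Tensor, context: torch.Tensor) -> torch.Tensor:
        # Cross-attend from the base stream (query) into an auxiliary stream,
        # then mix the two with a learned sigmoid gate.
        attended, _ = self.attn(query, context, context)
        g = torch.sigmoid(self.gate(torch.cat([query, attended], dim=-1)))
        return g * attended + (1.0 - g) * query

    def forward(self, homo: torch.Tensor, single_row: torch.Tensor,
                multi_row: torch.Tensor) -> torch.Tensor:
        # Successively inject single-row and multi-row heterogeneous reasoning
        # into the homogeneous representation.
        fused = self.fuse(homo, single_row)
        return self.fuse(fused, multi_row)


if __name__ == "__main__":
    # Toy usage: batch of 2, 16 tokens, hidden size 256.
    streams = [torch.randn(2, 16, 256) for _ in range(3)]
    out = GatedCrossAttentionFusion(256)(*streams)
    print(out.shape)  # torch.Size([2, 16, 256])
```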
- Anthology ID: 2023.findings-eacl.177
- Volume: Findings of the Association for Computational Linguistics: EACL 2023
- Month: May
- Year: 2023
- Address: Dubrovnik, Croatia
- Editors: Andreas Vlachos, Isabelle Augenstein
- Venue: Findings
- Publisher: Association for Computational Linguistics
- Pages: 2337–2344
- URL: https://aclanthology.org/2023.findings-eacl.177
- DOI: 10.18653/v1/2023.findings-eacl.177
- Cite (ACL): Sung-Min Lee, Eunhwan Park, Daeryong Seo, Donghyeon Jeon, Inho Kang, and Seung-Hoon Na. 2023. MAFiD: Moving Average Equipped Fusion-in-Decoder for Question Answering over Tabular and Textual Data. In Findings of the Association for Computational Linguistics: EACL 2023, pages 2337–2344, Dubrovnik, Croatia. Association for Computational Linguistics.
- Cite (Informal): MAFiD: Moving Average Equipped Fusion-in-Decoder for Question Answering over Tabular and Textual Data (Lee et al., Findings 2023)
- PDF: https://preview.aclanthology.org/fix-volume-bibkeys/2023.findings-eacl.177.pdf