CompAct: Compressing Retrieved Documents Actively for Question Answering

Chanwoong Yoon, Taewhoo Lee, Hyeon Hwang, Minbyul Jeong, Jaewoo Kang


Abstract
Retrieval-augmented generation enables language models to strengthen their factual grounding by providing external context. However, language models often struggle when given extensive information, diminishing their effectiveness at answering questions. Context compression tackles this issue by filtering out irrelevant information, but current methods still falter in realistic scenarios where crucial information cannot be captured in a single step. To overcome this limitation, we introduce CompAct, a novel framework that employs an active strategy to condense extensive documents without losing key information. Our experiments demonstrate that CompAct yields significant improvements in both performance and compression rate on multi-hop question-answering benchmarks. CompAct operates flexibly as a cost-efficient plug-in module with various off-the-shelf retrievers or readers, achieving exceptionally high compression rates (47x).
Anthology ID:
2024.emnlp-main.1194
Volume:
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
21424–21439
URL:
https://preview.aclanthology.org/jlcl-multiple-ingestion/2024.emnlp-main.1194/
DOI:
10.18653/v1/2024.emnlp-main.1194
Bibkey:
Cite (ACL):
Chanwoong Yoon, Taewhoo Lee, Hyeon Hwang, Minbyul Jeong, and Jaewoo Kang. 2024. CompAct: Compressing Retrieved Documents Actively for Question Answering. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 21424–21439, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
CompAct: Compressing Retrieved Documents Actively for Question Answering (Yoon et al., EMNLP 2024)
PDF:
https://preview.aclanthology.org/jlcl-multiple-ingestion/2024.emnlp-main.1194.pdf