Instruction-tuned QwenChart for Chart Question Answering

Viviana Ventura, Lukas Amadeus Kleybolte, Alessandra Zarcone


Abstract
Charts, where information is delivered holistically by visual and textual features, represent a challenge when it comes to downstream tasks such as chart question answering, where both kinds of information contribute to the task. The standard approach is to decouple the task into two steps: first extracting information from the charts, or representing them as a table, text, or code, and then a second reasoning step to output the answers. Today, advances in the visual encoding of Visual Large Language Models (VLLMs) have shown their capability to solve such complex tasks without intermediate representations of the charts or massive in-domain training. Our new instruction-fine-tuned and chain-of-thought model QwenChart shows that, even on a complex new benchmark such as SciVQA, general models can achieve strong performance with low-cost training, matching the capabilities that LLMs have shown in unimodal downstream tasks. An out-of-domain evaluation showed satisfactory results, albeit with an expected drop in performance.
Anthology ID:
2025.sdp-1.22
Volume:
Proceedings of the Fifth Workshop on Scholarly Document Processing (SDP 2025)
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Tirthankar Ghosal, Philipp Mayr, Amanpreet Singh, Aakanksha Naik, Georg Rehm, Dayne Freitag, Dan Li, Sonja Schimmler, Anita De Waard
Venues:
sdp | WS
SIG:
Publisher:
Association for Computational Linguistics
Note:
Pages:
240–251
Language:
URL:
https://preview.aclanthology.org/landing_page/2025.sdp-1.22/
DOI:
10.18653/v1/2025.sdp-1.22
Bibkey:
Cite (ACL):
Viviana Ventura, Lukas Amadeus Kleybolte, and Alessandra Zarcone. 2025. Instruction-tuned QwenChart for Chart Question Answering. In Proceedings of the Fifth Workshop on Scholarly Document Processing (SDP 2025), pages 240–251, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
Instruction-tuned QwenChart for Chart Question Answering (Ventura et al., sdp 2025)
PDF:
https://preview.aclanthology.org/landing_page/2025.sdp-1.22.pdf