BARTABSA++: Revisiting BARTABSA with Decoder LLMs

Jan Pfister, Tom Völker, Anton Vlasjuk, Andreas Hotho


Abstract
We revisit the BARTABSA framework for aspect-based sentiment analysis with modern decoder LLMs to assess the importance of explicit structure modeling today. Our updated implementation, BARTABSA++, features architectural enhancements that boost performance and training stability. Systematic testing with various encoder-decoder architectures shows that BARTABSA++ with BART-Large achieves state-of-the-art results, even surpassing a finetuned GPT-4o model. Our analysis indicates the encoder's representational quality is vital, while the decoder's role is minimal, explaining the limited benefits of scaling decoder-only LLMs for this task. These findings underscore the complementary roles of explicit structured modeling and large language models, indicating structured approaches remain competitive for tasks requiring precise relational information extraction.
Anthology ID:
2025.xllm-1.13
Volume:
Proceedings of the 1st Joint Workshop on Large Language Models and Structure Modeling (XLLM 2025)
Month:
August
Year:
2025
Address:
Vienna, Austria
Editors:
Hao Fei, Kewei Tu, Yuhui Zhang, Xiang Hu, Wenjuan Han, Zixia Jia, Zilong Zheng, Yixin Cao, Meishan Zhang, Wei Lu, N. Siddharth, Lilja Øvrelid, Nianwen Xue, Yue Zhang
Venues:
XLLM | WS
Publisher:
Association for Computational Linguistics
Pages:
115–128
URL:
https://preview.aclanthology.org/landing_page/2025.xllm-1.13/
Cite (ACL):
Jan Pfister, Tom Völker, Anton Vlasjuk, and Andreas Hotho. 2025. BARTABSA++: Revisiting BARTABSA with Decoder LLMs. In Proceedings of the 1st Joint Workshop on Large Language Models and Structure Modeling (XLLM 2025), pages 115–128, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
BARTABSA++: Revisiting BARTABSA with Decoder LLMs (Pfister et al., XLLM 2025)
PDF:
https://preview.aclanthology.org/landing_page/2025.xllm-1.13.pdf