Tom Völker
Also published as: Tom Volker
2025
SALT at SemEval-2025 Task 2: A SQL-based Approach for LLM-Free Entity-Aware-Translation
Tom Volker | Jan Pfister | Andreas Hotho
Proceedings of the 19th International Workshop on Semantic Evaluation (SemEval-2025)
Entity-aware machine translation faces significant challenges when translating culturally-adapted named entities that require knowledge beyond the source text. We present SALT (SQL-based Approach for LLM-Free Entity-Aware-Translation), a parameter-efficient system for SemEval-2025 Task 2. Our approach combines SQL-based entity retrieval with constrained neural translation via logit biasing and explicit entity annotations. Despite its simplicity, it achieves state-of-the-art performance (first place) among approaches not using gold-standard data, while requiring far less computation than LLM-based methods. Our ablation studies show that simple SQL-based retrieval rivals complex neural models, and that strategic model refinement outperforms increased model complexity. SALT offers an alternative to resource-intensive LLM-based approaches, achieving comparable results with only a fraction of the parameters.
BARTABSA++: Revisiting BARTABSA with Decoder LLMs
Jan Pfister | Tom Völker | Anton Vlasjuk | Andreas Hotho
Proceedings of the 1st Joint Workshop on Large Language Models and Structure Modeling (XLLM 2025)
We revisit the BARTABSA framework for aspect-based sentiment analysis with modern decoder LLMs to assess the importance of explicit structure modeling today. Our updated implementation, BARTABSA++, features architectural enhancements that boost performance and training stability. Systematic testing with various encoder-decoder architectures shows that BARTABSA++ with BART-Large achieves state-of-the-art results, even surpassing a fine-tuned GPT-4o model. Our analysis indicates the encoder’s representational quality is vital, while the decoder’s role is minimal, explaining the limited benefits of scaling decoder-only LLMs for this task. These findings underscore the complementary roles of explicit structured modeling and large language models, indicating that structured approaches remain competitive for tasks requiring precise relational information extraction.