Bowen Song


2025

Divide-Then-Align: Honest Alignment based on the Knowledge Boundary of RAG
Xin Sun | Jianan Xie | Zhongqi Chen | Qiang Liu | Shu Wu | Yuehe Chen | Bowen Song | Zilei Wang | Weiqiang Wang | Liang Wang
Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)

Large language models (LLMs) augmented with retrieval systems have significantly advanced natural language processing tasks by integrating external knowledge sources, enabling more accurate and contextually rich responses. To improve the robustness of such systems against noisy retrievals, Retrieval-Augmented Fine-Tuning (RAFT) has emerged as a widely adopted method. However, RAFT conditions models to generate answers even in the absence of reliable knowledge. This behavior undermines their reliability in high-stakes domains, where acknowledging uncertainty is critical. To address this issue, we propose Divide-Then-Align (DTA), a post-training approach designed to endow RAG systems with the ability to respond with “I don’t know” when the query lies outside the knowledge boundary of both the retrieved passages and the model’s internal knowledge. DTA divides data samples into four knowledge quadrants and constructs tailored preference data for each quadrant, resulting in a curated dataset for Direct Preference Optimization (DPO). Experimental results on three benchmark datasets demonstrate that DTA effectively balances accuracy with appropriate abstention, enhancing the reliability and trustworthiness of retrieval-augmented systems.
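
The quadrant split and preference-pair construction described in the abstract could be sketched as follows. This is a minimal illustration, not the paper's implementation: the predicates has_retrieval_evidence and has_parametric_knowledge, the Sample fields, and the model_guess argument are hypothetical stand-ins for the paper's actual probing and data-curation procedure.

```python
from dataclasses import dataclass

@dataclass
class Sample:
    query: str
    passages: list[str]
    gold_answer: str

IDK = "I don't know."

def quadrant(has_retrieval_evidence: bool, has_parametric_knowledge: bool) -> str:
    # The four knowledge quadrants: whether the retrieved passages and/or the
    # model's internal (parametric) knowledge cover the query.
    return {
        (True, True): "both",
        (True, False): "retrieval_only",
        (False, True): "parametric_only",
        (False, False): "neither",
    }[(has_retrieval_evidence, has_parametric_knowledge)]

def preference_pair(sample: Sample, quad: str, model_guess: str) -> tuple[str, str]:
    # (chosen, rejected) responses for DPO. Abstention is preferred only when
    # neither knowledge source covers the query; otherwise, abstaining when an
    # answer is available is the dispreferred behavior.
    if quad == "neither":
        return IDK, model_guess       # reject the (likely hallucinated) guess
    return sample.gold_answer, IDK    # reject unnecessary abstention
```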

2024

Knowledge GeoGebra: Leveraging Geometry of Relation Embeddings in Knowledge Graph Completion
Kossi Amouzouvi | Bowen Song | Sahar Vahdati | Jens Lehmann
Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)

Knowledge graph embedding (KGE) models provide a low-dimensional representation of knowledge graphs in continuous vector spaces. This representation learning enables different downstream AI tasks such as link prediction for graph completion. However, most embedding models are designed considering only the algebra and geometry of the entity embedding space, the algebra of the relation embedding space, and the interaction between relation and entity embeddings. Neglecting the geometry of the relation embedding space limits the optimization of entity and relation distributions, leading to suboptimal knowledge graph completion performance. To address this issue, we propose a new perspective on the design of KGEs by looking into the geometry of the relation embedding space. The proposed method and its variants are developed on top of an existing framework, RotatE, from which we leverage the geometry of the relation embeddings by first mutating the unit circle into an ellipse and then further generalizing it with the concept of a butterfly curve. Beyond the model's theoretical ability to preserve topological and relational patterns, experiments on the WN18RR, FB15K-237 and YouTube benchmarks show that this new family of KGEs can challenge or outperform state-of-the-art models.
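
The ellipse variant described in the abstract could be sketched as follows, assuming a RotatE-style distance score; the per-dimension parameters theta, a, and b are illustrative names, and setting a = b = 1 recovers RotatE's unit circle. The butterfly-curve generalization would swap in a different parametric curve for the relation embedding.

```python
import numpy as np

def ellipse_relation(theta: np.ndarray, a: np.ndarray, b: np.ndarray) -> np.ndarray:
    # Relation embedding traced on an ellipse instead of RotatE's unit circle;
    # each dimension is a complex point a_k*cos(theta_k) + i*b_k*sin(theta_k).
    return a * np.cos(theta) + 1j * b * np.sin(theta)

def score(h: np.ndarray, theta: np.ndarray, a: np.ndarray,
          b: np.ndarray, t: np.ndarray) -> float:
    # RotatE-style plausibility score: element-wise complex multiplication
    # transforms the head entity embedding; a smaller L1 distance to the tail
    # embedding means a more plausible triple.
    r = ellipse_relation(theta, a, b)
    return -float(np.abs(h * r - t).sum())
```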