Xiran Fan
2025
Enhancing Foundation Models in Transaction Understanding with LLM-based Sentence Embeddings
Xiran Fan | Zhimeng Jiang | Chin-Chia Michael Yeh | Yuzhong Chen | Yingtong Dou | Menghai Pan | Yan Zheng
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing: Industry Track
The ubiquity of payment networks generates vast transactional data encoding rich consumer and merchant behavioral patterns. Recent foundation models for transaction analysis process tabular data sequentially but rely on index-based representations for categorical merchant fields, causing substantial semantic information loss by converting rich textual data into discrete tokens. While Large Language Models (LLMs) can address this limitation through superior semantic understanding, their computational overhead challenges real-time financial deployment. We introduce a hybrid framework that uses LLM-generated embeddings as semantic initializations for lightweight transaction models, balancing interpretability with operational efficiency. Our approach employs multi-source data fusion to enrich merchant categorical fields and a one-word constraint principle for consistent embedding generation across LLM architectures. We systematically address data quality through noise filtering and context-aware enrichment. Experiments on large-scale transaction datasets demonstrate significant performance improvements across multiple transaction understanding tasks.
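The abstract describes seeding a lightweight transaction model with LLM-generated sentence embeddings in place of random, index-based initializations for categorical merchant fields. The sketch below illustrates that initialization pattern only; the encoder choice (`all-MiniLM-L6-v2` via `sentence-transformers`), the sample merchant descriptions, and the projection width are illustrative assumptions, not the paper's actual pipeline.

```python
# Hypothetical sketch: seed a lightweight transaction model's merchant-field
# embedding table with LLM-generated sentence embeddings instead of random,
# index-based initialization. All names and dimensions are illustrative.
import torch
import torch.nn as nn
from sentence_transformers import SentenceTransformer

# Enriched textual descriptions of merchant categories (the paper's
# multi-source fusion would combine e.g. category codes, merchant names,
# and context; its one-word constraint keeps outputs consistent).
merchant_texts = ["grocery", "airline", "restaurant"]

# 1. Generate semantic embeddings with an off-the-shelf sentence encoder.
encoder = SentenceTransformer("all-MiniLM-L6-v2")        # 384-dim output
llm_vecs = torch.tensor(encoder.encode(merchant_texts))  # shape (3, 384)

# 2. Project down to the transaction model's smaller embedding width.
model_dim = 64
proj = nn.Linear(llm_vecs.shape[1], model_dim, bias=False)
with torch.no_grad():
    init = proj(llm_vecs)                                # shape (3, 64)

# 3. Use the projected vectors to initialize the categorical embedding
#    table; the lightweight model then fine-tunes them as usual.
merchant_embedding = nn.Embedding(len(merchant_texts), model_dim)
with torch.no_grad():
    merchant_embedding.weight.copy_(init)
```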
2024
Enhancing Hyperbolic Knowledge Graph Embeddings via Lorentz Transformations
Xiran Fan | Minghua Xu | Huiyuan Chen | Yuzhong Chen | Mahashweta Das | Hao Yang
Findings of the Association for Computational Linguistics: ACL 2024
Knowledge Graph Embedding (KGE) is a powerful technique for predicting missing links in Knowledge Graphs (KGs) by learning representations of entities and relations. Hyperbolic space has emerged as a promising embedding space for KGs due to its ability to represent hierarchical data. Nevertheless, most existing hyperbolic KGE methods rely on tangent-space approximation and are not fully hyperbolic, resulting in distortions and inaccuracies. To overcome this limitation, we propose LorentzKG, a fully hyperbolic KGE method that represents entities as points in the Lorentz model and relations as intrinsic Lorentz transformations between entities. We demonstrate that the Lorentz transformation, which can be decomposed into a Lorentz rotation/reflection and a Lorentz boost, captures various types of relations, including hierarchical structures. Experimental results show that LorentzKG achieves state-of-the-art performance.
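For readers unfamiliar with the Lorentz model, the toy sketch below illustrates the objects the abstract refers to: the Lorentz inner product, points on the hyperboloid, and a Lorentz boost acting as a relation map. It is a minimal (1+1)-dimensional numerical illustration under assumed conventions, not LorentzKG's implementation.

```python
# Minimal numerical sketch (not the paper's code) of Lorentz-model basics:
# the Lorentz inner product, the hyperboloid constraint <x, x>_L = -1, and
# a Lorentz boost applied to an entity embedding as a relation map.
import numpy as np

def lorentz_inner(x, y):
    """Lorentz inner product <x, y>_L = -x0*y0 + x1*y1 + ... + xd*yd."""
    return -x[0] * y[0] + np.dot(x[1:], y[1:])

def lift(v):
    """Lift a Euclidean point v onto the hyperboloid {x : <x,x>_L = -1, x0 > 0}."""
    x0 = np.sqrt(1.0 + np.dot(v, v))
    return np.concatenate(([x0], v))

def boost_2d(phi):
    """Lorentz boost with rapidity phi in (1+1) dimensions: a hyperbolic
    rotation mixing the time-like and one space-like coordinate (a Lorentz
    rotation/reflection would instead act orthogonally on the space part)."""
    return np.array([[np.cosh(phi), np.sinh(phi)],
                     [np.sinh(phi), np.cosh(phi)]])

x = lift(np.array([0.3]))   # entity embedding on the hyperboloid
B = boost_2d(0.7)           # relation modeled as a Lorentz boost
y = B @ x                   # transformed entity

# Boosts preserve the Lorentz inner product, so y stays on the hyperboloid.
print(lorentz_inner(x, x))  # -1.0
print(lorentz_inner(y, y))  # -1.0 (up to floating-point error)
```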