LATTE: Learning Aligned Transactions and Textual Embeddings for Bank Clients
Egor Fadeev | Dzhambulat Mollaev | Aleksei Shestov | Dima Korolev | Omar Zoloev | Ivan A Kireev | Andrey Savchenko | Maksim Makarenko
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing: Industry Track
Learning client embeddings from sequences of their historical transactions is central to financial applications. While large language models (LLMs) offer general world knowledge, their direct use on long event sequences is computationally expensive and impractical in real-world pipelines. In this paper, we propose LATTE, a contrastive learning framework that aligns raw event embeddings with description-based semantic embeddings from frozen LLMs. Statistical descriptions of client behavior are summarized into short prompts, embedded by the frozen LLM, and used as supervision via a contrastive loss. The proposed approach significantly reduces inference cost and input size compared to conventional LLM processing of complete event sequences. We experimentally show that our method outperforms state-of-the-art techniques for learning event-sequence representations on real-world financial datasets while remaining deployable in latency-sensitive environments.
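The alignment idea in the abstract admits a compact illustration: a lightweight sequence encoder maps a client's events into the frozen LLM's embedding space, and a symmetric InfoNCE-style contrastive loss pulls each sequence embedding toward the LLM embedding of that client's behavioral summary. The sketch below is a minimal PyTorch rendition of this idea; the encoder architecture, dimensions, temperature, and names (SeqEncoder, info_nce) are illustrative assumptions, not the paper's actual implementation.

```python
# Hedged sketch of contrastive alignment between a small event-sequence
# encoder and precomputed frozen-LLM text embeddings. All components here
# are assumptions for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SeqEncoder(nn.Module):
    """Hypothetical event-sequence encoder: embeds event tokens and pools with a GRU."""
    def __init__(self, vocab_size=1000, dim=128, llm_dim=768):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, dim)
        self.rnn = nn.GRU(dim, dim, batch_first=True)
        # Project the pooled state into the frozen LLM's embedding space.
        self.proj = nn.Linear(dim, llm_dim)

    def forward(self, tokens):
        h, _ = self.rnn(self.emb(tokens))
        return self.proj(h[:, -1])  # last hidden state as the client embedding

def info_nce(seq_emb, text_emb, tau=0.07):
    """Symmetric InfoNCE: the i-th sequence should match the i-th LLM text embedding."""
    s = F.normalize(seq_emb, dim=-1)
    t = F.normalize(text_emb, dim=-1)
    logits = s @ t.T / tau
    labels = torch.arange(len(s))
    return (F.cross_entropy(logits, labels) + F.cross_entropy(logits.T, labels)) / 2

# Toy usage: in the paper's setting, text_emb would come from a frozen LLM
# applied to short prompts summarizing each client's behavioral statistics.
enc = SeqEncoder()
tokens = torch.randint(0, 1000, (8, 50))  # batch of 8 clients, 50 events each
text_emb = torch.randn(8, 768)            # placeholder for frozen-LLM embeddings
loss = info_nce(enc(tokens), text_emb)
loss.backward()
```

Because the LLM only embeds short textual summaries offline, inference over a client requires running just the small sequence encoder, which is what makes the approach deployable in latency-sensitive pipelines.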