Grouping Entities with Shared Properties using Multi-Facet Prompting and Property Embeddings
Amit Gajbhiye, Thomas Bailleux, Zied Bouraoui, Luis Espinosa-Anke, Steven Schockaert
Abstract
Methods for learning taxonomies from data have been widely studied. We study a specific version of this task, called commonality identification, where only the set of entities is given and we need to find meaningful ways to group those entities. While LLMs should intuitively excel at this task, it is difficult to directly use such models in large domains. In this paper, we instead use LLMs to describe the different properties that are satisfied by each of the entities individually. We then use pre-trained embeddings to cluster these properties, and finally group entities that have properties which belong to the same cluster. To achieve good results, it is paramount that the properties predicted by the LLM are sufficiently diverse. We find that this diversity can be improved by prompting the LLM to structure the predicted properties into different facets of knowledge.
- Anthology ID:
- 2025.emnlp-main.787
- Volume:
- Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
- Month:
- November
- Year:
- 2025
- Address:
- Suzhou, China
- Editors:
- Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
- Venue:
- EMNLP
- Publisher:
- Association for Computational Linguistics
- Pages:
- 15611–15626
- URL:
- https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.787/
- Cite (ACL):
- Amit Gajbhiye, Thomas Bailleux, Zied Bouraoui, Luis Espinosa-Anke, and Steven Schockaert. 2025. Grouping Entities with Shared Properties using Multi-Facet Prompting and Property Embeddings. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 15611–15626, Suzhou, China. Association for Computational Linguistics.
- Cite (Informal):
- Grouping Entities with Shared Properties using Multi-Facet Prompting and Property Embeddings (Gajbhiye et al., EMNLP 2025)
- PDF:
- https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.787.pdf
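The pipeline described in the abstract (LLM-generated properties per entity, property embedding, clustering, then grouping entities whose properties share a cluster) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the hard-coded `entity_properties` stands in for multi-facet LLM output, the word-count `embed` function stands in for a pre-trained embedding model, and the greedy single-link `cluster` routine and its threshold are arbitrary illustrative choices.

```python
from collections import defaultdict
import math

# Step 1 (stand-in for LLM output): properties predicted for each entity.
entity_properties = {
    "sparrow": ["can fly"],
    "eagle":   ["can fly long distances", "is a predator"],
    "salmon":  ["lives in water"],
    "shark":   ["lives in deep water", "is a predator"],
}

def embed(prop):
    # Toy bag-of-words vector; a real system would use pre-trained embeddings.
    vec = defaultdict(float)
    for tok in prop.split():
        vec[tok] += 1.0
    return dict(vec)

def cosine(u, v):
    dot = sum(u.get(k, 0.0) * v.get(k, 0.0) for k in u)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def cluster(props, threshold=0.5):
    # Step 2-3: greedy single-link clustering of property embeddings.
    clusters = []  # each cluster is a list of property strings
    for p in props:
        for c in clusters:
            if any(cosine(embed(p), embed(q)) >= threshold for q in c):
                c.append(p)
                break
        else:
            clusters.append([p])
    return clusters

all_props = sorted({p for ps in entity_properties.values() for p in ps})
clusters = cluster(all_props)

# Step 4: group entities whose properties fall in the same property cluster.
groups = defaultdict(set)
for cid, c in enumerate(clusters):
    for ent, props in entity_properties.items():
        if any(p in c for p in props):
            groups[cid].add(ent)

for cid, ents in sorted(groups.items()):
    print(sorted(clusters[cid]), "->", sorted(ents))
```

On this toy input, the flight-related properties, the predator property, and the water-related properties each form a cluster, yielding the entity groups {sparrow, eagle}, {eagle, shark}, and {salmon, shark}.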