Abstract
Analogy is one of the core capacities of human cognition; when faced with new situations, we often transfer prior experience from other domains. Most work on computational analogy relies heavily on complex, manually crafted input. In this work, we relax the input requirements, requiring only the names of the entities to be mapped. We automatically extract commonsense representations and use them to identify a mapping between the entities. Unlike previous work, our framework can handle partial analogies and suggest new entities to be added. Moreover, our method’s output is easily interpretable, allowing users to understand why a specific mapping was chosen. Experiments show that our model correctly maps 81.2% of classical 2x2 analogy problems (guess level = 50%). On larger problems, it achieves 77.8% accuracy (mean guess level = 13.1%). In another experiment, we show that our algorithm outperforms humans and that its automatic suggestions of new entities resemble those made by humans. We hope this work will advance computational analogy by paving the way toward more flexible, realistic input requirements with broader applicability.
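To make the task format concrete, the following is a minimal, self-contained Python sketch of the problem the abstract describes: given only the names of entities in two domains, find the mapping between them that best preserves relational structure. The hard-coded relation set, the `score` and `best_mapping` helpers, and the brute-force search are illustrative assumptions, not the authors' implementation; FAME retrieves commonsense relations automatically and scales well beyond exhaustive search.

```python
from itertools import permutations

# Toy commonsense relations (subject, relation, object). In FAME these
# are extracted automatically; this hard-coded set is hypothetical.
RELATIONS = {
    ("planet", "revolves around", "sun"),
    ("electron", "revolves around", "nucleus"),
    ("sun", "attracts", "planet"),
    ("nucleus", "attracts", "electron"),
}

def score(mapping: dict) -> int:
    """Count base-domain relations preserved under the mapping."""
    return sum(
        (mapping[s], r, mapping[o]) in RELATIONS
        for (s, r, o) in RELATIONS
        if s in mapping and o in mapping
    )

def best_mapping(base: list, target: list) -> dict:
    """Brute-force search over all bijections (fine at toy scale;
    the paper's engine handles larger and partial mappings)."""
    return max(
        (dict(zip(base, perm)) for perm in permutations(target)),
        key=score,
    )

# Given only entity names, recover the classic solar-system/atom analogy.
print(best_mapping(["sun", "planet"], ["nucleus", "electron"]))
# {'sun': 'nucleus', 'planet': 'electron'}
```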
- Anthology ID: 2023.emnlp-main.1023
- Volume: Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
- Month: December
- Year: 2023
- Address: Singapore
- Editors: Houda Bouamor, Juan Pino, Kalika Bali
- Venue: EMNLP
- Publisher: Association for Computational Linguistics
- Pages: 16426–16442
- URL: https://aclanthology.org/2023.emnlp-main.1023
- DOI: 10.18653/v1/2023.emnlp-main.1023
- Cite (ACL): Shahar Jacob, Chen Shani, and Dafna Shahaf. 2023. FAME: Flexible, Scalable Analogy Mappings Engine. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 16426–16442, Singapore. Association for Computational Linguistics.
- Cite (Informal): FAME: Flexible, Scalable Analogy Mappings Engine (Jacob et al., EMNLP 2023)
- PDF: https://preview.aclanthology.org/ingest-acl-2023-videos/2023.emnlp-main.1023.pdf