Can Large Language Models Understand DL-Lite Ontologies? An Empirical Study

Keyu Wang, Guilin Qi, Jiaqi Li, Songlin Zhai


Abstract
Large language models (LLMs) have achieved remarkable success across a wide range of tasks. Recently, their capability to store, retrieve, and infer with symbolic knowledge has drawn a great deal of attention, showing their potential to understand structured information. However, it is not yet known whether LLMs can understand Description Logic (DL) ontologies. In this work, we empirically analyze LLMs' ability to understand DL-Lite ontologies across six representative tasks covering both syntactic and semantic aspects. Through extensive experiments, we demonstrate both the effectiveness and the limitations of LLMs in understanding DL-Lite ontologies. We find that LLMs can understand the formal syntax and model-theoretic semantics of concepts and roles. However, they struggle to understand TBox NI (negative inclusion) transitivity and to handle ontologies with large ABoxes. We hope that our experiments and analyses provide further insight into LLMs and inspire the development of more faithful knowledge engineering solutions.
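As a brief illustration of the negative inclusion (NI) transitivity mentioned above (our own example for readers' convenience, not taken from the paper): in DL-Lite, chaining a positive inclusion with a negative inclusion entails a further negative inclusion, e.g.

$$A \sqsubseteq B, \quad B \sqsubseteq \neg C \;\;\models\;\; A \sqsubseteq \neg C$$

so any interpretation that places an individual in $A$ must keep it out of $C$. The abstract reports that recognising this kind of implicitly entailed inclusion is precisely where LLMs struggle.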
Anthology ID:
2024.findings-emnlp.141
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2024
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2503–2519
URL:
https://preview.aclanthology.org/fix-sig-urls/2024.findings-emnlp.141/
DOI:
10.18653/v1/2024.findings-emnlp.141
Cite (ACL):
Keyu Wang, Guilin Qi, Jiaqi Li, and Songlin Zhai. 2024. Can Large Language Models Understand DL-Lite Ontologies? An Empirical Study. In Findings of the Association for Computational Linguistics: EMNLP 2024, pages 2503–2519, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
Can Large Language Models Understand DL-Lite Ontologies? An Empirical Study (Wang et al., Findings 2024)
PDF:
https://preview.aclanthology.org/fix-sig-urls/2024.findings-emnlp.141.pdf
Data:
2024.findings-emnlp.141.data.zip