2024
Limits of Theory of Mind Modelling in Dialogue-Based Collaborative Plan Acquisition
Matteo Bortoletto | Constantin Ruhdorfer | Adnen Abdessaied | Lei Shi | Andreas Bulling
Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Recent work on dialogue-based collaborative plan acquisition (CPA) has suggested that Theory of Mind (ToM) modelling can improve missing knowledge prediction in settings with asymmetric skill-sets and knowledge. Although ToM was claimed to be important for effective collaboration, its real impact on this novel task remains under-explored. By representing plans as graphs and by exploiting task-specific constraints we show that, as performance on CPA nearly doubles when predicting one’s own missing knowledge, the improvements due to ToM modelling diminish. This phenomenon persists even when evaluating existing baseline methods. To better understand the relevance of ToM for CPA, we report a principled performance comparison of models with and without ToM features. Results across different models and ablations consistently suggest that learned ToM features are indeed more likely to reflect latent patterns in the data with no perceivable link to ToM. This finding calls for a deeper understanding of the role of ToM in CPA and beyond, as well as new methods for modelling and evaluating mental states in computational collaborative agents.
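The plan-as-graph formulation can be made concrete with a small sketch. The snippet below is purely illustrative and not the paper's implementation: it treats a plan as a directed graph whose edges encode task dependencies, and frames missing-knowledge prediction as identifying which edges of the complete plan are absent from an agent's partial view. The `full_plan` and `own_view` structures and the crafting-style task are hypothetical.

```python
# Illustrative sketch (not the paper's code): a collaborative plan as a directed
# graph, with "missing knowledge" defined as the plan edges an agent cannot see
# in its own partial view of the task.
import networkx as nx

def plan_graph(edges):
    """Build a directed plan graph from (prerequisite, step) pairs."""
    g = nx.DiGraph()
    g.add_edges_from(edges)
    return g

# Hypothetical task: the complete plan and one agent's partial view of it.
full_plan = plan_graph([("mine_ore", "smelt_ingot"),
                        ("smelt_ingot", "forge_tool"),
                        ("chop_wood", "forge_tool")])
own_view = plan_graph([("mine_ore", "smelt_ingot")])

def missing_knowledge(full, partial):
    """Edges of the full plan absent from an agent's partial plan."""
    return set(full.edges()) - set(partial.edges())

print(missing_knowledge(full_plan, own_view))
# -> {('smelt_ingot', 'forge_tool'), ('chop_wood', 'forge_tool')} (order may vary)
```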
LEGENT: Open Platform for Embodied Agents
Zhili Cheng | Zhitong Wang | Jinyi Hu | Shengding Hu | An Liu | Yuge Tu | Pengkai Li | Lei Shi | Zhiyuan Liu | Maosong Sun
Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 3: System Demonstrations)
Despite advancements in Large Language Models (LLMs) and Large Multimodal Models (LMMs), their integration into language-grounded, human-like embodied agents remains incomplete, hindering complex real-life task performance in 3D environments. Existing integrations often feature limited open-sourcing, challenging collective progress in this field. We introduce LEGENT, an open, scalable platform for developing embodied agents using LLMs and LMMs. LEGENT offers a dual approach: a rich 3D environment with interactive, communicable, and actionable agents, paired with a user-friendly interface, and a sophisticated data generation pipeline utilizing advanced algorithms to exploit supervision from simulated worlds at scale. In our experiments, an embryonic vision-language-action model trained on LEGENT-generated data surpasses GPT-4V in embodied tasks, showcasing promising generalization capabilities. The demo video is available at the following link https://video.legent.ai.
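As a purely hypothetical illustration of the kind of supervision such a pipeline can emit (the field names below are assumptions, not LEGENT's actual data schema), a single trajectory step might pair an egocentric observation with the language instruction and the grounded action taken in the simulated world:

```python
# Hypothetical sketch of one supervision record generated from a simulated world;
# field names and the example values are assumptions, not LEGENT's schema.
from dataclasses import dataclass
from typing import List

@dataclass
class TrajectoryStep:
    instruction: str            # natural-language task, e.g. "bring the apple to the table"
    rgb_frame: List[List[int]]  # egocentric observation (placeholder for an image tensor)
    agent_utterance: str        # what the agent says at this step, if anything
    action: str                 # grounded action executed in the environment

step = TrajectoryStep(
    instruction="bring the apple to the table",
    rgb_frame=[[0] * 4 for _ in range(4)],  # tiny dummy frame
    agent_utterance="",
    action="move_forward",
)
print(step.action)
```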
2016
Latent Topic Embedding
Di Jiang | Lei Shi | Rongzhong Lian | Hua Wu
Proceedings of COLING 2016, the 26th International Conference on Computational Linguistics: Technical Papers
Topic modeling and word embedding are two important techniques for deriving latent semantics from data. General-purpose topic models typically work at coarse granularity by capturing word co-occurrence at the document/sentence level. In contrast, word embedding models usually work at much finer granularity by modeling word co-occurrence within small sliding windows. With the aim of deriving latent semantics by considering word co-occurrence at different levels of granularity, we propose a novel model named Latent Topic Embedding (LTE), which seamlessly integrates topic generation and embedding learning in one unified framework. We further propose an efficient Monte Carlo EM algorithm to estimate the parameters of interest. By retaining the individual advantages of topic modeling and word embedding, LTE produces better latent topics and word embeddings. Extensive experiments verify the superiority of LTE over the state of the art.
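A minimal sketch of a Monte Carlo EM loop in the spirit of LTE is given below; it is not the paper's exact model. The E-step resamples per-token topic assignments from a distribution that mixes LDA-style counts with an embedding compatibility term, and the M-step nudges word embeddings toward their currently assigned topic embeddings. The toy corpus and all hyperparameters are assumptions.

```python
# Illustrative Monte Carlo EM sketch combining document-level topic counts with
# an embedding compatibility term; hyperparameters and data are assumptions.
import numpy as np

rng = np.random.default_rng(0)
V, K, D_EMB = 50, 4, 16           # vocabulary size, number of topics, embedding dim
alpha, beta, lr = 0.5, 0.1, 0.05  # Dirichlet priors and embedding step size

docs = [rng.integers(0, V, size=30).tolist() for _ in range(20)]  # toy corpus
word_emb = rng.normal(scale=0.1, size=(V, D_EMB))
topic_emb = rng.normal(scale=0.1, size=(K, D_EMB))

z = [[rng.integers(0, K) for _ in doc] for doc in docs]           # topic assignments
doc_topic = np.zeros((len(docs), K))
topic_word = np.zeros((K, V))
for d, doc in enumerate(docs):
    for i, w in enumerate(doc):
        doc_topic[d, z[d][i]] += 1
        topic_word[z[d][i], w] += 1

for em_iter in range(20):
    # E-step: collapsed-Gibbs-style resampling with an embedding term.
    for d, doc in enumerate(docs):
        for i, w in enumerate(doc):
            k_old = z[d][i]
            doc_topic[d, k_old] -= 1
            topic_word[k_old, w] -= 1
            count_term = (doc_topic[d] + alpha) * (topic_word[:, w] + beta) \
                         / (topic_word.sum(axis=1) + V * beta)
            emb_term = np.exp(topic_emb @ word_emb[w])
            p = count_term * emb_term
            k_new = rng.choice(K, p=p / p.sum())
            z[d][i] = k_new
            doc_topic[d, k_new] += 1
            topic_word[k_new, w] += 1
    # M-step: move word embeddings toward their assigned topic embeddings.
    for d, doc in enumerate(docs):
        for i, w in enumerate(doc):
            word_emb[w] += lr * (topic_emb[z[d][i]] - word_emb[w])
```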
2014
Unsupervised Template Mining for Semantic Category Understanding
Lei Shi | Shuming Shi | Chin-Yew Lin | Yi-Dong Shen | Yong Rui
Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP)
2010
Cross Language Text Classification by Model Translation and Semi-Supervised Learning
Lei Shi | Rada Mihalcea | Mingjun Tian
Proceedings of the 2010 Conference on Empirical Methods in Natural Language Processing
2008
Improved Sentence Alignment on Parallel Web Pages Using a Stochastic Tree Alignment Model
Lei Shi | Ming Zhou
Proceedings of the 2008 Conference on Empirical Methods in Natural Language Processing
2006
A DOM Tree Alignment Model for Mining Parallel Data from the Web
Lei Shi | Cheng Niu | Ming Zhou | Jianfeng Gao
Proceedings of the 21st International Conference on Computational Linguistics and 44th Annual Meeting of the Association for Computational Linguistics
2004
An Algorithm for Open Text Semantic Parsing
Lei Shi | Rada Mihalcea
Proceedings of the 3rd workshop on RObust Methods in Analysis of Natural Language Data (ROMAND 2004)
Open Text Semantic Parsing Using FrameNet and WordNet
Lei Shi | Rada Mihalcea
Demonstration Papers at HLT-NAACL 2004