Miltiadis Marios Katsakioris
2021
Learning to Read Maps: Understanding Natural Language Instructions from Unseen Maps
Miltiadis Marios Katsakioris | Ioannis Konstas | Pierre Yves Mignotte | Helen Hastie
Proceedings of Second International Combined Workshop on Spatial Language Understanding and Grounded Communication for Robotics
Robust situated dialog requires the ability to process instructions based on spatial information, which may or may not be available. We propose a model, based on LXMERT, that can extract spatial information from text instructions and attend to landmarks on OpenStreetMap (OSM) referred to in a natural language instruction. Whilst OSM is a valuable resource, as with any open-sourced data there is noise and variation in the names referred to on the map, as well as variation in the natural language instructions, hence the need for data-driven methods over rule-based systems. This paper demonstrates that the gold GPS location can be accurately predicted from the natural language instruction and metadata, with 72% accuracy for previously seen maps and 64% for unseen maps.
2019
Corpus of Multimodal Interaction for Collaborative Planning
Miltiadis Marios Katsakioris | Helen Hastie | Ioannis Konstas | Atanas Laskov
Proceedings of the Combined Workshop on Spatial Language Understanding (SpLU) and Grounded Communication for Robotics (RoboNLP)
As autonomous systems become more commonplace, we need a way to easily and naturally communicate to them our goals and collaboratively come up with a plan on how to achieve these goals. To this end, we conducted a Wizard of Oz study to gather data and investigate the way operators would collaboratively make plans via a conversational ‘planning assistant’ for remote autonomous systems. We present here a corpus of 22 dialogs from expert operators, which can be used to train such a system. Data analysis shows that multimodality is key to successful interaction, measured both quantitatively and qualitatively via user feedback.