Semantic Spatial Representation: a unique representation of an environment based on an ontology for robotic applications

Guillaume Sarthou, Aurélie Clodic, Rachid Alami


Abstract
For human-robot interaction, it is important to endow the robot with the knowledge necessary to understand human needs and to respond to them. We present a formalized and unified representation of indoor environments based on an ontology, devised for a route description task in which a robot must provide explanations to a person. We show that this representation can be used both to choose a route to explain to a human and to verbalize that route from a route perspective. Because it is grounded in an ontology, the representation can readily evolve to support many other applications. It captures the semantics of environment elements while preserving a description of the environment's known connectivity. The representation and the illustrating algorithms, which find and verbalize a route, have been tested in two environments of different scales.
Anthology ID:
W19-1606
Volume:
Proceedings of the Combined Workshop on Spatial Language Understanding (SpLU) and Grounded Communication for Robotics (RoboNLP)
Month:
June
Year:
2019
Address:
Minneapolis, Minnesota
Editors:
Archna Bhatia, Yonatan Bisk, Parisa Kordjamshidi, Jesse Thomason
Venue:
RoboNLP
Publisher:
Association for Computational Linguistics
Pages:
50–60
URL:
https://aclanthology.org/W19-1606
DOI:
10.18653/v1/W19-1606
Cite (ACL):
Guillaume Sarthou, Aurélie Clodic, and Rachid Alami. 2019. Semantic Spatial Representation: a unique representation of an environment based on an ontology for robotic applications. In Proceedings of the Combined Workshop on Spatial Language Understanding (SpLU) and Grounded Communication for Robotics (RoboNLP), pages 50–60, Minneapolis, Minnesota. Association for Computational Linguistics.
Cite (Informal):
Semantic Spatial Representation: a unique representation of an environment based on an ontology for robotic applications (Sarthou et al., RoboNLP 2019)
PDF:
https://preview.aclanthology.org/dois-2013-emnlp/W19-1606.pdf