2018
SMILEE: Symmetric Multi-modal Interactions with Language-gesture Enabled (AI) Embodiment
Sujeong Kim | David Salter | Luke DeLuccia | Kilho Son | Mohamed R. Amer | Amir Tamrakar
Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Demonstrations
We demonstrate an intelligent conversational agent system designed to advance human-machine collaborative tasks. The agent interprets a user's communicative intent from both verbal utterances and non-verbal behaviors, such as gestures. The agent can itself communicate with both natural language and gestures through its embodiment as an avatar, thus facilitating natural, symmetric multi-modal interactions. We demonstrate two intelligent agents with specialized skills in the Blocks World as use cases of our system.