Cory Hayes

Also published as: Cory J. Hayes


2019

A Research Platform for Multi-Robot Dialogue with Humans
Matthew Marge | Stephen Nogar | Cory J. Hayes | Stephanie M. Lukin | Jesse Bloecker | Eric Holder | Clare Voss
Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics (Demonstrations)

This paper presents a research platform that supports spoken dialogue interaction with multiple robots. The demonstration showcases our MultiBot testing scenario, in which users can verbally issue search, navigation, and follow instructions to two robotic teammates: a simulated ground robot and an aerial robot. This flexible language and robotics platform takes advantage of existing tools for speech recognition and dialogue management that are compatible with new domains, and implements an inter-agent communication protocol (tactical behavior specification) in which verbal instructions are encoded as tasks and assigned to the appropriate robot.
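To make the task-routing idea concrete, here is a minimal Python sketch of how a verbal instruction could be encoded as a task message and dispatched to one of the two robots. The message fields (robot_id, behavior, parameters) and the keyword-based dispatch rules are illustrative assumptions; the abstract does not detail the actual tactical behavior specification format.

```python
# Illustrative sketch only: the field names and dispatch heuristics are
# assumptions, not the paper's actual tactical behavior specification.
from dataclasses import dataclass, field

@dataclass
class TaskMessage:
    robot_id: str                  # e.g. "ground_robot" or "aerial_robot"
    behavior: str                  # e.g. "search", "navigate", "follow"
    parameters: dict = field(default_factory=dict)

def dispatch(instruction: str) -> TaskMessage:
    """Toy dispatcher: encode a verbal instruction as a task for one robot."""
    text = instruction.lower()
    # Route to the aerial robot only when the wording suggests flight.
    robot = "aerial_robot" if ("fly" in text or "aerial" in text) else "ground_robot"
    if "search" in text:
        behavior = "search"
    elif "follow" in text:
        behavior = "follow"
    else:
        behavior = "navigate"
    return TaskMessage(robot_id=robot, behavior=behavior,
                       parameters={"utterance": instruction})

print(dispatch("search the building"))   # -> ground_robot, search
print(dispatch("fly to the tower"))      # -> aerial_robot, navigate
```

In a real system the keyword matching would be replaced by the output of the speech recognition and dialogue management components; the sketch only shows the encode-and-route step.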

2018

ScoutBot: A Dialogue System for Collaborative Navigation
Stephanie M. Lukin | Felix Gervits | Cory J. Hayes | Pooja Moolchandani | Anton Leuski | John G. Rogers III | Carlos Sanchez Amaro | Matthew Marge | Clare R. Voss | David Traum
Proceedings of ACL 2018, System Demonstrations

ScoutBot is a dialogue interface to physical and simulated robots that supports collaborative exploration of environments. The demonstration will allow users to issue unconstrained spoken language commands to ScoutBot, which will prompt for clarification if the user’s instruction needs additional input. ScoutBot is trained on human-robot dialogue collected from Wizard-of-Oz experiments, in which robot responses were initiated by a human wizard. The demonstration will show a simulated ground robot (Clearpath Jackal) in a simulated environment supported by ROS (Robot Operating System).
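For readers unfamiliar with the ROS side of such a demo, here is a minimal Python sketch of the kind of velocity command a dialogue back end could send to a simulated Jackal. This is not ScoutBot’s actual control code; the /cmd_vel topic name follows common Jackal convention but is an assumption, as are the motion parameters.

```python
#!/usr/bin/env python
# Minimal sketch: drive a simulated Clearpath Jackal forward via ROS.
# Assumptions: the robot listens on /cmd_vel (common Jackal setup); the
# speed and distance values are arbitrary examples.
import rospy
from geometry_msgs.msg import Twist

def move_forward(distance_m=0.9, speed=0.3):
    """Publish velocity commands for roughly distance_m meters, then stop."""
    rospy.init_node('scout_demo', anonymous=True)
    pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
    rate = rospy.Rate(10)          # 10 Hz command stream
    cmd = Twist()
    cmd.linear.x = speed           # forward velocity in m/s
    ticks = int((distance_m / speed) * 10)
    for _ in range(ticks):
        if rospy.is_shutdown():
            break
        pub.publish(cmd)
        rate.sleep()
    pub.publish(Twist())           # zero velocity to stop the robot

if __name__ == '__main__':
    move_forward()                 # e.g. "move forward 3 feet" is about 0.9 m
```

A dialogue system like ScoutBot would sit above this layer, mapping a clarified spoken instruction to parameters such as distance_m before issuing the motion.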

Dialogue Structure Annotation for Multi-Floor Interaction
David Traum | Cassidy Henry | Stephanie Lukin | Ron Artstein | Felix Gervits | Kimberly Pollard | Claire Bonial | Su Lei | Clare Voss | Matthew Marge | Cory Hayes | Susan Hill
Proceedings of the Eleventh International Conference on Language Resources and Evaluation (LREC 2018)

2017

Exploring Variation of Natural Human Commands to a Robot in a Collaborative Navigation Task
Matthew Marge | Claire Bonial | Ashley Foots | Cory Hayes | Cassidy Henry | Kimberly Pollard | Ron Artstein | Clare Voss | David Traum
Proceedings of the First Workshop on Language Grounding for Robotics

Robot-directed communication is variable and may change based on human perception of robot capabilities. To collect training data for a dialogue system and to investigate possible communication changes over time, we developed a Wizard-of-Oz study that (a) simulates a robot’s limited understanding, and (b) collects dialogues in which human participants build a progressively better mental model of the robot’s understanding. With ten participants, we collected ten hours of human-robot dialogue. We analyzed the structure of instructions that participants gave to a remote robot before it responded. Our findings show a general initial preference for including metric information (e.g., move forward 3 feet) over landmarks (e.g., move to the desk) in motion commands, but this preference decreased over time, suggesting changes in participants’ perception of the robot.