Situated natural language interaction in uncertain and open worlds

February 16, 2017
Halligan 209
Speaker: Thomas Williams, Tufts University
Host: Matthias Scheutz


Natural language understanding and generation capabilities are crucial for natural, human-like human-robot interaction. This is especially true in domains such as eldercare, education, space, and search-and-rescue robotics, in which alternative interfaces or interaction techniques may be difficult to use because of users' cognitive or physical limitations. Approximately 40% of wheelchair users, for example, find it difficult or impossible to use a standard joystick, making natural language an attractive modality for interaction and control.

My research investigates how intelligent robots can communicate through natural language in realistic human-robot interaction scenarios, in which knowledge is uncertain, incomplete, and decentralized. To do so, I draw on techniques and concepts from artificial intelligence, psychology, linguistics, and philosophy, and engage in both algorithm development and empirical experimentation.

In this dissertation defense, I will present a set of cognitively inspired algorithms I have developed that allow robots to better identify the entities (e.g., objects, people, and locations) referenced in natural language by their human conversational partners, and to better infer those partners' intentions, in uncertain and open worlds. I will then discuss how these algorithms have been implemented on a robotic wheelchair to significantly extend the state of the art in natural-language-enabled robotic wheelchairs.