Conversational Agents Embedded in the World: Integrating Visible Humans and Invisible Computers

Mobile networks and embedded processors increasingly allow computation to suffuse the spaces in which we work and play. These smart environments and intelligent rooms will put at our disposal a vastly expanded inventory of information, without requiring us to learn special command languages to access data. The designers of such "invisible computers" describe them as ways for people to interact with computation "as they interact with another person".

In this talk, however, I will agree with Harry Potter that one should "never trust anything that can think for itself, if you can't see where it keeps its brain". I'll argue that humans need to locate intelligence, and that this need poses problems for the invisible computer. Bodies are, of course, the best possible example of located intelligence. I will demonstrate the use of embodiment with a series of interactive systems I have implemented, including embodied conversational agents, literacy systems for children, and some new work on "shared reality" -- a paradigm in which human and computer share a real physical space within which to make hand gestures, facial displays, and body movements, and share real physical objects that can be passed back and forth between the real and virtual worlds.

At a more fundamental level, I will claim that neither embodied systems nor invisible computers will ever succeed unless we understand the "affordances" of the body -- that is, how the body works in face-to-face dialogue and in situating intelligence -- and understand the needs of human users.