A Companion Robot for Modeling the Expressive Behavior of Persons with Parkinson's Disease

March 30, 2020
2:00-4:00
Halligan 102
Speaker: Andy Valenti
Host: Matthias Scheutz

Abstract:

Emotions are crucial for human social interactions and, as such, people communicate emotions through a variety of modalities: kinesthetic (through facial expressions, body posture, and gestures), auditory (through the acoustic features of speech), and semantic (through the content of what they say). Sometimes, however, communication channels for certain modalities are unavailable (for example, when texting), and sometimes they are compromised, for example, by a disorder such as Parkinson's disease (PD) that may affect facial, gestural, and speech expressions of emotion. As a result, it is difficult for caregivers to judge how persons with PD are coping with their condition. They may appear unfeeling, indifferent, sad, or hostile, and misinterpretation of their true internal state can lead to depression.

At this defense, we present a situated emotion expression framework that a robot can use to detect emotions in one modality, specifically speech, and then express them in another modality, through gestures or facial expressions. This is part of a larger objective to develop a socially assistive robot for the social self-management of people with PD. More generally, the framework can be used with any conversational AI agent.

Zoom info: https://tufts.zoom.us/j/960202944

Meeting ID: 960 202 944

Password: defense330