Emotions in Engineering: Uncovering Structure in Emotional Speech

March 28, 2013
2:50 pm - 4:00 pm
Halligan 111

Abstract

Emotion has intrigued researchers for generations. This fascination has permeated the engineering community, motivating the development of affective computational models for classification. However, human emotion remains notoriously difficult to interpret, both because of the mismatch between the emotional cue generation (the speaker) and cue perception (the observer) processes and because of the presence of complex emotions, which contain shades of multiple affective classes. Proper representations of emotion would ameliorate this problem by introducing multidimensional characterizations of the data that permit the quantification and description of the varied affective components of each utterance.

My work seeks to provide a computational account of how humans perceive emotional utterances. I leverage perception estimation studies to develop systems capable of interpreting naturalistic expressions of emotion and to create novel quantification measures. This area has applications in the design of affective avatars, in the development of novel machine learning algorithms, and in furthering our scientific understanding of human emotion expression.

In this talk I will discuss Emotion Profiles and Emotograms, quantitative measures expressing the degree of presence or absence of a set of basic emotions within an expression. They avoid the need for a hard-labeled assignment by instead providing a method for describing the shades of emotion present in an utterance. These profiles can be used to determine a most likely assignment for an utterance, to map out the evolution of the emotional tenor of an interaction, or to interpret utterances that have multiple affective components. The Emotion-Profile technique is able to accurately identify the emotion of utterances with definable ground truths (emotions with an evaluator consensus) and to interpret the affective content of utterances with ambiguous emotional content (no evaluator consensus), utterances that are typically discarded during classification tasks. I will present results detailing the construction, application, and benefit of this representation paradigm.
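To make the idea of a profile-style representation concrete, here is a minimal sketch, not the speaker's actual implementation: an emotion profile is treated as a vector of confidence scores over an assumed set of basic emotions, from which one can either read off a most likely hard label or flag an utterance as ambiguous. The label set, margin threshold, and method names below are illustrative assumptions only.

```python
# Hypothetical sketch of an emotion-profile style representation.
# An "emotion profile" here is a vector of per-emotion confidence scores.

from dataclasses import dataclass

BASIC_EMOTIONS = ("angry", "happy", "neutral", "sad")  # assumed label set


@dataclass
class EmotionProfile:
    scores: dict  # emotion -> degree of presence (e.g., classifier confidence)

    def most_likely(self) -> str:
        """Hard-label assignment: the emotion with the strongest evidence."""
        return max(self.scores, key=self.scores.get)

    def is_ambiguous(self, margin: float = 0.1) -> bool:
        """Flag utterances whose top two emotions are within `margin` of each other."""
        top_two = sorted(self.scores.values(), reverse=True)[:2]
        return (top_two[0] - top_two[1]) < margin


# Example: an utterance carrying shades of both anger and sadness.
profile = EmotionProfile({"angry": 0.55, "happy": 0.05, "neutral": 0.15, "sad": 0.50})
print(profile.most_likely())   # "angry"
print(profile.is_ambiguous())  # True: anger and sadness scores are close
```

Under this reading, utterances with an evaluator consensus would yield a clearly dominant score, while ambiguous utterances retain a usable multidimensional description rather than being discarded.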

Bio: Emily Mower Provost received her B.S. in Electrical Engineering (summa cum laude and with thesis honors) from Tufts University, Medford, MA in 2004 and her M.S. and Ph.D. in Electrical Engineering from the University of Southern California (USC), Los Angeles, CA in 2007 and 2010, respectively.

Emily is a member of Tau Beta Pi, Eta Kappa Nu, IEEE, and ISCA. She has been awarded the National Science Foundation Graduate Research Fellowship (2004-2007), the Herbert Kunzel Engineering Fellowship from USC (2007-2008, 2010-2011), the Intel Research Fellowship (2008-2010), and the Achievement Rewards For College Scientists (ARCS) Award (2009-2010). Her research interests are in human-centered speech and video processing and multimodal interface design. The goals of her research are motivated by the complexities of human emotion generation and perception.