Personalized Face and Gesture Analysis for Innovations in Education and Rehabilitation

September 19, 2019
3:00 PM
Halligan 102
Speaker: Margrit Betke, Boston University
Host: Fahad Dogar


AI opens a new world of possibilities for computer systems to look at people and interpret their motions. In this talk, I discuss my team's research in AI, in particular computer vision and human-computer interfaces, that facilitates advances in education and rehabilitation. One of our projects is to predict the learning outcomes of students from a video stream capturing the students' faces as they work with a virtual tutoring system on a set of math problems. By analyzing subtle differences in facial behavior with neural networks, we can predict the success or failure of a student's attempt to answer a question when the student has only just begun to work on the problem. Equipping a tutoring system with the ability to interpret affective signals from students has the potential to improve the learning experience by providing timely interventions, including appropriate affective reactions via the virtual tutor. I will also discuss related projects -- how facial expressivity can be analyzed with context-sensitive classifiers, how gestures can be recognized early, when they are only partially observed by the computer vision system, and how depth cameras can be used to support physical exercising at home.


Margrit Betke is a Professor of Computer Science and Data Science Fellow at Boston University, where she co-leads the Artificial Intelligence Research Initiative and the Image and Video Computing Research Group. She conducts research in computer vision, human-computer interfaces, human computation, medical image analysis, and applications of machine learning, and has published over 150 original research papers. She earned her Ph.D. degree in EECS at MIT in 1995 and received the NSF Faculty Early Career Development Award in 2001 for developing "Video-based Interfaces for People with Severe Disabilities." She co-invented the "Camera Mouse," an assistive technology used worldwide by children and adults with severe motion impairments. She is an Associate Editor of the journals IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI) and Computer Vision and Image Understanding (CVIU). After completing a large NSF research project on developing intelligent tracking systems that reason about the group behavior of people, animals, and cells, she now leads a large NSF project on designing analytic methods for studying visual and textual public information, including news and social media.