Computer Vision Techniques for Analysis of Human Gesture
I will give an overview of our research effort to develop computer vision systems for automatic recognition of gestural communication. Examples will be given in two domains: American Sign Language (a very rich and challenging domain) and simpler hand signals, like those employed by referees at sporting events, flight directors on airport runways, etc. I will describe a variety of approaches for localizing and tracking human hands, estimating hand pose and upper-body pose, as well as methods for efficiently spotting, comparing, and recognizing specific upper-body and hand gestures of interest in video streams. This is collaborative research with Profs. Margrit Betke and George Kollios (BU Computer Science), Prof. Vladimir Pavlovic (Rutgers), Prof. Carol Neidle (ASL linguist, BU Modern Foreign Languages and Literatures), and BU graduate students Joni Alon, Vassilis Athitsos, Rui Li, Ashwin Thangali, Tai-peng Tian, and Quan Yuan.
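The abstract does not specify which algorithms are used for comparing gestures; as an illustrative sketch only, dynamic time warping (DTW) is a common way to compare two hand trajectories of different lengths, and the minimal implementation below (function name and sequence format are my own choices, not from the talk) shows the idea:

```python
import math

def dtw_distance(seq_a, seq_b):
    """Dynamic-time-warping distance between two 2-D point sequences,
    e.g. per-frame hand centroids extracted from two video clips."""
    n, m = len(seq_a), len(seq_b)
    INF = float("inf")
    # cost[i][j] = best alignment cost of seq_a[:i] against seq_b[:j]
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = math.dist(seq_a[i - 1], seq_b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # stretch seq_b
                                 cost[i][j - 1],      # stretch seq_a
                                 cost[i - 1][j - 1])  # advance both
    return cost[n][m]

# Identical trajectories align perfectly (distance 0); a slower version
# of the same gesture still aligns at low cost.
template = [(0, 0), (1, 0), (2, 0)]
slower   = [(0, 0), (0, 0), (1, 0), (2, 0), (2, 0)]
print(dtw_distance(template, template))  # 0.0
print(dtw_distance(template, slower))    # 0.0
```

Gesture spotting in a continuous stream can be built on the same recurrence by allowing a match to start at any frame, but that extension is beyond this sketch.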