Robert J.K. Jacob
Dept. of Computer Science
177 College Avenue
Medford, MA 02155 U.S.A.
Office: Cummings 357
The HCI lab is in the new Joyce Cummings Center right on campus at 177 College Avenue. The lab is on the third floor, room 362.
Robert Jacob is a Professor of Computer Science at Tufts University, where his research interests are new interaction modes and techniques and user interface software; his current work focuses on implicit brain-computer interfaces. He has been a visiting professor at the University College London Interaction Centre, Universite Paris-Sud, and the MIT Media Laboratory. Before coming to Tufts, he was in the Human-Computer Interaction Lab at the Naval Research Laboratory. He received his Ph.D. from Johns Hopkins University. He is a member of the editorial board of the journal Human-Computer Interaction and a founding member of the editorial board of ACM Transactions on Computer-Human Interaction. He has served as Vice-President of ACM SIGCHI, Papers Co-Chair of the CHI and UIST conferences, and General Co-Chair of UIST and TEI. He was elected as a member of the ACM CHI Academy in 2007 and as an ACM Fellow in 2016.
The current focus in my research group is on a new generation of "implicit" brain-computer interfaces. They allow a computer to obtain and act on auxiliary inputs from its user without requiring explicit user action or attention. Brain-computer interaction has made dramatic progress, but its main application to date has been for physically disabled users. Our work in real-time measurement and machine learning classification of functional near infrared spectroscopy (fNIRS) brain data allows us to create, use, and study new kinds of implicit user interfaces based on brain measurement.
We use brain activity as an effortless, direct source of additional information about the user and their context, and then use that information to adapt the user interface in real time. We are creating and studying these new user interfaces, with emphasis on domains where we can measure their efficacy.
We are now also broadening this work to include other forms of lightweight, passive, real-time adaptive implicit interaction, based on physiological or other measurements. Our focus continues to be on the design of subtle and effective implicit interfaces that make judicious use of the measurements we can obtain.
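The loop described above can be sketched in miniature. This is a hypothetical illustration, not the lab's actual system: the signal parameters, the mean-and-slope features, and the nearest-centroid workload classifier are all simplifying assumptions standing in for real fNIRS preprocessing and machine-learning classification.

```python
# Hypothetical sketch of an implicit, fNIRS-style adaptive loop:
# measure a window of brain data, classify the user's mental workload,
# and adapt the interface without any explicit user action.
# All names and signal parameters here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def extract_features(window):
    """Mean level and linear slope of a signal window (toy features)."""
    mean = window.mean()
    slope = np.polyfit(np.arange(len(window)), window, 1)[0]
    return np.array([mean, slope])

# Synthetic training windows: "low" vs "high" mental-workload conditions.
low_windows = [rng.normal(0.0, 0.1, 50) for _ in range(20)]
high_windows = [rng.normal(0.5, 0.1, 50) + 0.01 * np.arange(50)
                for _ in range(20)]

# A minimal classifier: the centroid of each class in feature space.
centroids = {
    "low": np.mean([extract_features(w) for w in low_windows], axis=0),
    "high": np.mean([extract_features(w) for w in high_windows], axis=0),
}

def classify(window):
    """Nearest-centroid classification of a new signal window."""
    f = extract_features(window)
    return min(centroids, key=lambda c: np.linalg.norm(f - centroids[c]))

def adapt_interface(workload):
    """Implicit adaptation: show a simpler view when workload is high."""
    return "simplified view" if workload == "high" else "detailed view"

# A new incoming window, drawn from the high-workload distribution.
new_window = rng.normal(0.5, 0.1, 50) + 0.01 * np.arange(50)
print(adapt_interface(classify(new_window)))  # prints "simplified view"
```

The point of the sketch is the structure of the interaction, not the classifier: the user never issues a command, yet the interface changes in response to a passively measured signal.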
Reality-based Interaction: Understanding the Next Generation of User Interfaces
Tangible Programming for Children
TUIMS: Tangible User Interface Management System
(Some photos courtesy of Dan Afergan and Hyejin Im)