Rob Jacob



Robert J.K. Jacob
Dept. of Computer Science
Tufts University
Halligan Hall
161 College Avenue
Medford, MA 02155 U.S.A.

Email: Email address
WWW: http://www.cs.tufts.edu/~jacob/

Phone: 617-627-2225
Fax: 617-627-2227

(On sabbatical Fall semester 2014)



Background

Robert Jacob is a Professor of Computer Science at Tufts University, where his research interests are new interaction modes and techniques and user interface software; his current work focuses on implicit brain-computer interfaces. He is currently a visiting professor at the University College London Interaction Centre and has also been a visiting professor at the Université Paris-Sud and the MIT Media Laboratory. Before coming to Tufts, he was in the Human-Computer Interaction Lab at the Naval Research Laboratory. He received his Ph.D. from Johns Hopkins University. He is a member of the editorial boards of Human-Computer Interaction and the International Journal of Human-Computer Studies and a founding member of the editorial board of ACM Transactions on Computer-Human Interaction. He is Vice-President of ACM SIGCHI, and he has served as Papers Co-Chair of the CHI and UIST conferences and as Co-Chair of UIST and TEI. He was elected in 2007 to the ACM CHI Academy, an honorary group of the principal leaders of the field of HCI whose efforts have shaped the discipline and industry and have led research and innovation in human-computer interaction.


Current Research

The current focus of my research group is a new generation of brain-computer interfaces. Brain-computer interaction has made dramatic progress in recent years, but its main application to date has been for physically disabled users. Our work on real-time measurement and machine-learning classification of functional near-infrared spectroscopy (fNIRS) brain data lets us develop, use, and evaluate brain measurement as input to adaptable user interfaces for the larger population.

We are using brain input as a way to obtain more information about the user and their context in an effortless and direct way from their brain activity. We then use it to adapt the user interface in real time. We are creating and studying these new user interfaces, with emphasis on domains where we can measure their efficacy.

We are now also broadening this work to include other forms of lightweight, passive, real-time adaptive user interfaces, based on physiological or other measurements. Our focus continues to be on the design of subtle and effective interfaces that make judicious use of the measurements we can obtain.
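As a rough illustration of this style of passive, adaptive interface, the sketch below shows a minimal sense-classify-adapt loop in Python. It is not our actual system: the functions read_fnirs_features, classify_workload, and set_interface_detail are hypothetical placeholders standing in for a real fNIRS pipeline, a trained classifier, and a real user interface.

    import random
    import time

    def read_fnirs_features():
        # Placeholder: return one window of preprocessed fNIRS features.
        return [random.random() for _ in range(8)]

    def classify_workload(features):
        # Placeholder classifier: label the window as high or low mental workload.
        # A real system would use a model trained on labeled fNIRS data.
        return "high" if sum(features) / len(features) > 0.5 else "low"

    def set_interface_detail(level):
        # Placeholder UI hook: adjust how much information the interface shows.
        print("Adapting interface: showing", level, "detail")

    def adaptation_loop(seconds=10):
        # Passive loop: measure, classify, and adapt the interface in real time.
        end_time = time.time() + seconds
        while time.time() < end_time:
            workload = classify_workload(read_fnirs_features())
            # Reduce detail when estimated workload is high; otherwise show more.
            set_interface_detail("reduced" if workload == "high" else "full")
            time.sleep(1)  # fNIRS signals change slowly, so adapt at a modest pace

    if __name__ == "__main__":
        adaptation_loop()

In practice the adaptation would be subtle and gradual rather than an abrupt switch, in keeping with the design goals described above.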


[general article]    [overview chapter]    [more papers]    [project page]


Papers, Talks, etc.


Other Recent Research Projects


Reality-based Interaction: Understanding the Next Generation of User Interfaces
[project]   [CHI paper]   [CHI workshop]


Tangible Programming for Children
[Michael Horn]   [project]   [Robot Park] at Boston Museum of Science


TUIMS: Tangible User Interface Management System
[Orit Shaer]   [paper]



Courses: Spring 2015

Courses: Fall 2014

Courses: Fall 2013


Ph.D. Alumni


Links

[Photos: HCI group; using a head-mounted display and eye tracker; playing the Kotzschmar organ in Portland, ME]

(Group photo courtesy of Dan Afergan)