Workshop Proposal
What is the Next Generation of Human-Computer Interaction?

Robert J.K. Jacob

Tufts University
Medford, Mass., USA

Workshop Topic

Is there an emerging next generation of human-computer interaction, or simply “a thousand points of light”: disparate and unrelated innovative new developments? This workshop will bring together researchers in a range of emerging new areas of HCI to look for common ground and a common understanding of a next generation of user interfaces. If we consider command-line interfaces as the first generation, then direct manipulation and the graphical user interface defined a distinct second generation of user interfaces that remains the state of practice today. Unlike the early days of graphical user interfaces, HCI research today is advancing on many fronts, making the next generation more difficult to connect and define. Yet much current research appears to be moving away from the screen-based GUI, in a broadly related direction.

The key components of this next generation are found in a variety of loosely related current research areas in HCI (or, more broadly, interaction design or human-information design), including virtual and augmented reality; ubiquitous, pervasive, and handheld interaction; tangible interfaces; lightweight, tacit, passive, or non-command interaction; perceptual interfaces; affective computing; context-aware interfaces; ambient interfaces; embodied interfaces; sensing interfaces; and speech and multi-modal interfaces.

As described further in the Extended Abstract, we will use as a starting point for discussion the notion of natural or realistic or reality-based interfaces as a thread to connect new developments in HCI. This notion focuses on the ways in which the new interfaces increasingly draw their strength from exploiting the user's pre-existing skills and expectations from the real world. Current research at Tufts is pursuing this and will provide input to start the discussion.

Potential for Discussion and Expected Interest

To date, few researchers have addressed this issue explicitly, but several have discussed sub-areas and made contributions toward it. Those who have attempted to explain or organize these new styles of user interfaces have tended to concentrate more on individual classes or groups of new interfaces than on concepts that unify the classes. The time is ripe to start a discussion that connects such work. For example (see Extended Abstract for citations), Ullmer and Ishii provide a framework for tangible interfaces; Fishkin, Moran, and Harrison propose the concept of embodied interfaces; Bellotti, Back, Edwards, Grinter, Henderson, and Lopes define sensing interfaces and raise a set of key problems; Beaudouin-Lafon's Instrumental Interaction model sheds new light on post-WIMP interfaces; and Nielsen defined the notion of non-command interfaces. (Some of these authors are also likely workshop participants, and some have already been contacted about this research.)

The workshop will welcome researchers working in areas such as those listed above (virtual and augmented reality; ubiquitous, pervasive, and handheld interaction; tangible user interfaces; etc.).

Process and Plan—Before the Workshop

In addition to posting the Call for Participation, we will contact computer scientists and interaction design researchers working on next generation interfaces and psychologists, cognitive scientists, and observers of the HCI scene who are likely to be interested in this topic. We will solicit four-page position papers, which may describe ongoing work, recent results, or opinions and approaches to the problem. Papers should show potential contributions to the workshop goals, such as interaction designs or ideas toward new conceptual frameworks or theories. Papers will be peer-reviewed and 15-20 will be selected according to their relevance to the workshop and the likelihood that they will stimulate and contribute to the discussion.

We will set up a web site before the workshop and maintain it afterward, to help the participants and this emerging community keep in touch. All selected papers will be made available to the participants on this site, along with other workshop information and some background readings to start discussion. We are also pursuing this area at Tufts, under an NSF grant, and teaching a course in Fall 2005 on “Reality-based Interaction: Understanding the Next Generation of User Interfaces,” which will provide new work as input to the workshop.

Process and Plan—Workshop Schedule

We propose a one-day workshop, with six working hours, excluding the breaks. Our goal is to facilitate discussion more than lecture presentation.

However, experience suggests that participants often feel the need to give a talk, and they will find a way to do so regardless of the format and schedule! We will accommodate this tendency by including time for everyone to give a 5-minute talk at the beginning. The talks are planned to end with the first coffee break (if it can be moved as noted below), to provide a clear dividing line. As a backup, we have also scheduled a 60-minute buffer after the coffee break; it ends with a hard break (lunch), which should keep things on track.

After lunch we will move entirely to interactive discussion. As described in the Extended Abstract, this will begin with reality-based interaction, which we will then discuss, agree or disagree with, or make counterproposals to.

0:00 - 0:15
Introduction (times given in working hours, excluding breaks)

0:15 - 2:00
Brief talk from each participant

Break (if possible, the coffee break will be moved here rather than held at 1:30, to provide an incentive to finish the participant talks)

2:00 - 3:00
Plenary discussion of reality-based interaction (this can be compressed in the likely event that the participant talks run over)


3:00 - 3:30
Plenary kickoff for afternoon working session

3:30 - 4:30
Breakout groups or discussion:
Depending on the mix of participants, we will decide whether to break out into two or three groups or remain as a single group. The groups would be divided in one of three ways: by discipline (developers of new interfaces, cognitive psychologists); by their position with respect to reality-based interaction (supports, disagrees with, presents alternative to); or by creating parallel teams to develop 2-3 different alternate approaches to the framework.


4:30 - 5:30
Report back by each group, including the individual ideas discussed if a group pursued several different approaches

5:30 - 6:00
Plenary discussion of future work, research agenda, and plans for follow-on activities

Process and Plan—After the Workshop

We plan to prepare a poster for the CHI conference summarizing the ideas produced by the workshop, as well as a report for the SIGCHI Bulletin or another appropriate venue. We will continue to maintain the web site to serve both the participants and the broader community developing around this topic. If the position papers and workshop discussion reflect sufficient progress and cohesiveness, we will work toward producing a special issue or section of a journal, or possibly an edited book. However, it is more likely that such a publication would be an outcome of a second workshop or small conference on this topic, and that this workshop will start the discussion and form the community for it. In addition, our ongoing NSF project will serve as a nexus for continuing and collecting work on this topic well after the workshop.

Organizer's Background

Rob Jacob co-organized (with Mark Green) the frequently cited SIGGRAPH'90 Workshop on Software Architectures and Metaphors for Non-WIMP User Interfaces. He also has a range of conference organizing experience, including serving as Papers Chair of CHI and UIST, in various other conference committee roles, and as SIGCHI Vice-Chair for Conferences. His research focuses on new interaction techniques and media, including eye movement-based interaction, virtual reality, lightweight interfaces, and tangible interfaces. His current work investigates reality-based interaction as a framework for understanding the next generation of HCI.

Call for Participation

Is there an emerging next generation of human-computer interaction or simply "a thousand points of light" of disparate and unrelated new developments? This workshop will bring together researchers in a range of emerging new areas of HCI to look for common ground and a common understanding of a next generation of interfaces, after the command-line and GUI generations. Key research areas include: virtual and augmented reality; ubiquitous, pervasive, and handheld interaction; tangible interfaces; lightweight, tacit, passive, or non-command interaction; perceptual interfaces; affective computing; context-aware interfaces; ambient interfaces; embodied interfaces; sensing interfaces; and speech and multi-modal interfaces.

We seek to tie these areas together intellectually with unifying ideas, frameworks, and theories that provide common ground for discussing, analyzing, connecting, inventing, comparing, and making predictions about emerging new interaction styles and interface designs. We also seek to identify gaps and opportunities for a future research agenda from holes or “sweet spots” in a new taxonomy. To start the discussion concretely, we will use the notion of reality-based interfaces, which focuses on the ways interfaces draw strength from exploiting users' pre-existing skills and expectations from the real world.

We invite computer scientists and interaction design researchers working on next generation interfaces, as well as psychologists, cognitive scientists, and observers of HCI. Submit a 4-page position paper about your ongoing work, new interaction designs, opinions or approaches to the problem, or conceptual frameworks or theories. Papers will be peer-reviewed, and 15-20 will be selected by relevance and likelihood of stimulating and contributing to the discussion. Email your paper, in PDF, with the subject line "CHI 2006 Workshop Submission". See the workshop web site for more information.

Cover Sheet and Technical Requirements

Workshop Title

What is the Next Generation of Human-Computer Interaction?


One-day CHI 2006 workshop

Contact Information for Primary Organizer

Robert J.K. Jacob
Department of Computer Science
Tufts University
161 College Avenue
Medford, Mass. 02155 USA

Phone: 617-627-3217
Fax: 617-627-3220

List of Technology Support

Nice, but not required:

List of Other Support

Nice, but not required: