The state of practice in human-computer interaction today is the graphical user interface (GUI), or direct manipulation interaction style. This project seeks to identify an emerging next generation of user interaction within a range of seemingly disconnected current research areas and to develop a framework for unifying them. These areas include: virtual reality; augmented reality; ubiquitous, pervasive, and handheld interaction; tangible user interfaces; lightweight, tacit, or passive interaction; perceptual interfaces; affective computing; context-aware interfaces; and speech and multi-modal interfaces. Ubiquitous computing, tangible interfaces, and the spread of computers into a wide range of products and objects are changing interaction with computers from a specialized activity, segregated from daily life, into an increasingly integral part of the real world. At the same time, as computers become more a part of the "real world," user interfaces seem to be evolving to behave more like the real world, as in virtual reality. These developments can be connected through the idea of "reality-based interaction": interfaces that are based on reality gain strength by exploiting users' built-in abilities. This idea provides the leverage to tie these areas together, to define a new generation of user interfaces, and to determine whether this naive notion can be built into a useful theoretical framework.
The goal is to begin to develop and test a framework or theory that connects developments in next-generation user interfaces. The project starts from the idea of natural or "reality-based" interfaces, which gain their strength by exploiting abilities that their users already possess. The project will formalize this distinction between the learned knowledge and the "reality-based" skills needed to use a system, develop it into a theoretical framework, and identify specific open issues for investigation, modifying or reinventing the initial approach as needed. It will test the theory, first, by applying it to a range of published results. Next, it will devise experiments specifically for testing, in which aspects of a user interface can be selectively manipulated. The project will build and test new interaction techniques as needed for these experiments, using the workbench and infrastructure package to be developed for dissemination, as described below. The final stage of the project will design, implement, and evaluate selected new interaction techniques representing gaps or opportunities suggested by the framework, again using and adding to the infrastructure package.
We thank Andrew Afram, Eric Bahna, Tia Bash, Georgios Christou, Audrey Girouard, Leanne Hirshfield, Michael Horn, Michael Poor, Andrew Pokrovski, Orit Shaer, Erin Solovey, and Jamie Zigelbaum, who are students and alumni of the HCI group at Tufts. We thank our collaborators Caroline Cao and Holly Taylor of Tufts, Leonidas Deligiannidis of Wentworth Institute of Technology, Hiroshi Ishii of the MIT Media Lab and the students in his Tangible Interfaces class, Sile O'Modhrain of Queen's University Belfast, and Frank Ritter of Pennsylvania State University.
We also thank the participants in our CHI 2006 workshop, "What is the Next Generation of Human-Computer Interaction?", for their thoughts and discussion about this area, which have helped us refine our work, and Ben Shneiderman in particular for discussions on this topic. We likewise thank the participants in the CHI 2007 workshop "Tangible User Interfaces in Context and Theory," organized by Alan Blackwell, George Fitzmaurice, Lars Erik Holmquist, Hiroshi Ishii, and Brygg Ullmer.
Finally, we thank the National Science Foundation (NSF Grant No. IIS-0414389) and the Natural Sciences and Engineering Research Council of Canada for their support of this research. Any opinions, findings, and conclusions or recommendations expressed in this paper are those of the authors and do not necessarily reflect the views of the National Science Foundation.