Technical Reports
TR-2012-01
Real-time fNIRS Brain Input for Enhancing Interactive Systems

Authors: Solovey, Erin Treacy
Date: 2012-01-20
Pages: 130
Download Formats: [PDF]
Abstract: Most human-computer interaction (HCI) techniques cannot fully capture the richness of the user's thoughts and intentions when interacting with a computer system. For example, when we communicate with other people, we do not simply use words, but also accompanying cues that give the other person additional insight into our thoughts. At the same time, several physiological changes occur that may or may not be detected by the other person. When we communicate with computers, we also generate these additional signals, but the computer cannot sense them and therefore ignores them. Detecting these signals in real time and incorporating them into the user interface could improve the communication channel between the computer and the human user with little additional effort required of the user. This improvement would lead to technology that is more supportive of the user's changing cognitive state. Such gains in bandwidth are increasingly valuable as technology becomes more powerful and pervasive, while our cognitive abilities do not change considerably.

In this dissertation, I explore using brain sensor data as a passive, implicit input channel that expands the bandwidth between the human and the computer by providing supplemental information about the user. Using a relatively new brain imaging tool called functional near-infrared spectroscopy (fNIRS), we can detect signals within the brain that indicate various cognitive states. This device provides data on brain activity while remaining portable and non-invasive. This research aims to develop tools that make brain sensing more practical for HCI and to demonstrate effective use of this cognitive state information as supplemental input to interactive systems.

First, I explored practical considerations for using fNIRS in HCI research to determine the contexts in which fNIRS realistically could be used. Second, in a series of controlled experiments, I identified cognitive multitasking states that could be classified reliably from fNIRS data in offline analysis. Based on these experiments, I created Brainput, a system that learns to identify brain activity patterns occurring during multitasking. It provides a continuous, supplemental input stream to an interactive human-robot system, which uses this information in real time to modify its behavior to better support multitasking. Finally, I conducted an experiment to investigate the efficacy of Brainput and found improvements in performance and user experience.
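To make the offline-training/real-time-classification pipeline concrete, the sketch below shows one generic way such a passive brain input stream could be wired up: a linear classifier trained on windowed fNIRS features, whose output probability is polled by the interactive system as supplemental input. This is only an illustrative sketch under stated assumptions; the window size, channel count, features, classifier, and all function names are hypothetical and are not the dissertation's actual Brainput implementation.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Hypothetical acquisition parameters (not from the dissertation).
WINDOW = 50        # samples per classification window
N_CHANNELS = 4     # fNIRS measurement channels

def extract_features(window: np.ndarray) -> np.ndarray:
    """Simple per-channel features: mean level and overall slope of the signal."""
    means = window.mean(axis=0)
    slopes = window[-1] - window[0]
    return np.concatenate([means, slopes])

# Offline training on labeled windows (synthetic data stands in for
# recorded single-task vs. multitasking fNIRS sessions).
rng = np.random.default_rng(0)
train_windows = rng.normal(size=(200, WINDOW, N_CHANNELS))
train_labels = rng.integers(0, 2, size=200)   # 0 = single-task, 1 = multitasking
X_train = np.array([extract_features(w) for w in train_windows])
clf = LinearDiscriminantAnalysis().fit(X_train, train_labels)

def on_new_window(window: np.ndarray) -> float:
    """Classify the latest window and return P(multitasking), a continuous,
    passive input the interactive system can poll without user effort."""
    features = extract_features(window).reshape(1, -1)
    return float(clf.predict_proba(features)[0, 1])

# Real-time loop sketch: the host system (e.g., a human-robot interface)
# adapts its behavior when the estimated multitasking load is high.
live_window = rng.normal(size=(WINDOW, N_CHANNELS))
if on_new_window(live_window) > 0.7:
    print("High multitasking load detected; deferring non-critical interruptions.")

In this kind of design the classifier's output is treated as implicit, supplemental input: the system never requires the user to act on it, but can use the continuous estimate to decide, for example, when to interrupt or how to reallocate tasks.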