Implicit Brain-Computer Interaction Applied to a Novel Adaptive Musical Interface
Authors: Yuksel, Beste F.; Afergan, Daniel; Peck, Evan M.; Griffin, Garth; Harrison, Lane; Chen, Nick W.B.; Chang, Remco; Jacob, Robert J.K.
We present a novel brain-computer interface (BCI) integrated with a musical instrument that adapts passively to users’ changing cognitive state during musical improvisation. Traditionally, musical BCIs have fallen into two camps: those that map brainwaves directly to audio signals, and those that use explicit brain signals to control some aspect of the music. Neither type of system takes advantage of higher-level, semantically meaningful brain data or implicit brain data, which could be used in adaptive systems. We present a new type of real-time BCI that assists users in musical improvisation by adapting to users’ measured cognitive workload. Our system advances the state of the art in this area in three ways: 1) We demonstrate that cognitive workload can be classified in real time while users play the piano, using functional near-infrared spectroscopy. 2) We build a real-time system that uses this brain signal to adapt musically to what users are playing. 3) We demonstrate that users prefer this novel musical instrument over other conditions and report that they feel more creative.
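The adaptive loop described above — classify cognitive workload from a physiological signal in real time, then adjust the musical output accordingly — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the class name, the rolling-mean "classifier", the `threshold` value, and the sparse/dense density mapping are all assumptions standing in for the actual fNIRS machine-learning pipeline and musical adaptation logic.

```python
from collections import deque


class WorkloadAdaptiveAccompaniment:
    """Hypothetical sketch of an implicit-BCI adaptation loop:
    a rolling window of (simulated) fNIRS oxygenation readings is
    reduced to a high/low workload label, which drives how dense
    the generated accompaniment should be."""

    def __init__(self, window=5, threshold=0.6):
        # Recent sensor readings; older values fall off automatically.
        self.readings = deque(maxlen=window)
        self.threshold = threshold  # assumed cutoff, not from the paper

    def update(self, oxygenation):
        """Feed one new reading and return the current workload label."""
        self.readings.append(oxygenation)
        return self.classify()

    def classify(self):
        # Stand-in for the paper's real-time workload classifier:
        # a simple rolling mean compared against a fixed threshold.
        mean = sum(self.readings) / len(self.readings)
        return "high" if mean >= self.threshold else "low"

    def accompaniment_density(self):
        # Under high workload, back off to a sparser accompaniment
        # so the adaptation stays implicit and non-intrusive.
        return "sparse" if self.classify() == "high" else "dense"
```

A caller would push each new sensor sample through `update()` and query `accompaniment_density()` on every musical beat; the key design point is that the user never issues an explicit command — the system reads workload implicitly and adapts on its own.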