A Hierarchical Architecture for Adaptive Brain-Computer Interfacing
Mike Chung, Willy Cheung, Reinhold Scherer, and Rajesh Rao
Brain-computer interfaces (BCIs) allow a user to directly control devices such as cursors and robots using brain signals. Non-invasive BCIs, e.g., those based on electroencephalographic (EEG) signals from the scalp, suffer from low signal-to-noise ratio which limits the bandwidth of control. Invasive BCIs allow fine-grained control but can leave users exhausted because control is exerted on a moment-by-moment basis. In this paper, we address these problems by proposing a new adaptive hierarchical architecture for brain-computer interfacing. The proposed approach allows a user to teach the BCI new skills on-the-fly; these learned skills are later invoked directly as high-level commands, relieving the user of tedious lower-level control. We report results from four subjects who used a hierarchical EEG-based BCI to successfully train and control a humanoid robot in a virtual home environment. Gaussian processes were used for learning high-level commands, allowing a BCI, for the first time, to switch between autonomous and user-guided modes based on current uncertainty. We also report the first instance of multi-tasking in a BCI, involving simultaneous control of two different devices by a single user. Our results suggest that hierarchical BCIs can provide a flexible and robust way of controlling complex robotic devices in real-world environments.
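The uncertainty-based mode switching described above can be illustrated with off-the-shelf Gaussian process regression. The sketch below is a minimal example under stated assumptions, not the authors' implementation: the 2-D state representation, the stand-in command signal, and the value of UNCERTAINTY_THRESHOLD are all hypothetical choices made for illustration, and scikit-learn's GaussianProcessRegressor stands in for whatever GP machinery the system actually uses.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Hypothetical training data: robot states (e.g., 2-D positions visited while
# the user demonstrated a skill) and the commands issued at those states.
rng = np.random.default_rng(0)
states = rng.uniform(0.0, 1.0, size=(30, 2))            # observed robot states
commands = np.sin(states[:, 0]) + 0.1 * states[:, 1]    # stand-in command signal

# Fit a Gaussian process to the demonstrated skill.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2) + WhiteKernel(1e-3))
gp.fit(states, commands)

UNCERTAINTY_THRESHOLD = 0.15  # illustrative value, not taken from the paper

def control_step(current_state):
    """Return an (action, mode) pair for the current robot state."""
    mean, std = gp.predict(current_state.reshape(1, -1), return_std=True)
    if std[0] < UNCERTAINTY_THRESHOLD:
        # The GP is confident here: execute the learned skill autonomously.
        return mean[0], "autonomous"
    # The GP is uncertain: hand control back to the user for
    # moment-by-moment (lower-level) BCI commands.
    return None, "user-guided"

action, mode = control_step(np.array([0.5, 0.5]))
print(mode, action)

The key design point the sketch captures is that the GP's predictive standard deviation provides a principled confidence signal: in well-demonstrated regions of the state space the learned high-level command is executed autonomously, while in poorly covered regions control reverts to the user.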