Navigating a smart wheelchair with a brain-computer interface interpreting steady-state visual evoked potentials

C. Mandel, T. Lüth, T. Laue, T. Röfer, A. Gräser, B. Krieg-Brückner
2009 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2009 - ieeexplore.ieee.org
In order to allow severely disabled people who cannot move their arms and legs to steer an automated wheelchair, this work proposes the combination of a non-invasive EEG-based human-robot interface and an autonomous navigation system that safely executes the issued commands. The robust classification of steady-state visual evoked potentials in brain activity allows for the seamless projection of qualitative directional navigation commands onto a frequently updated route graph representation of the environment. The deduced metrical target locations are navigated to by the application of an extended version of the well-established nearness diagram navigation method. The applicability of the system proposed is demonstrated by a real-world pilot study in which eight out of nine untrained subjects successfully navigated an automated wheelchair, requiring only some ten minutes of preparation.
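
The pipeline described in the abstract can be illustrated in two steps: detecting which flicker stimulus the user attends to by comparing EEG spectral power at the candidate frequencies, and projecting the resulting qualitative command onto the route-graph node whose bearing matches it best. The sketch below is illustrative only: the 13-16 Hz stimulus frequencies, the four-command mapping, and the toy route graph are assumptions, and neither the paper's actual SSVEP classifier nor its extended nearness diagram planner is reproduced here.

import numpy as np

FS = 256.0                      # assumed EEG sampling rate in Hz
STIMULI = {13.0: "left", 14.0: "right", 15.0: "forward", 16.0: "backward"}

def classify_ssvep(eeg, fs=FS, harmonics=2):
    """Return the qualitative command whose stimulus frequency (plus
    harmonics) carries the most spectral power in the EEG window."""
    spectrum = np.abs(np.fft.rfft(eeg * np.hanning(len(eeg)))) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    def power_at(f):
        return sum(spectrum[np.argmin(np.abs(freqs - f * h))]
                   for h in range(1, harmonics + 1))
    return STIMULI[max(STIMULI, key=power_at)]

# Qualitative commands as preferred headings (radians, wheelchair frame).
COMMAND_HEADINGS = {"forward": 0.0, "left": np.pi / 2,
                    "right": -np.pi / 2, "backward": np.pi}

def project_on_route_graph(command, pose_xy, pose_theta, graph_nodes):
    """Pick the route-graph node whose bearing from the wheelchair best
    matches the commanded direction; it becomes the metrical target that
    a local planner (e.g. a nearness-diagram navigator) then drives to."""
    wanted = COMMAND_HEADINGS[command]
    def angular_error(node_xy):
        bearing = np.arctan2(node_xy[1] - pose_xy[1], node_xy[0] - pose_xy[0])
        diff = bearing - pose_theta - wanted
        return abs(np.arctan2(np.sin(diff), np.cos(diff)))
    return min(graph_nodes, key=angular_error)

if __name__ == "__main__":
    # Synthetic 2 s EEG window dominated by the 15 Hz ("forward") stimulus.
    t = np.arange(0, 2.0, 1.0 / FS)
    eeg = np.sin(2 * np.pi * 15.0 * t) + 0.5 * np.random.randn(t.size)
    cmd = classify_ssvep(eeg)
    target = project_on_route_graph(cmd, (0.0, 0.0), 0.0,
                                    [(2.0, 0.0), (0.0, 2.0), (-2.0, 0.0)])
    print(cmd, "->", target)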