The Electrical Neuroimaging Group has developed a Brain-Computer Interface (BCI) prototype based on external flashing devices and steady-state visual evoked potentials (SSVEP). A virtual wheelchair control is our first application. The wheelchair simulator was developed at the Katholieke Universiteit Leuven (Belgium). Our BCI has a high transfer rate and is robust to environmental noise, and it can operate outside the lab, as shown in public demonstrations. The subject can divert his attention from the task to listen to people or to look at something else.
The blue square represents the wheelchair simulator, and the circle behind it (the command display) shows the commands received from the external device, i.e., joystick, keyboard, or BCI. If the circle is empty, the wheelchair is fully controlled by the artificial intelligence (obstacle avoidance, etc.). The limits and obstacles of the scene (e.g., walls) are shown in brown, dark blue, or grey.
Video One (24 Mb!) shows one of our first real-time control sessions. The virtual wheelchair moves forward at a constant speed. By focusing his attention on one of the two flashes, the subject can turn left or right at any time. This BCI aims at offering patients an economical system with minimal training and attentional requirements, without resorting to complex obstacle-avoidance systems.
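The left/right selection above rests on frequency tagging: each flash blinks at its own rate, and attending to one boosts the EEG power at that frequency. The following is only a minimal sketch of this idea on synthetic data; the frequencies, sampling rate, and single-channel FFT classifier are illustrative assumptions, not the actual Geneva implementation.

```python
import numpy as np

# Hypothetical parameters: the real stimulation frequencies, sampling
# rate, and detection method of the Geneva BCI may differ.
FS = 250                        # EEG sampling rate (Hz), assumed
F_LEFT, F_RIGHT = 13.0, 17.0    # flash frequencies (Hz), assumed

def band_power(eeg, freq, fs=FS, half_width=0.5):
    """Mean spectral power in a narrow band around `freq`."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    mask = np.abs(freqs - freq) <= half_width
    return spectrum[mask].mean()

def classify(eeg):
    """Decide which flash the subject attends: 'left' or 'right'."""
    if band_power(eeg, F_LEFT) > band_power(eeg, F_RIGHT):
        return "left"
    return "right"

# Synthetic 2-second epoch: a weak 17 Hz SSVEP buried in noise,
# as if the subject were attending the right-hand flash.
t = np.arange(0, 2.0, 1.0 / FS)
rng = np.random.default_rng(0)
eeg = 0.3 * np.sin(2 * np.pi * F_RIGHT * t) + rng.normal(0.0, 1.0, t.size)
print(classify(eeg))
```

Even at this signal-to-noise ratio the tagged frequency dominates its narrow band, which is what makes SSVEP detection robust to broadband environmental noise.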
Video Two (2 Mb) shows our appearance on Swiss TV (English translation here). The show also featured partial results of the BCI developed at IDIAP-EPFL (Jose Millan and coll.) in the framework of the European project MAIA and the Swiss NCCR IM2.BMI. Made by a Swiss journalist, this is probably one of the few public demonstrations contrasting a BCI system based on overt visual attention (transient SSVEP) with a BCI system based on motor imagery, which failed to work because "the presence of the camera perturbs the operator". A second journalist reported similar problems with this BCI system in www.nouvo.ch/115-3.read.m1349454 , i.e., "the control is only possible in very quiet environments without any perturbation" (quotes denote approximate translations from French).
The following example (Video Three) shows that an intelligent virtual wheelchair (with an obstacle-avoidance agent like the one used in the MAIA demonstrations) can navigate through a complex environment without ANY human contribution.
This example (Video Four) shows that an intelligent virtual wheelchair can enter a dead end and find its way out without any human intervention. This confirms that initial position and direction are rather irrelevant, and that simulations combining a BCI with artificial intelligence say nothing about the real capabilities of the BCI alone.
The fact that intelligent wheelchairs need no BCI to guide them is indeed a standard result for real wheelchairs, as you can see from other groups here. Under these conditions, it is clear that shared-autonomy systems whose BCI has not been evaluated on its own cannot be taken seriously.
Premiere in Europe:
Video Five (6 minutes!) demonstrates the latest version of the Geneva BCI control in a virtual environment (robot simulator) and, for the first time in Europe (April 6, 2009), the real-time control of a real robot 2000 km away (i.e., TELEPRESENCE) via the internet (video not accelerated!). It is an asynchronous system based on 8 electrodes and less than 6 minutes of training to control a 4-command robot. Thus, the subject (not the computer!) can send commands to the robot at any time at a 500-millisecond pace, that is, up to two commands per second!
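The asynchronous pacing above can be sketched as a simple rate limiter: the subject may issue any of the four commands at any moment, but at most one command is forwarded to the robot every 500 ms. This is only an illustrative sketch under assumed command names; it is not the actual Geneva control software.

```python
import time

PACE_S = 0.5  # minimum spacing between commands: 500 ms, per the text
COMMANDS = {"forward", "backward", "left", "right"}  # hypothetical 4-command set

class CommandPacer:
    """Forward at most one command per pacing interval; drop the rest."""

    def __init__(self, pace_s=PACE_S):
        self.pace_s = pace_s
        self._last = float("-inf")  # time of the last forwarded command

    def send(self, command, now=None):
        """Return True if `command` is forwarded to the robot, False if dropped."""
        if command not in COMMANDS:
            raise ValueError(f"unknown command: {command}")
        now = time.monotonic() if now is None else now
        if now - self._last >= self.pace_s:
            self._last = now
            return True   # sent: enough time has elapsed
        return False      # too soon after the previous command: ignored

pacer = CommandPacer()
# Three attempts within one second: only the first and third go through,
# which caps the rate at two commands per second.
print([pacer.send(c, now=t)
       for c, t in [("left", 0.0), ("right", 0.3), ("forward", 0.5)]])
```

Because the pacer only drops early commands rather than polling on a clock, the subject, not the computer, still decides when each command is issued, which is the defining property of an asynchronous BCI.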
Back to TOP