After testing an iPhone, an iPad and an eye-tracking device as possible user interfaces for maneuvering our research car MadeInGermany, we now also use brain power. Neuro-signals are acquired with the commercial Emotiv EEG (electroencephalogram) headset. After a few rounds of mental training with the virtual objects in the software toolkit, the bioelectric signals measured by the wireless neuroheadset are interpreted as patterns, each associated with a direction. Once a pattern can be linked to a command, a software interface sends it to the car's drive-by-wire system, which turns the message into actuation such as steering or acceleration. Our test drives at the former Berlin Tempelhof airport showed only a slight delay between the intended command and the car's actual reaction. The "BrainDriver" application is, of course, a demonstration and not yet roadworthy, but in the long run human-machine interfaces like this could hold great potential in combination with autonomous driving: for example, deciding which way to turn at an intersection while an autonomous cab drives you home.
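The pipeline described above (trained mental pattern, then associated command, then drive-by-wire actuation) can be sketched roughly as follows. This is a hypothetical illustration only: the names `Pattern`, `DriveCommand` and `to_drive_command` are invented for this sketch and are not the actual BrainDriver or Emotiv SDK API.

```python
# Hypothetical sketch of the BrainDriver mapping stage: a classified
# mental pattern is translated into a drive-by-wire command.
# All names here are illustrative assumptions, not the real interfaces.

from dataclasses import dataclass
from enum import Enum


class Pattern(Enum):
    """Mental actions trained against virtual objects in the toolkit."""
    PUSH = "push"
    PULL = "pull"
    LEFT = "left"
    RIGHT = "right"
    NEUTRAL = "neutral"


@dataclass
class DriveCommand:
    steering: float      # -1.0 (full left) .. 1.0 (full right)
    acceleration: float  # -1.0 (full brake) .. 1.0 (full throttle)


# Fixed association from trained pattern to actuation, as described in
# the text: once a pattern is linked with a command, that command is
# forwarded to the car's drive-by-wire system.
COMMAND_MAP = {
    Pattern.PUSH: DriveCommand(steering=0.0, acceleration=0.5),   # speed up
    Pattern.PULL: DriveCommand(steering=0.0, acceleration=-0.5),  # brake
    Pattern.LEFT: DriveCommand(steering=-0.5, acceleration=0.0),  # steer left
    Pattern.RIGHT: DriveCommand(steering=0.5, acceleration=0.0),  # steer right
    Pattern.NEUTRAL: DriveCommand(steering=0.0, acceleration=0.0),  # hold
}


def to_drive_command(pattern: Pattern) -> DriveCommand:
    """Translate a classified mental pattern into a drive-by-wire command."""
    return COMMAND_MAP[pattern]


if __name__ == "__main__":
    cmd = to_drive_command(Pattern.LEFT)
    print(cmd.steering, cmd.acceleration)  # -0.5 0.0
```

In the real system the classification itself is done by the Emotiv software after user training; the sketch only covers the simple lookup step from recognized pattern to actuation command.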

MadeInGermany controlled via the Emotiv EPOC Brain Computer Interface:

[nggallery id=9]

More images of our autonomous cars can be found in the Gallery section.