In this video I demonstrate a combination of technologies that enables me to exert mind control over the telepresence robot I have demonstrated previously (itself built from a range of technologies).
The key addition to the earlier telepresence robot demonstration (which can be found here: http://youtu.be/lJTU3fLoZuY) is an Emotiv EPOC headset, which is able to sense, record, and transmit EEG signals ("brain waves") from my brain.
Using this, I direct my attention to particular areas of the computer screen. By concentrating, and by training the Emotiv software to associate particular brain states with particular actions such as left-clicking the mouse, I am able to use my mind to control the little robot, seeing and hearing through its "eyes and ears" via the streaming video feed displayed on the computer.
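The control loop described above, where trained brain states are mapped to discrete actions, can be sketched roughly as follows. This is a minimal illustration only: the labels, the `COMMAND_MAP` dictionary, and the `handle_event` function are all hypothetical, not the Emotiv SDK's actual API, and a real setup would feed detections from the headset rather than a hard-coded list.

```python
# Hypothetical sketch of mapping trained "mental command" detections
# to robot actions. All names here are illustrative assumptions,
# not the real Emotiv SDK interface.

COMMAND_MAP = {
    "push": "forward",
    "left": "turn_left",
    "right": "turn_right",
}

# Ignore weak detections to cut down on accidental commands --
# false triggers are a big part of why training takes so long.
CONFIDENCE_THRESHOLD = 0.7

def handle_event(label, confidence, actions):
    """Append a robot action when a trained command is detected confidently."""
    if confidence >= CONFIDENCE_THRESHOLD and label in COMMAND_MAP:
        actions.append(COMMAND_MAP[label])

# Simulated stream of (label, confidence) detections from the classifier.
stream = [("push", 0.9), ("neutral", 0.95), ("left", 0.4), ("right", 0.8)]

actions = []
for label, confidence in stream:
    handle_event(label, confidence, actions)

print(actions)  # ['forward', 'turn_right']
```

Note how the "neutral" state and the low-confidence "left" detection are both dropped: in practice, tuning that threshold trades responsiveness against the false clicks mentioned below.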
I have only recently started training with the Emotiv EPOC headset, and this form of mind control is still very difficult — much harder than it looks in this video. On some occasions quite a bit of time elapsed between successful movements; you can see the transitions between each movement of the robot, which I have edited down because some of them were nearly a minute long.
So there is much more training required before I could claim "competent" control, with fluid and responsive movement of the robot driven by my thoughts. That said, rudimentary as it is, it works as demonstrated in the video, and it makes a good little proof of concept.
If you are going to attempt this yourself you should be prepared for some occasional frustration!