User controlling a quadcopter with an EEG interface (via Zhejiang University)
Those who have difficulty moving may soon have help they can control. Researchers led by Gang Pan, from Zhejiang University’s Pervasive Computing Group at the CCNT Lab, have designed a way to control a quadcopter with the mind, no Jedi training required. The research group’s goal is to give the disabled greater interactivity with their environments. Known as the ‘FlyingBuddy2’ system, the design consists of readily available components, including an Emotiv EEG neuro-headset that features a gyroscope (for camera and cursor control) and 14 saline sensors (for spatial resolution) that pick up brain activity at key locations on the user’s head.
The signals are sent to a laptop, either over a USB connection or wirelessly via Bluetooth, and are analyzed and interpreted in real time by specially designed software (the researchers haven’t disclosed which). The software is calibrated to the individual’s thought patterns, and the resulting commands are sent wirelessly to a Parrot AR.Drone quadcopter outfitted with a camera the user can use to take pictures or video of their surroundings. Piloting the FlyingBuddy2 is done entirely through thought: thinking ‘left-hard’ triggers take-offs and landings, thinking ‘left-lightly’ rotates the craft, thinking ‘right’ flies it forward, thinking ‘push’ gains altitude, and clenching the teeth makes it descend. The camera streams continuous video, and the user can grab a snapshot simply by blinking four times in succession.
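Since the team hasn’t disclosed their control software, here is a minimal sketch of how the thought-to-command mapping described above could be wired up. The event labels come from the article; the `Command` names and the `dispatch` function are assumptions for illustration only.

```python
from enum import Enum


class Command(Enum):
    """Drone actions described in the article (names are assumed)."""
    TAKEOFF_LAND = "toggle take-off/landing"
    ROTATE = "rotate the craft"
    FORWARD = "fly forward"
    ASCEND = "gain altitude"
    DESCEND = "descend"
    SNAPSHOT = "grab a frame from the video feed"


# One mental or facial event per drone action, per the article.
THOUGHT_TO_COMMAND = {
    "left-hard": Command.TAKEOFF_LAND,
    "left-lightly": Command.ROTATE,
    "right": Command.FORWARD,
    "push": Command.ASCEND,
    "clench-teeth": Command.DESCEND,
    "blink-x4": Command.SNAPSHOT,  # four blinks in succession
}


def dispatch(label):
    """Translate a classified EEG/EMG event label into a drone command.

    Returns None for unrecognized labels so the drone simply holds
    its current behavior instead of reacting to noise.
    """
    return THOUGHT_TO_COMMAND.get(label)
```

In a real system the labels would come from a per-user calibrated classifier, and each `Command` would be translated into the AR.Drone’s wireless control protocol; this sketch only captures the mapping layer.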
While the FlyingBuddy2 system can give the disabled a unique vantage point for observing their surroundings (and can also be used for a fighting-style flying game), the researchers hope to one day adapt the technology to let disabled or impaired individuals fly real aircraft or drive automobiles. That day is closer than ever.
Cabe
