By monitoring brain activity, the system can detect in real time whether a person notices an error as a robot performs a task. Using an interface that measures muscle activity, the person can then make hand gestures to scroll through and select the correct option for the robot to execute.
This work combining EEG and EMG feedback enables natural human-robot interactions for a broader set of applications than we've been able to do before using only EEG feedback. By including muscle feedback, we can use gestures to command the robot spatially, with much more nuance and specificity.
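To make the gesture-scrolling idea concrete, here is a minimal Python sketch of how an EMG-driven menu might behave. The channel layout, thresholds, and the decode_gesture logic are illustrative assumptions, not the actual CSAIL decoder.

```python
import numpy as np

# Hypothetical menu of robot targets to scroll through with gestures.
OPTIONS = ["left target", "middle target", "right target"]

def decode_gesture(emg_window, scroll_thresh=0.5, select_thresh=1.5):
    """Toy decoder for a 2-channel forearm EMG window (channels x samples).
    Channel roles and thresholds are illustrative assumptions."""
    flexor, extensor = np.abs(emg_window).mean(axis=1)
    if flexor > select_thresh and extensor > select_thresh:
        return "clench"        # strong co-contraction confirms a choice
    if flexor > scroll_thresh:
        return "scroll_left"
    if extensor > scroll_thresh:
        return "scroll_right"
    return None

def select_option(emg_windows):
    """Scroll through OPTIONS with flexes; a clench selects the highlighted one."""
    index = 0
    for window in emg_windows:
        gesture = decode_gesture(window)
        if gesture == "scroll_left":
            index = (index - 1) % len(OPTIONS)
        elif gesture == "scroll_right":
            index = (index + 1) % len(OPTIONS)
        elif gesture == "clench":
            return OPTIONS[index]
    return OPTIONS[index]

# Synthetic example: one scroll to the right, then a clench to select.
windows = [
    np.vstack([0.1 * np.ones(100), 0.8 * np.ones(100)]),  # scroll_right
    np.vstack([2.0 * np.ones(100), 2.0 * np.ones(100)]),  # clench
]
print(select_option(windows))  # -> "middle target"
```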
In most previous work, systems could generally only recognize brain signals when people trained themselves to “think” in very specific but arbitrary ways and when the system was trained on such signals. For instance, a human operator might have to look at different light displays that correspond to different robot tasks during a training session.
The system relies on the power of brain signals called "error-related potentials" (ErrPs), which researchers have found occur naturally when people notice mistakes. If the system detects an ErrP, it stops so the user can correct the error; if not, it carries on.
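The stop-and-correct behavior described above amounts to a simple gated control loop. Below is a minimal sketch under the assumption of a pre-trained ErrP classifier and callback functions; all names and interfaces here are hypothetical, since the article does not expose the real ones.

```python
def errp_detected(eeg_window, classifier, threshold=0.5):
    """Flag an error-related potential when the classifier's score for the
    post-action EEG window exceeds a (hypothetical) decision threshold."""
    return classifier(eeg_window) > threshold

def run_task(actions, get_eeg_window, classifier, correct_action):
    """ErrP-gated execution: after each robot action, check the observer's
    EEG; if an ErrP is flagged, pause and let the human correct, otherwise
    carry on."""
    for act in actions:
        act()
        window = get_eeg_window()          # EEG right after action onset
        if errp_detected(window, classifier):
            correct_action()               # e.g. gesture-based re-selection

# Toy usage with stand-in callbacks and a dummy classifier.
run_task(
    actions=[lambda: print("robot: drilling left target")],
    get_eeg_window=lambda: [0.0],          # stand-in for real EEG samples
    classifier=lambda w: 0.9,              # pretend ErrP score
    correct_action=lambda: print("human: corrected to right target"),
)
```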
For the project, the team used "Baxter," a humanoid robot from Rethink Robotics. With human supervision, the robot went from choosing the correct target 70 percent of the time to more than 97 percent of the time.
To create the system, the team harnessed the power of electroencephalography (EEG) for brain activity and electromyography (EMG) for muscle activity, putting a series of electrodes on the users' scalp and forearm. Each signal on its own has shortcomings; merging the two, however, allows for more robust bio-sensing and makes it possible for the system to work on new users without training.
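The article does not say how the two signal streams are combined, but one common approach is late fusion of per-class probabilities from independent EEG and EMG classifiers. A sketch under that assumption, with illustrative weights:

```python
import numpy as np

def late_fusion(eeg_probs, emg_probs, w_eeg=0.4, w_emg=0.6):
    """Weighted late fusion of class probabilities from separate EEG and
    EMG classifiers. The weights are illustrative assumptions; in practice
    they would be tuned or learned per application."""
    combined = w_eeg * np.asarray(eeg_probs) + w_emg * np.asarray(emg_probs)
    return combined / combined.sum()

# Example: EEG is uncertain between two targets while EMG is confident,
# so the fused estimate follows the more informative modality.
print(late_fusion([0.5, 0.5], [0.1, 0.9]))  # -> [0.26, 0.74]
```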
By looking at both muscle and brain signals, we can start to pick up on a person's natural gestures along with their snap decisions about whether something is going wrong. This helps make communicating with a robot more like communicating with another person. The team says they could imagine the system one day being useful for the elderly, or for workers with language disorders or limited mobility.
News Source: http://news.mit.edu/2018/how-to-control-robots-with-brainwaves-hand-gestures-mit-csail-0620
Related videos:
This ‘Brain-to-Text’ system can turn your Thoughts into Text
Take a Picture just by thinking about it, using Google Glass with MindRDR App.
Watch More Robot & Drones videos at our YouTube channel Qualitypointtech