Watch: Robot Being Controlled, Corrected With Brainwaves And Hand Gestures
If we want to see robots thrive in different fields, it is important to establish effective ways to control or correct them whenever required.
Engineers normally rely on advanced programming or language processing techniques for such tasks, but those methods do not provide enough flexibility, especially when there are multiple tasks at hand.
This is why a group of researchers at MIT’s Computer Science and Artificial Intelligence Laboratory developed a new system, one that allows robots to be controlled by our brains and gestures.
Though the idea of controlling a machine with our mind may sound far-fetched, the researchers demonstrated the system successfully and are bringing it closer to real-world applications.
The idea, as they described it, revolves around detecting error-related potentials (ErrPs), specific brainwaves that occur naturally when we notice a mistake in a task we are familiar with. For instance, if a robot makes a mistake while performing a given task and its supervisor notices it, ErrPs are automatically generated in the supervisor's brain.
When this happens, the system detects the brain signals through a series of electrodes placed on the scalp and commands the robot to stop right away, giving the supervisor an opportunity to correct its action.
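To make that halt step concrete, here is a minimal sketch in Python of how an ErrP-triggered pause might look. It is not the MIT CSAIL implementation; names such as `read_eeg_window`, `errp_classifier`, `robot`, and the confidence threshold are assumptions for illustration only.

```python
# Minimal sketch (not the published system): pause a robot when an EEG
# classifier flags an error-related potential (ErrP).
import numpy as np

ERRP_THRESHOLD = 0.8  # assumed confidence cutoff for declaring an ErrP


def supervise_step(read_eeg_window, errp_classifier, robot):
    """Check one window of scalp EEG; pause the robot if an ErrP is detected."""
    window = read_eeg_window()              # e.g. a (channels x samples) array
    features = np.asarray(window).ravel()   # stand-in for real feature extraction
    errp_probability = errp_classifier(features)
    if errp_probability > ERRP_THRESHOLD:
        robot.pause()                       # let the supervisor intervene
        return True
    return False
```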
This is exactly where the second part of the process begins. In order to guide the robot, the person makes certain hand gestures, which are detected by a muscle-activity measuring interface and transmitted to the system. On receiving this signal, the system activates the robot again and prompts it to make the right move.
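Continuing the sketch above, the second step could look roughly like this: once the robot is paused, a gesture classified from muscle (EMG) activity selects the corrected target and the robot resumes. The gesture labels, target names, and the `robot.move_to` / `robot.resume` calls are hypothetical placeholders, not the researchers' actual interface.

```python
# Illustrative follow-on to the ErrP sketch: an EMG-detected hand gesture
# picks the corrected target and restarts the paused robot.
def correct_and_resume(read_emg_window, gesture_classifier, robot,
                       targets=("left", "center", "right")):
    """Map a detected gesture to one of the candidate targets and resume the robot."""
    emg = read_emg_window()            # raw muscle-activity samples
    gesture = gesture_classifier(emg)  # e.g. "flick_left" or "flick_right"
    if gesture == "flick_left":
        choice = targets[0]
    elif gesture == "flick_right":
        choice = targets[-1]
    else:
        choice = targets[1]            # default to the middle target
    robot.move_to(choice)              # hypothetical motion command
    robot.resume()
    return choice
```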
“By looking at both muscle and brain signals, we can start to pick up on a person's natural gestures along with their snap decisions about whether something is going wrong,” Joseph DelPreto, the lead researcher behind the new system, said in a statement. “This helps make communicating with a robot more like communicating with another person.”
In the video shown below, the research team demonstrated the capabilities of the system by tasking a robot named Baxter, from Rethink Robotics, with moving a drill to one of three possible targets on a mock-up board. When the robot selected the wrong target, the combination of brain signals and hand gestures from the person sitting beside it guided it to the correct option. This way, the robot picked the right target in 70 to 97 percent of the cases during the test.
The team said the system could revolutionize how robot workers are controlled and managed, but added that, with further development, technology like this could even support workers with language disorders, physical disorders, or limited mobility.
“What’s great about this approach is that there’s no need to train users to think in a prescribed way,” DelPreto added. “The machine adapts to you, and not the other way around.”