Researchers from the Georgia Institute of Technology have developed a web interface that uses augmented reality technology to help individuals with profound motor impairments operate a humanoid robot to feed themselves and perform routine personal care tasks such as scratching an itch and applying skin lotion.
The web-based interface displays a “robot’s eye view” of surroundings to help users interact with the world through the machine.
This system could help make sophisticated robots more useful to people who do not have experience operating complex robotic systems. Study participants interacted with the robot interface using standard assistive computer access technologies — such as eye trackers and head trackers — that they were already using to control their personal computers.
“Our results suggest that people with profound motor deficits can improve their quality of life using robotic body surrogates,” said Phillip Grice, a researcher on the project.
The web-based interface shows users what the world looks like from cameras located in the robot’s head. Clickable controls overlaid on the view allow the users to move the robot around in a home or other environment and control the robot’s hands and arms. When users move the robot’s head, for instance, the screen displays the mouse cursor as a pair of eyeballs to show where the robot will look when the user clicks. Clicking on a disc surrounding the robotic hands allows users to select a motion. While driving the robot around a room, lines following the cursor on the interface indicate the direction it will travel.
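The click-to-command mapping described above can be sketched in a few lines. This is a purely illustrative assumption of how such an overlay might translate clicks into robot actions; none of the names, structures, or behaviors below come from the Georgia Tech system.

```python
# Hypothetical sketch of an overlay that maps clicks in a "robot's eye
# view" to abstract robot commands. All names here are illustrative
# assumptions, not the actual Georgia Tech interface.

from dataclasses import dataclass


@dataclass
class Click:
    x: float      # normalized horizontal position in the camera view (0..1)
    y: float      # normalized vertical position in the camera view (0..1)
    target: str   # which overlay control was clicked: "view", "hand", "drive"


def click_to_command(click: Click) -> dict:
    """Translate a click on the overlay into an abstract robot command."""
    if click.target == "view":
        # The eyeball cursor: point the head cameras at the clicked spot.
        return {"action": "look_at", "x": click.x, "y": click.y}
    if click.target == "hand":
        # A click on the disc around a hand selects an arm motion.
        return {"action": "move_hand", "x": click.x, "y": click.y}
    if click.target == "drive":
        # Driving: cursor lines preview the direction of travel.
        return {"action": "drive_toward", "x": click.x, "y": click.y}
    return {"action": "none"}


print(click_to_command(Click(0.5, 0.4, "view")))
```

The key design point the article highlights is that every command is issued through ordinary point-and-click events, which is why standard assistive access technologies such as eye trackers and head trackers can drive the robot without any new hardware.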
Building the interface around the actions of a simple
News Source: https://www.eurekalert.org/pub_releases/2019-03/giot-sta031519.php