Seeing through a Robot’s Eyes Helps Those with Profound Motor Impairments

An interface system that uses augmented reality technology could help individuals with profound motor impairments operate a humanoid robot to feed themselves and perform routine personal care tasks such as scratching an itch and applying skin lotion. The web-based interface displays a “robot’s eye view” of the surroundings to help users interact with the world through the machine.

The system, described March 15 in the journal PLOS ONE, could help make sophisticated robots more useful to people who have no experience operating complex robotic systems. Study participants interacted with the robot interface using standard assistive computer access technologies, such as eye trackers and head trackers, that they were already using to control their personal computers.

The paper reported on two studies showing how such “robotic body surrogates,” which can perform tasks similar to those of humans, could improve the quality of life for users. The work could provide a foundation for developing faster and more capable assistive robots.

“Our results suggest that people with profound motor deficits can improve their quality of life using robotic body surrogates,” said Phillip Grice, a recent Georgia Institute of Technology Ph.D. graduate who is first author of the paper. “We have taken the first step toward making it possible for someone to purchase an appropriate type of robot, have it in their home and derive real benefit from it.”

Grice and Professor Charlie Kemp of the Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory University used a PR2 mobile manipulator manufactured by Willow Garage for the two studies. The wheeled robot has 20 degrees of freedom, with two arms and a “head,” giving it the ability to manipulate objects such as water bottles, washcloths, hairbrushes and even an electric shaver.