“Robot manipulation is the next shoe to drop,” says Robert Platt, computer science professor and head of the Helping Hands robotics lab at Northeastern. “Imagine a robot that can do things with its hands in the real world—anything from defusing a bomb to doing your laundry. This has been a dream in the research community for decades, but now we’re finally getting to the point where it could actually happen.”
Recent progress in robot perception, machine learning, and big data has brought us to the threshold of a huge leap in robots’ ability to perform fine motor tasks and act in uncontrolled environments, says Platt. It is the difference between robots that perform repetitive tasks in a structured lab environment and a new era of humanoid robots that can do significant work in the real world.
Why motor skills lag behind
There is an irony in robotics and artificial intelligence known as Moravec’s paradox: what is difficult for humans is relatively easy for robots, while what is easy for humans is nearly impossible for robots.
We can program a robot with the computational capability to defeat an international chess champion, but we find it hard to give it the dexterity of a 2-year-old child. Identifying and grasping a pencil in a random pile of stationery is nearly impossible for a robot, and asking one to open a door and walk into a room—as demonstrated at a recent international robotics contest—can turn into a comedy reel.
Humans have been evolving their visual, sensory, and motor skills for millions of years: these complex abilities are so deeply rooted in human circuitry that we exercise them unconsciously. By contrast, endeavors such as mathematics, science, and financial analysis are recent human pursuits, and they are far easier for engineers to replicate.
Despite the enormous challenge, Platt argues that autonomous robots are poised to make a big leap in their ability to manipulate unfamiliar objects.
For example, Platt and his team at the Helping Hands Lab trained a robot to find, grasp, and remove unfamiliar objects from a disordered pile with 93 percent accuracy. Achieving this required significant advances in machine learning, perception, and control.
The researchers used a technique called reinforcement learning, in which the robot learns through trial and error. They created a simulated world in which the robot could practice grasping and manipulating objects in virtual reality. When the robot did what the researchers wanted, such as grabbing an object from the pile, it was given a reward. This technique allows the robot to develop skills in a virtual environment and then apply them to the real world.
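The trial-and-error loop described above can be sketched in miniature. The following toy example is not the lab’s actual system; it is a minimal reinforcement-learning sketch, assuming a hypothetical one-dimensional “pile” where the simulated robot must learn which grasp position matches the object it sees. A correct grasp earns a reward of 1, and the learned values are nudged toward whatever reward the environment returns.

```python
import random

POSITIONS = 5      # possible object/grasp positions in the toy pile
EPISODES = 2000    # practice attempts in the simulated world
ALPHA = 0.1        # learning rate: how strongly each reward updates the value
EPSILON = 0.2      # exploration rate: how often to try a random grasp

random.seed(0)

# Q[state][action]: learned estimate of the reward for grasping at
# position `action` when the object is observed at position `state`.
Q = [[0.0] * POSITIONS for _ in range(POSITIONS)]

for _ in range(EPISODES):
    obj = random.randrange(POSITIONS)           # simulator places an object
    if random.random() < EPSILON:
        action = random.randrange(POSITIONS)    # explore: random grasp
    else:
        action = max(range(POSITIONS), key=lambda a: Q[obj][a])  # exploit
    reward = 1.0 if action == obj else 0.0      # reward only for a good grasp
    # Trial-and-error update: move the estimate toward the observed reward.
    Q[obj][action] += ALPHA * (reward - Q[obj][action])

# The learned policy: the best grasp position for each observed object.
policy = [max(range(POSITIONS), key=lambda a: Q[s][a]) for s in range(POSITIONS)]
print(policy)
```

After training, the policy maps each observed object position to the matching grasp position, the toy analogue of skills learned in simulation being carried into deployment.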
While vision is a great tool for guiding large movements, fine motor skills require the sense of touch.
“Think of what you can do with gloves on,” explains Platt. “You can open the garage door, grab a shovel, and clear the driveway. But if you need to unlock the garage first, you need to take your gloves off to insert the key.”
Platt’s laboratory demonstrated these new capabilities by having a robot grab a USB connector and insert it into a port. While this may not seem like a big result, it is a critical step toward robots that can perform precise manipulation tasks, like changing the battery in a mobile phone.
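To see why touch matters for a task like this, consider a toy peg-in-hole sketch. This is not the lab’s method, just an illustration under simple assumptions: vision gives an approximate position, and a hypothetical simulated force sensor lets the controller search nearby positions until the peg meets no resistance.

```python
HOLE_X = 3  # true hole position; the controller does not know this

def contact_force(x):
    """Simulated touch reading: no force over the hole, contact elsewhere."""
    return 0.0 if x == HOLE_X else 1.0

def insert(start_x, max_steps=20):
    """Search outward from the vision estimate, guided only by touch.

    Probes positions in the order start_x, +1, -1, +2, -2, ... and
    returns the first position where the force sensor reports no
    contact, i.e. the peg has slipped into the hole.
    """
    for step in range(max_steps):
        offset = (step + 1) // 2 * (1 if step % 2 else -1)
        x = start_x + offset
        if contact_force(x) == 0.0:   # no resistance: insertion succeeded
            return x
    return None                        # give up after max_steps probes

# Vision was off by two positions; the touch-driven search still succeeds.
print(insert(start_x=1))
```

The point of the sketch is the division of labor: vision gets the gripper close, and touch resolves the last few millimeters that vision cannot, which is exactly the regime Platt’s glove analogy describes.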
As with any nascent technology—radar, the telephone, the internet—the practical applications of robot dexterity are hard to predict. Here are some possibilities:
Platt’s Helping Hands Lab, in collaboration with the University of Massachusetts and New Hampshire Crotched Mountain Rehabilitation Department, is building a wheelchair with a robotic arm that can grab objects around the house or perform simple household tasks. This could allow elderly people and people with disabilities to continue living independently in their own homes.
Engineering professor Taskin Padir and his team received a grant from the Department of Energy to adapt NASA’s Valkyrie robot for hazardous waste disposal. There are over a dozen sites spread throughout the United States where radioactive waste was buried in tunnels during the Cold War. Before autonomous robots can locate, grasp, and store this waste in secure containers, they will need fine motor skills and the ability to work in unfamiliar surroundings.
Funded by a grant from the National Science Foundation, Engineering Professor Peter Whitney is working with researchers at Stanford University to create a robot that can perform MRI-guided surgery.
“Robots that work flawlessly in the lab break down quickly when they’re placed in unfamiliar situations,” says Platt. “Our goal is to develop the underlying algorithms that will allow them to be more reliable in the real world. Ultimately, this will fundamentally change the way we think about robots, allowing them to become partners with humans rather than just machines that work in faraway factories.”