AI Helps Robots Manipulate Objects with Their Whole Bodies

By Chantal Polsonetti
Category: Company and Product News

Work funded in part by the US National Science Foundation (NSF) and reported in IEEE Transactions on Robotics shows that engineers have developed a method that lets robots perform contact-rich manipulation tasks with simpler computation. According to the NSF, robots still fall far short of humans in routine physical tasks such as carrying a bulky box in their arms, and the results reported in this paper are said to be a significant step toward closing that gap.

Robots typically struggle to plan full-body manipulations, whereas humans readily use their entire body (hands, legs, torso, and so on) to complete a motion or lifting task. Whenever a contact is made or broken, the mathematical equations that describe the robot's motion change abruptly, so the movements corresponding to each possible pattern of contacts must be calculated independently, quickly making the computation intractable.
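
To give a sense of scale for that combinatorial problem, the short sketch below (an illustration of the general point, not material from the paper) counts the dynamics "modes" a planner would have to treat separately if each candidate contact point can be separating, sticking, or sliding.

```python
# Toy illustration (not from the paper): each candidate contact point can be in
# one of a few states (for example, separating, sticking, or sliding), and each
# combination of states has its own set of motion equations, so the number of
# cases a planner must consider grows exponentially with the number of contacts.

def count_contact_modes(num_contacts: int, states_per_contact: int = 3) -> int:
    return states_per_contact ** num_contacts

for n in (4, 8, 16):
    print(f"{n} contact points -> {count_contact_modes(n):,} contact modes")
# 4 contact points -> 81 contact modes
# 8 contact points -> 6,561 contact modes
# 16 contact points -> 43,046,721 contact modes
```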

An artificial intelligence technique called "reinforcement learning" has been used to plan contact-rich manipulations because it effectively smooths out the abrupt changes in the dynamics equations caused by making and breaking contact. However, this approach still requires a very large number of possible outcomes to be calculated.
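
One way to picture why this sampling-based smoothing is expensive is sketched below: averaging a hard contact force over many random perturbations produces a smooth function, but the average must be estimated from a large number of samples. The Python snippet is a generic illustration of that idea, not code from the study.

```python
# Generic sketch of smoothing-by-sampling (not the authors' code): a hard
# contact force changes abruptly at the contact boundary, but its average over
# random perturbations is smooth. Estimating that average takes many samples.
import numpy as np

def hard_contact_force(gap: float, stiffness: float = 100.0) -> float:
    """Force acts only when the gap is negative, i.e. the bodies are in contact."""
    return -stiffness * gap if gap < 0.0 else 0.0

def smoothed_by_sampling(gap: float, sigma: float = 0.01, n_samples: int = 10_000) -> float:
    """Monte Carlo estimate of the force averaged over Gaussian perturbations of the gap."""
    rng = np.random.default_rng(0)
    noise = rng.normal(0.0, sigma, n_samples)
    return float(np.mean([hard_contact_force(gap + e) for e in noise]))

print(hard_contact_force(0.0))    # 0.0, and not smooth at this point
print(smoothed_by_sampling(0.0))  # about 0.4: smooth, but it took 10,000 samples
```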

Now, by smoothing only the contact-sensitive parts of the model equations, researchers at the Massachusetts Institute of Technology have shown how to achieve the benefits of reinforcement learning without computing large numbers of full trajectories.
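
The analytic alternative can be sketched just as briefly: replacing the hard contact term with a smooth closed-form approximation, such as the softplus relaxation below, delivers a similar smoothing effect from a single evaluation, with no sampling. This is offered only as an illustration of the concept, not as the specific formulation used by the MIT team.

```python
# Sketch of analytic smoothing (not the specific MIT formulation): replace the
# hard contact force max(0, -stiffness * gap) with a softplus relaxation that is
# smooth everywhere and approaches the hard force as the smoothing parameter
# shrinks. One closed-form evaluation gives the value and a usable gradient,
# so no sampling over outcomes is needed.
import numpy as np

def softplus_contact_force(gap: float, stiffness: float = 100.0,
                           smoothing: float = 0.01) -> float:
    # smoothing * log(1 + exp(-stiffness * gap / smoothing)), computed stably
    return smoothing * np.logaddexp(0.0, -stiffness * gap / smoothing)

print(softplus_contact_force(0.0))    # ~0.007: smooth near the contact boundary
print(softplus_contact_force(-0.01))  # ~1.0: close to the hard force 100 * 0.01
```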

While still in its early stages, this method could enable factories to use smaller, mobile robots that can manipulate objects with their entire arms or bodies. In addition, the technique could allow robots sent on space exploration missions to adapt to the environment quickly, using only an onboard computer.

The researchers designed a simplified model that captures the core robot-object interactions and combined it with an algorithm that can rapidly and efficiently search through all the possible decisions the robot could make. This combination cut the computation time to about a minute on a standard laptop.
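
As a rough, hypothetical illustration of why a simplified, smoothed model makes exhaustive search affordable, the toy planner below tries a grid of candidate push forces against a one-dimensional smoothed friction model and keeps the one that best reaches a target position; it is not the algorithm published by the researchers.

```python
# Hypothetical toy planner (not the published algorithm): with a simplified,
# smoothed contact model, exhaustively evaluating every candidate action is
# cheap enough to finish in seconds on a laptop.
import numpy as np

def simulate_push(push_force: float, steps: int = 50, dt: float = 0.02,
                  mass: float = 1.0, friction: float = 0.5) -> float:
    """Roll out a 1-D block pushed against friction; return its final position."""
    x, v = 0.0, 0.0
    for _ in range(steps):
        # tanh is a smooth stand-in for the discontinuous Coulomb friction force
        friction_force = -friction * mass * 9.81 * np.tanh(v / 0.01)
        v += (push_force + friction_force) / mass * dt
        x += v * dt
    return x

target = 0.3  # desired final position in meters (illustrative value)
candidates = np.linspace(0.0, 20.0, 201)  # candidate push forces in newtons
best = min(candidates, key=lambda f: abs(simulate_push(f) - target))
print(f"best push: {best:.1f} N, final position: {simulate_push(best):.3f} m")
```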

The researchers tested their approach in simulations where robotic hands were given tasks like moving a pen, opening a door, or picking up a plate. In each instance, the model-based approach achieved the same performance as other techniques, but in a fraction of the time. The investigators saw similar results when they tested their model on real robotic arms.

In the future, the researchers plan to enhance their technique so that it can plan dynamic motions such as throwing a ball or other object while imparting a high spin.
