New Way Of Training Robotic Arms

Graduate students in CMU's Mechanical Engineering Department hope to revolutionize robotic precision through artificial intelligence models.

To do this, the researchers recreated the simple task of picking up a block in a virtual reality simulation, then used the recorded demonstrations to generate augmented "human-like" examples of the movement to aid the robot's learning.
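The article does not detail how the team multiplied a handful of demonstrations into many training examples, but the general idea of augmenting a recorded trajectory can be sketched roughly as follows. Everything here is illustrative: the function name, the noise model, and the parameters are assumptions for the sketch, not the team's actual pipeline.

```python
import numpy as np

def augment_demo(demo, n_copies=20, noise_scale=0.005, seed=0):
    """Generate slightly perturbed variations of one demonstration.

    demo: (T, 3) array of end-effector positions from a single recorded demo.
    Returns a list of (T, 3) arrays, each a "human-like" noisy copy
    suitable as extra training data for an imitation-learning model.
    """
    rng = np.random.default_rng(seed)
    augmented = []
    for _ in range(n_copies):
        # A small constant offset shifts the whole trajectory, while
        # per-step jitter mimics natural variation in human motion.
        offset = rng.normal(0.0, noise_scale, size=(1, 3))
        jitter = rng.normal(0.0, noise_scale * 0.2, size=demo.shape)
        augmented.append(demo + offset + jitter)
    return augmented

# One "demonstration": a straight-line reach toward a block.
demo = np.linspace([0.0, 0.0, 0.2], [0.4, 0.1, 0.0], num=50)
copies = augment_demo(demo)
print(len(copies), copies[0].shape)
```

In a real system the augmented trajectories would then be fed to a learning architecture as if they were additional human demos, which is how a minute of human input can stretch into a much larger training set.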

“If I want to show you how to do a task, I just have to do it once or twice before you pick up on it,” says Ph.D. candidate Abraham George. “So it’s very promising that now we can get a robot to replicate our actions after just one or two demos. We have created a control structure where it can watch us, extract what it needs to know, and then perform that action.”

According to Techxplore, the team found that the examples shortened the robot's learning time for the task compared with a machine-learning architecture alone. Paired with human data collected through a VR headset simulation, the method has the potential to produce promising results with "under a minute of human input."

George explained the challenge of creating reliable augmented examples for the AI to learn from, so that it could recognize more nuanced variations of the same movement. He likens it to recognizing what a "dog" is in pictures of various breeds after being trained on just one picture of a dog.

Fellow Ph.D. candidate Alison Bartch said that, looking forward, she plans to use similar methods to teach robots to interact with more malleable materials, like clay, and to predict how their actions will shape them. She explains that to better integrate robots into our world, they need to be able to predict how different materials will behave.