
A Learning from Demonstration Framework for Manipulation Tasks

This paper presents a Robot Learning from Demonstration (RLfD) framework for teaching manipulation tasks in an industrial environment: the system learns a task performed by a human demonstrator and reproduces it with a manipulator robot. An RGB-D sensor acquires the scene (the human in action); a skeleton tracking algorithm extracts the useful information from the acquired images (positions and orientations of the skeleton joints); and this information is fed to a motion re-targeting system that remaps the skeleton joints onto the manipulator ones. After the remapping, a model for the robot motion controller is obtained by applying first a Gaussian Mixture Model (GMM) and then Gaussian Mixture Regression (GMR) to the collected data. Two types of controller are modeled: a position controller and a velocity one. The former was presented in [10] with simulation tests, and here it is extended with experiments on a real robot. The latter is proposed for the first time in this work and tested both in simulation and on the real robot. Experiments were performed with a Comau Smart5 SiX manipulator robot and provide a comparison between the two controllers starting from natural human demonstrations.
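The GMM-then-GMR step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a single joint trajectory x(t) encoded jointly with time, fits a GMM with scikit-learn, and then computes the GMR conditional expectation E[x | t] by hand (the demonstration signal and all parameter values are hypothetical).

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Hypothetical demonstration data: time t and one joint trajectory x(t).
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)
x = np.sin(2 * np.pi * t) + 0.05 * rng.standard_normal(t.size)
data = np.column_stack([t, x])          # joint samples (t, x)

# Step 1: a GMM encodes the joint density p(t, x).
gmm = GaussianMixture(n_components=5, covariance_type="full",
                      random_state=0).fit(data)

def gmr(gmm, t_query):
    """Gaussian Mixture Regression: E[x | t] under the fitted GMM."""
    t_query = np.atleast_1d(np.asarray(t_query, dtype=float))
    mu_t, mu_x = gmm.means_[:, 0], gmm.means_[:, 1]
    s_tt = gmm.covariances_[:, 0, 0]
    s_xt = gmm.covariances_[:, 1, 0]
    out = np.empty(t_query.size)
    for i, tq in enumerate(t_query):
        # Responsibility of each component for the input t.
        lik = gmm.weights_ * np.exp(-0.5 * (tq - mu_t) ** 2 / s_tt) \
              / np.sqrt(2 * np.pi * s_tt)
        h = lik / lik.sum()
        # Blend the per-component conditional means.
        out[i] = np.sum(h * (mu_x + s_xt / s_tt * (tq - mu_t)))
    return out

# Step 2: GMR yields a smooth reference trajectory for the controller.
x_hat = gmr(gmm, t)
```

A position controller would track `x_hat` directly; a velocity controller would instead track its numerical derivative (e.g. `np.gradient(x_hat, t)`).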
Research areas:
  • Uncategorized
Type of Publication:
In Proceedings
Book title:
ISR/Robotik 2014; 41st International Symposium on Robotics; Proceedings
Pages:
1 - 7
