
The Mohamed Bin Zayed International Robotics Challenge (MBZIRC) is an international robotics competition held every two years. MBZIRC provides an ambitious and technologically demanding set of challenges and is open to teams from all countries. It aims to inspire the future of robotics through innovative solutions and technological excellence.

MBZIRC 2017 consists of three challenges and a triathlon-type Grand Challenge. IAS-Lab is one of the 46 teams selected from the 143 applications received from 35 countries. IAS-Lab takes part in Challenge 2, which requires an unmanned ground vehicle (UGV) with an onboard manipulator to operate a valve stem placed on a panel. To solve the task, the robot must identify the appropriate tool, pick it up, and use it to rotate the valve stem one full turn.

 

More information can be found on the website http://www.mbzirc.com/#

 

Our team (the DESERT LION) includes:


1. Enrico Pagello. Full Professor. His research interests include the application of Artificial Intelligence to robotics, with regard to multi-robot systems, cognitive robotics, motion planning, and robot programming languages.
Role: Principal Investigator.


2. Emanuele Menegatti. Associate Professor. His research interests are in the field of omnidirectional and distributed vision systems, industrial robot vision, and RGB-D vision algorithms for mobile robots.
Role: 2D and 3D robot perception.


3. Stefano Ghidoni. Assistant Professor. His main research interests are in deep learning and cooperative sensors, body pose estimation, and people re-identification systems in camera networks.
Role: Image processing and visual servoing.


4. Matteo Munaro. Post-doctoral Research Fellow. His research interests focus on people detection, RGB-D sensors and features, human action recognition, and 3D reconstruction for quality inspection.
Role: Team coordination and robot vision.


5. Roberto Bortoletto. Post-doctoral Research Fellow. His research interests include neuromuscular human-machine interfaces, brain machine interfaces, computational intelligence and machine learning.
Role: Navigation, modeling and simulation.

 

To enhance the educational process of our School of Engineering, we have also included the following people, who collaborate in the team activities of the MBZIRC project:

Elisa Tosello, a Ph.D. Student, who is working on motion planning among movable obstacles, and manipulation and grasping tasks;
Marco Carraro, a Ph.D. Student, who is working on localization and mapping for the mobile platform;
Morris Antonello, a Ph.D. Student, who is working on RGB-D data processing;
Enrica Rossi, a Post-Master Fellow, who is focusing on investigating hybrid control law for under-actuated robotic platforms;
Nicola Bagarello and Silvia Gandin, Master Students, who are working on the panel inspection, wrench recognition and grasping tasks in fulfilment of the requirements for their Master theses;
Matteo Tessarotto, Alex Badan, Leonardo Pellegrina, Riccardo Fantinel, and Luca Benvegnù, all Master Students, who are addressing some of the tasks required to control the robot, as part of the Autonomous Robotics master course.

 

 

GENERAL APPROACH

We will take part in Challenge 2 in fully autonomous mode (i.e., without human supervision). Our approach aims to effectively combine 2D and 3D perception.

From an operative point of view, the work has been split into six Work Packages (WPs), as follows (a minimal sketch of how these stages could be chained at runtime is given after the list):

  1. WP-1: Robot building, calibration and modeling
  2. WP-2: Panel localization within the arena
  3. WP-3: Valve stem localization and measuring
  4. WP-4: Selection of the right tool
  5. WP-5: Tool grasping
  6. WP-6: Valve manipulation
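
As a purely illustrative sketch (not our actual code: all function names below are placeholders), the runtime flow corresponding to WP-2 through WP-6 can be thought of as a simple C++ state machine, in which each stage either advances the mission or falls back to an earlier one on failure:

// Illustrative only: a simple state machine chaining the Challenge 2 stages.
// Each placeholder returns true on success; real implementations would wrap
// the corresponding WP modules (perception, navigation, manipulation).
#include <iostream>

enum class Stage { LocalizePanel, LocalizeValve, SelectTool, GraspTool, TurnValve, Done };

bool localizePanel() { std::cout << "WP-2: panel localization\n"; return true; }
bool localizeValve() { std::cout << "WP-3: valve stem localization\n"; return true; }
bool selectTool()    { std::cout << "WP-4: tool selection\n"; return true; }
bool graspTool()     { std::cout << "WP-5: tool grasping\n"; return true; }
bool turnValve()     { std::cout << "WP-6: valve manipulation\n"; return true; }

int main() {
  Stage stage = Stage::LocalizePanel;
  while (stage != Stage::Done) {
    switch (stage) {
      case Stage::LocalizePanel: stage = localizePanel() ? Stage::LocalizeValve : Stage::LocalizePanel; break;
      case Stage::LocalizeValve: stage = localizeValve() ? Stage::SelectTool : Stage::LocalizePanel; break;
      case Stage::SelectTool:    stage = selectTool() ? Stage::GraspTool : Stage::LocalizeValve; break;
      case Stage::GraspTool:     stage = graspTool() ? Stage::TurnValve : Stage::SelectTool; break;
      case Stage::TurnValve:     stage = turnValve() ? Stage::Done : Stage::GraspTool; break;
      case Stage::Done: break;
    }
  }
  return 0;
}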

ADOPTED HARDWARE AND SOFTWARE

We will use an outdoor mobile robotic platform (Fig. 1) based on a Summit XL HL mobile robot (Robotnik Automation S.L.L (http://www.robotnik.eu/)). The mobile base has been modified to mount, on its top, a Universal Robots UR5 manipulator arm (Universal Robots A/S (http://www.universal-robots.com/)), equipped with a Robotiq 3-Finger Adaptive Gripper (Robotiq (http://robotiq.com/)).
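
As an example of how the UR5 can be commanded through ROS, the sketch below sends the end-effector to a Cartesian pose using MoveIt. This is only a hedged illustration: whether the arm is driven through MoveIt, the planning group name "manipulator", and the target pose are all assumptions made for the sake of the example.

// Minimal sketch (assumption: the UR5 is driven through MoveIt): send the
// end-effector to a Cartesian pose. Group name and pose are placeholders.
#include <ros/ros.h>
#include <moveit/move_group_interface/move_group_interface.h>
#include <geometry_msgs/Pose.h>

int main(int argc, char** argv) {
  ros::init(argc, argv, "ur5_pose_demo");
  ros::AsyncSpinner spinner(1);   // MoveIt requires a running spinner
  spinner.start();

  moveit::planning_interface::MoveGroupInterface arm("manipulator");

  geometry_msgs::Pose target;     // pose expressed in the planning frame
  target.orientation.w = 1.0;
  target.position.x = 0.4;
  target.position.y = 0.0;
  target.position.z = 0.3;

  arm.setPoseTarget(target);
  arm.move();                     // plan and execute in one call

  ros::shutdown();
  return 0;
}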

 

Figure 1: (a) 3D model of the robot; (b) the Adaptive Gripper. The snapshot shows the mecanum wheels (indoor omnidirectional configuration, not used for the challenge), the conventional wheels (outdoor skid-steering configuration, used for the challenge), and the 3-Finger Adaptive Gripper used as the end-effector of the manipulator arm.

 

All the aforementioned hardware is fully compatible with the Robot Operating System (ROS (http://www.ros.org/)) open-source middleware.
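
To illustrate how 3D perception data flows through ROS, the following minimal node subscribes to a point cloud topic and converts it to a PCL cloud. The topic name "/camera/depth/points" is an assumption for illustration, not a description of our actual sensor setup.

// Minimal sketch of a ROS node receiving 3D data, e.g. from an RGB-D sensor.
// The topic name is an assumption; the real perception pipeline is more involved.
#include <ros/ros.h>
#include <sensor_msgs/PointCloud2.h>
#include <pcl_conversions/pcl_conversions.h>
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>

void cloudCallback(const sensor_msgs::PointCloud2::ConstPtr& msg) {
  pcl::PointCloud<pcl::PointXYZ> cloud;
  pcl::fromROSMsg(*msg, cloud);   // convert the ROS message to a PCL cloud
  ROS_INFO("Received cloud with %zu points", cloud.size());
}

int main(int argc, char** argv) {
  ros::init(argc, argv, "cloud_listener");
  ros::NodeHandle nh;
  ros::Subscriber sub = nh.subscribe("/camera/depth/points", 1, cloudCallback);
  ros::spin();
  return 0;
}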

Furthermore, we have developed a simulation of Challenge 2 within the Virtual Robot Experimentation Platform (V-REP) simulator (Coppelia Robotics (http://www.coppeliarobotics.com/)).
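
For completeness, one way to drive a V-REP scene programmatically is the simulator's legacy remote API; the connection sketch below assumes the default remote API port 19997 and does not imply that our simulation uses this interface rather than the V-REP ROS plugin.

// Minimal sketch: connecting to a running V-REP instance through its legacy
// remote API (requires building against the remote API files shipped with
// V-REP). The address and port are assumptions (local default setup).
extern "C" {
#include "extApi.h"
}
#include <cstdio>

int main() {
  int clientID = simxStart("127.0.0.1", 19997, true, true, 5000, 5);
  if (clientID == -1) {
    std::printf("Could not connect to V-REP\n");
    return 1;
  }
  simxStartSimulation(clientID, simx_opmode_blocking);   // run the scene
  // ... interact with the simulated Summit XL / UR5 here ...
  simxStopSimulation(clientID, simx_opmode_blocking);
  simxFinish(clientID);                                  // close the connection
  return 0;
}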

All the code is implemented in C++. The Git repository used to manage and share the code among team members is currently hosted on Bitbucket.

VIDEO OF PROGRESS TO DATE

https://youtu.be/zu2Zb_LYlCo

 

SPONSORS:
We wish to thank the companies that are supporting our team with donations of their high-performance equipment:
- NVIDIA for the Jetson TX1 (https://developer.nvidia.com/embedded/buy/jetson-tx1-devkit)
- SICK for the laser range finders (https://www.sick.com/de/en/detection-and-ranging-solutions/2d-laser-scanners/lms1xx/lms151-10100/p/p141840)

 

 



