
Abstract

Retrieving the 3D coordinates of an object in the robot workspace is a fundamental capability for industrial and service applications. It can be achieved by means of a camera mounted on the robot end-effector only if the hand-eye transformation is known. The standard calibration process requires viewing a calibration pattern, e.g. a checkerboard, from several different perspectives. This work extends the standard approach by performing calibration pattern localization and hand-eye calibration in a fully automatic way. A two-phase procedure has been developed and tested in both simulated and real scenarios, demonstrating that the automatic calibration reaches the same performance level as the standard procedure while avoiding any human intervention. As a final contribution, the source code for an automatic and robust calibration is released.
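For orientation only, here is a minimal Python sketch of the classic hand-eye step that such a procedure automates: given the end-effector poses in the robot base frame and the checkerboard poses in the camera frame, the unknown camera-to-gripper transform X in AX = XB can be estimated, for instance with OpenCV's cv2.calibrateHandEye. This is a generic illustration under those assumptions, not the AutoHandEye implementation itself.

# Minimal sketch of the standard hand-eye step (not the AutoHandEye code):
# estimate the camera-to-gripper transform X in AX = XB from N pose pairs.
import cv2
import numpy as np

def estimate_hand_eye(R_gripper2base, t_gripper2base, R_target2cam, t_target2cam):
    """All arguments are lists of N rotations (3x3) and translations (3x1):
    end-effector poses in the base frame and checkerboard poses in the
    camera frame (the latter typically from cv2.solvePnP on detected corners)."""
    R, t = cv2.calibrateHandEye(R_gripper2base, t_gripper2base,
                                R_target2cam, t_target2cam,
                                method=cv2.CALIB_HAND_EYE_TSAI)
    X = np.eye(4)
    X[:3, :3] = R
    X[:3, 3] = t.ravel()
    return X  # 4x4 homogeneous hand-eye transform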

 

Software

Here you can find the ROS project AutoHandEye.
Unzip the following file: AutoHandEye.zip
Our repository with detailed instructions will be available soon.

 

Licence

The software is freely available for academic use.
For questions about the tool, please contact the authors.

 

Reference

@inproceedings{antonello2017autohandeye,
  title={A Fully Automatic Hand-Eye Calibration System},
  author={Antonello, Morris and Gobbi, Andrea and Michieletto, Stefano and Ghidoni, Stefano and Menegatti, Emanuele},
  booktitle={SUBMITTED TO Mobile Robots (ECMR), 2017 European Conference on},
  year={2017},
}

The IASLAB-RGBD Fallen Person Dataset consists of several RGB-D frame sequences containing 15 different people. It has been acquired in two different laboratory environments, Lab A and Lab B. It can be divided into two parts: the former acquired from 3 static Kinect One V2 sensors placed on 3 different pedestals; the latter from a Kinect One V2 mounted on our healthcare robot prototype, see "An Open Source Robotic Platform for Ambient Assisted Living" by M. Carraro, M. Antonello et al. in AIRO at AI*IA 2015.

 

Both parts are briefly described below. They contain the training/test splits used by our approach for detecting fallen people.

STATIC DATASET:

  • Folder "raw": 360 RGB frames and point clouds with the camera calibrations;
  • Folder "segmented_fallen_people": manually segmented point clouds of the fallen people (see the loading sketch after this list);
  • Folder "training_with_cad_room_and_nyudv2": randomly selected positives (70%), 24 point clouds from Lab A and 31 point clouds from the NYU Depth Dataset V2 by Silberman et al.;
  • Folder "test_with_lab_room": randomly selected positives (30%) and 32 point clouds from Lab B.
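As a quick way to inspect the data, the sketch below loads one of the segmented point clouds with Open3D; it assumes the clouds are stored as .pcd files, and the file name used here is only a placeholder.

# Minimal sketch: load and view one segmented fallen-person cloud
# (the file name below is a placeholder, not an actual file in the dataset).
import open3d as o3d

cloud = o3d.io.read_point_cloud("segmented_fallen_people/example.pcd")
print(cloud)  # prints the number of points
o3d.visualization.draw_geometries([cloud])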

DYNAMIC DATASET:

  • Folder "training": 4 ROS bags with 15932 RGB-D frames in total, acquired during 4 robot patrols of Lab A (see the bag-reading sketch after this list);
  • Folder "test": 4 ROS bags with 9391 RGB-D frames in total, acquired during 4 robot patrols of Lab B. This room is more similar to an apartment: spaces are smaller, it is cluttered and it contains a sofa;
  • Folder "maps": 2D maps of the two environments and ground-truth positions of the person centroids in the maps.
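The ROS bags can be iterated with the standard rosbag Python API, as in the sketch below; the bag file name and the Kinect topic names are assumptions and may differ from the released data.

# Minimal sketch: iterate RGB and depth frames in one bag
# (bag file name and topic names are assumptions).
import rosbag
from cv_bridge import CvBridge

bridge = CvBridge()
with rosbag.Bag("training/patrol_1.bag") as bag:
    for topic, msg, stamp in bag.read_messages(
            topics=["/kinect2/qhd/image_color", "/kinect2/qhd/image_depth_rect"]):
        frame = bridge.imgmsg_to_cv2(msg)  # numpy array (RGB or depth)
        print(topic, stamp.to_sec(), frame.shape)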

Download links:

  • StaticDataset
  • DynamicDatasetPart1
  • DynamicDatasetPart2

 

In the figures below, some RGB samples from both environments are reported:

[Four RGB sample frames from Lab A and Lab B]

 

For questions and remarks directly related to the IASLAB-RGBD Fallen Person Dataset, please contact the authors.


IT+Robotics is a spin-off company of the University of Padua. It was created in 2005 by professors working in the field of robotics and by young, brilliant people from the Department of Information Engineering of the University of Padua.

Its mission is technology transfer from university to business. Innovation driven by cutting-edge technology, which only a few years ago was a prerogative of academia alone, is the right path towards growth and will help counter the current crisis.

The IT+Robotics team, a mix of scientists and developers, provides advanced solutions thanks to the experience gained over years of research and industrial development in the field of autonomous robotics: real-time operating systems, artificial vision systems, software agent management, and highly realistic simulation.

IAS-Lab stands for Intelligent Autonomous Systems Laboratory; it is one of the 28 laboratories of the Department of Information Engineering of the University of Padua.

The activity at IAS-Lab concerns several fields of robotics. Our research covers computer vision, humanoid and wheeled robot programming, virtual simulation, biomechanical models based on human biosignals, and the design of wearable robots for rehabilitation.



