
In this lab experience we introduce a humanoid robot, the Robovie-X, and use an Xbox 360 Kinect sensor to record human movements that will be remapped onto, and reproduced by, the robot. On one hand, the Kinect sensor is used for data acquisition; on the other hand, the rviz and Gazebo tools are used to make the robot reproduce the recorded human poses.

The objective is to make the robot crouch and grasp an object in a fixed position in front of it by using the human movements captured by the Kinect sensor.

You have to pay attention to the robot's stability in order to prevent it from falling during the task.


At the end of the experience you should have accomplished the following.

Objectives

  • Make the robot grasp the object without falling down
  • Use a bag file to reproduce a recorded action
  • Understand the basic ROS concepts (tf)
  • Use the gyroscope to improve the robot's performance

Plus

  • Object Oriented (OO) approach
  • Grasp several objects

Challenges

  • Keep the robot stable on an inclined plane
  • Grasp a rigid rectangular box by teleoperating the robot (some people have managed to do this)

What we want

  • Code (BitBucket)
  • Video (YouTube)
  • bag file (BitBucket)
  • Report (PDF using BitBucket) containing
    • The algorithm overview
    • The procedures used to keep the robot stable
    • The main problems encountered with the robot, and why they occur
    • A description of the relationships between the tfs
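To document the tf relationships in the report, one option (a suggestion, not prescribed by the lab text) is to let ROS draw the current tf tree for you:

```shell
# Generate a PDF diagram (frames.pdf) of the live tf tree in the
# current directory; requires a running roscore and nodes that are
# publishing tf.
rosrun tf view_frames
```

The resulting frames.pdf shows every parent/child frame pair and the node broadcasting it, which maps directly onto the requested description.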

Setup and Kinect tests

Download the necessary packages by following these instructions:

cd ~/Workspace/ros/catkin_ws/src
git clone https://<your_bb_username>@bitbucket.org/iaslab-unipd/magic_folder.git
cd magic_folder
git checkout dev_catkin
cd ..
git clone https://<your_bb_username>@bitbucket.org/iaslab-unipd/nite2tf.git
cd nite2tf
git checkout catkin_dev
cd ..
git clone https://<your_bb_username>@bitbucket.org/iaslab-unipd/robovie_x.git 
cd ..
catkin_make --force-cmake -G"Eclipse CDT4 - Unix Makefiles"
roslaunch nite2tf openni_tracker.launch

 

You should get something like:

...
frame id = camera_rgb_optical_frame
rate = 30
file = /home/stefano/projects/hydro/catkin_ws/src/nite2tf/config/tracker.xml
InitFromXml... ok
Find depth generator... ok
[ INFO] [1398727023.676153725]: Number devices connected: 1
[ INFO] [1398727023.676248024]: 1. device on bus 003:09 is a SensorV2 (2ae) from PrimeSense (45e) with serial id 'A00364820345039A'
[ INFO] [1398727023.677115461]: Searching for device with index = 1
[ INFO] [1398727023.684043104]: Opened 'SensorV2' on bus 3:9 with serial number 'A00364820345039A'
Find user generator... ok
Register to user callbacks... ok
Register to calibration start... ok
Register to calibration complete... ok
StartGenerating... ok
[ INFO] [1398727023.968492718]: rgb_frame_id = '/camera_rgb_optical_frame' 
[ INFO] [1398727023.968546920]: depth_frame_id = '/camera_depth_optical_frame' 
[ WARN] [1398727023.971202068]: Camera calibration file /home/stefano/.ros/camera_info/rgb_A00364820345039A.yaml not found.
[ WARN] [1398727023.971249470]: Using default parameters for RGB camera calibration.
[ WARN] [1398727023.971300004]: Camera calibration file /home/stefano/.ros/camera_info/depth_A00364820345039A.yaml not found.
[ WARN] [1398727023.971341276]: Using default parameters for IR camera calibration.
...

Now, if a person stands in front of the camera, a new user is identified. In order to calibrate the system and track the user with respect to the Kinect, the person has to move a little bit.

If everything runs correctly, you will see something like:

...
New User 1
Calibration started for user 1
1398727037 Calibration complete, start tracking user 1

Visualize the tracked person by starting rviz:

rosrun rviz rviz

and set up some basic properties:

  • Fixed frame: /camera_rgb_optical_frame
  • Add: PointCloud2
    • Topic: /camera/rgb/points
    • Color Transformer: RGB8
    • Style: Points
  • Add: TF
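As a quick sanity check of the tracking, you can also query a transform from the command line. The frame name /torso_1 below is an assumption (openni_tracker-style trackers typically publish per-user frames suffixed with the user id); check the TF display in rviz for the actual names in your setup:

```shell
# Print the transform between the camera frame and a tracked
# skeleton frame; /torso_1 is the torso of the first tracked user
# (verify the real frame name in the rviz TF display).
rosrun tf tf_echo /camera_rgb_optical_frame /torso_1
```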

The result should be similar to:

Tracking



Robovie-X

You can also visualize the Robovie-X model and teleoperate it by using:

roslaunch robovie_x_teleoperation robovie_x_teleoperation.launch

You should become familiar with the GUI and the robot model, and explore the control mechanism underlying the movements.

The Robovie-X model is controlled by using the joint_state_publisher script through sensor_msgs/JointState messages:

Header header
string[] name
float64[] position
float64[] velocity
float64[] effort
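Note that the name and position arrays are aligned index by index: position[i] is the commanded angle of the joint called name[i]. As a quick, informal check (not part of the official instructions), a single JointState message can be published from the command line; the joint name below is hypothetical, so replace it with a joint actually defined in the Robovie-X URDF:

```shell
# Publish one JointState message on /joint_states and exit (-1).
# 'r_shoulder_pitch' is a hypothetical joint name: use the names
# listed in the robot's URDF instead.
rostopic pub -1 /joint_states sensor_msgs/JointState \
  "{name: ['r_shoulder_pitch'], position: [0.5], velocity: [], effort: []}"
```

If the model in rviz moves the corresponding joint, your pipeline from JointState messages to the robot model is working.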

The real robot can be activated and controlled by typing:

rosrun robovie_x_controller robovie_controller

rosbag

Finally, the rosbag package is a tool for recording and playing back ROS topics. You should use it to record tf and any other information you may need. More information about this ROS package can be found on the rosbag wiki.
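For example, assuming the default topic names, a recording and playback session could look like this (the bag file name is arbitrary):

```shell
# Record the tf topic (add any other topics you need) into
# grasp.bag; stop recording with Ctrl-C.
rosbag record -O grasp /tf

# Later, play the recorded data back so rviz or your own nodes
# receive it as if it were live.
rosbag play grasp.bag
```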



