
Robotica Autonoma

1. Install Ubuntu 14.04 64bit

Download and install the ISO from here.

2. Install ROS Indigo

2.1 Download

Download the Installation script: ros_install.sh

2.2 Installation

Run the Installation script

cd ~/Downloads
./ros_install.sh

3. System Setup

3.1 Set USB permissions

sudo groupadd lego
sudo usermod -a -G lego `id -u -n`
echo 'SUBSYSTEM=="usb", ATTRS{idVendor}=="0694", GROUP="lego", MODE="0660"
SUBSYSTEM=="usb", ATTRS{idVendor}=="1962", GROUP="lego", MODE="0660"' \
| sudo tee /etc/udev/rules.d/99-lego.rules
sudo restart udev

Now log out and log back in to finish. A more detailed version of these instructions is available at NXT-Python.

3.2 Setup OpenNI + NITE

The OpenNI framework is an open source SDK used for the development of 3D sensing middleware libraries and applications. To set up this library, type:

cd ~/Downloads
wget http://robotics.dei.unipd.it/images/teaching/NITE-Bin-Linux-x64-v1.5.2.21.tar.zip
unzip NITE-Bin-Linux-x64-v1.5.2.21.tar.zip
tar -xjvf NITE-Bin-Linux-x64-v1.5.2.21.tar.bz2
cd NITE-Bin-Dev-Linux-x64-v1.5.2.21
sudo ./uninstall.sh
sudo ./install.sh

3.3 Setup libhid

libhid is a user-space HID access library written in C. We use it to control a connected Robovie-X. To set up this library, install the deb packages you can find here, or type:

sudo apt-get install libusb-dev
cd ~/Downloads
wget --tries=10 http://alioth.debian.org/frs/download.php/1958/libhid-0.2.16.tar.gz
tar xzf libhid-0.2.16.tar.gz
cd libhid-0.2.16
./configure --enable-werror=no
make
sudo make install
sudo ldconfig

3.4 Setup ROS

Now we create a catkin workspace. See this tutorial for a more complete explanation.

source /opt/ros/indigo/setup.bash
mkdir -p ~/workspace/ros/catkin/src
cd ~/workspace/ros/catkin
catkin_make --force-cmake
echo "source ~/workspace/ros/catkin/devel/setup.bash" >> ~/.bashrc

4. Programming tools

4.1 Eclipse

The Eclipse package we're going to install is an IDE for C/C++ developers with Mylyn integration.

Download Eclipse from this link or check for the latest version (Luna) at www.eclipse.org/downloads.

We first need Java (either Oracle or OpenJDK works).

If it is not installed, i.e. if

java -version

raises an error, type:

sudo apt-get purge openjdk*
sudo add-apt-repository ppa:webupd8team/java
sudo apt-get update
sudo apt-get install oracle-java7-installer

Then, you install Eclipse

cd ~/Downloads
tar -xzf eclipse-cpp-*
sudo mv eclipse /opt
sudo chown -R root:root /opt/eclipse

Finally, set up the Ubuntu/Unity integration.

sudo ln -s /opt/eclipse/eclipse /usr/local/bin
echo '[Desktop Entry]
Name=Eclipse
Type=Application
Exec=bash -i -c "eclipse"
Terminal=false
Icon=/opt/eclipse/icon.xpm
Comment=Integrated Development Environment
NoDisplay=false
Categories=Development;IDE' | sudo tee /usr/share/applications/eclipse.desktop

To completely remove Eclipse use the following commands.

sudo rm -rf /opt/eclipse
sudo rm -f /usr/local/bin/eclipse
sudo rm -f /usr/share/applications/eclipse.desktop

During this experience you have to plan the motion of a NAO robot in a 2D simulated environment populated by obstacles. The robot has to walk along a predefined path, avoiding collisions with the objects around it.

You can also try to challenge yourself in different new scenarios: 3D environments, dynamic maps, real robots.

At the end of the experience...

Objectives

  • Make the simulated robot walk around the environment without collision
  • Understand the 2D map, and build a new map

Plus

  • Object Oriented (OO) approach
  • Use a 3D map (see OctoMap)
  • Use a dynamic map
  • Try with a real robot and a real map

Challenges

  • Walk in a populated environment

What we want

  • Code (BitBucket)
  • Video (YouTube or BitBucket)
  • Report (PDF using Moodle) containing
    • A short description of your attempts

Step 1: Download and install NaoQi

It allows you to control the robot.

cd
git clone https://<your_bb_username>@bitbucket.org/iaslab-unipd/naoqi.git
sudo mv naoqi /opt/

Insert the path in your .bashrc:

export NAOQIPATH=/opt/naoqi/naoqi-sdk-1.14.2-linux64
export PYTHONPATH=$PYTHONPATH:/opt/naoqi/pynaoqi-python-2.7-naoqi-1.14-linux64

Step 2: Download and install the humanoid and NAO metapackages

These metapackages contain, respectively, packages for humanoid navigation and for NAO integration in ROS.

 sudo apt-get install ros-hydro-octomap-ros ros-hydro-octomap-msgs ros-hydro-sbpl
cd ~/Workspace/ros/catkin_ws/src
git clone https://github.com/ahornung/humanoid_msgs
git clone https://github.com/ahornung/humanoid_navigation
cd humanoid_navigation
git checkout hydro-devel
cd ..
git clone https://github.com/ros-nao/nao_robot.git

Pay particular attention to the footstep_planner package of this stack: it builds a path for the robot's feet, testing the soles for collision with the ground.
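The collision test that a footstep planner performs for each candidate sole placement can be sketched in a few lines. This is only an illustration with a made-up map encoding and an axis-aligned footprint (the real planner also handles sole rotation):

```python
import math

def sole_collides(x, y, sole_w, sole_h, grid, cell):
    """Check whether a rectangular sole footprint centered at (x, y)
    overlaps any occupied cell of a 2D occupancy grid.
    grid[i][j] == 1 marks an obstacle; cell is the cell size in meters."""
    i0 = int(math.floor((x - sole_w / 2) / cell))
    i1 = int(math.floor((x + sole_w / 2) / cell))
    j0 = int(math.floor((y - sole_h / 2) / cell))
    j1 = int(math.floor((y + sole_h / 2) / cell))
    for i in range(i0, i1 + 1):
        for j in range(j0, j1 + 1):
            if i < 0 or j < 0 or i >= len(grid) or j >= len(grid[0]):
                return True          # outside the map counts as a collision
            if grid[i][j]:
                return True
    return False

# a 4x4 map with one obstacle cell (cell size 0.1 m)
grid = [[0, 0, 0, 0],
        [0, 0, 1, 0],
        [0, 0, 0, 0],
        [0, 0, 0, 0]]
print(sole_collides(0.05, 0.05, 0.08, 0.04, grid, 0.1))  # False: free corner
print(sole_collides(0.15, 0.25, 0.08, 0.04, grid, 0.1))  # True: overlaps obstacle
```

A valid footstep is one whose whole sole area lands on free cells; the planner searches over such placements.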

Step 3: Download and install UNIPD NAO model and motion utilities

This metapackage extends the standard functionality provided by the previously installed packages.

cd ~/Workspace/ros/catkin_ws/src
git clone https://<your_bb_username>@bitbucket.org/iaslab-unipd/nao_unipd.git
cd ..
catkin_make --force-cmake -G"Eclipse CDT4 - Unix Makefiles"

humanoid_stack

Step 4: Make the NAO walk

In order to test the installed packages, you should run the NaoQi driver and then launch a preloaded example program.

In a terminal: 

cd $NAOQIPATH
./naoqi -b 127.0.0.1

 

Then open another terminal:

roslaunch nao_motion footstep_navigation.launch

 

Move your NAO by using first 2D Pose Estimate and then 2D Nav Goal.

Navigation Example

Hint: Octomap

Try to have a look at the Octomap package for ROS and use it to build the 3D map of the environment. This library implements a 3D occupancy grid mapping approach by providing data structures and mapping algorithms.

The 3D map of the environment built using Octomap
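The occupancy-grid idea behind OctoMap can be sketched in a few lines. This is a hypothetical illustration, not the real OctoMap API (which is C++ and octree-based): voxels are stored sparsely and updated with log-odds, so repeated hits and misses accumulate evidence.

```python
class VoxelMap:
    """Minimal sparse 3D occupancy grid in the spirit of OctoMap."""
    def __init__(self, resolution=0.05, hit=0.85, miss=-0.4):
        self.res, self.hit, self.miss = resolution, hit, miss
        self.logodds = {}                      # voxel index -> log-odds

    def key(self, x, y, z):
        r = self.res
        return (int(x // r), int(y // r), int(z // r))

    def update(self, point, occupied=True):
        """Integrate one sensor observation of a 3D point."""
        k = self.key(*point)
        self.logodds[k] = self.logodds.get(k, 0.0) + \
            (self.hit if occupied else self.miss)

    def is_occupied(self, point):
        return self.logodds.get(self.key(*point), 0.0) > 0.0

m = VoxelMap()
for _ in range(3):
    m.update((0.12, 0.0, 0.3))               # three hits on the same voxel
print(m.is_occupied((0.12, 0.0, 0.3)))       # True
print(m.is_occupied((1.0, 1.0, 1.0)))        # False: never observed
```

OctoMap adds the octree compression and ray casting on top of this basic update rule.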

In this lab experience we'll introduce a new sensor, based on computer vision. The robot is given the same task as in experience 2, that is, reaching a goal on a map with moving obstacles, but this time your system will be driven by an omnidirectional vision sensor.

You are asked to develop a module capable of finding the white lines of the map and evaluating the distance of the robot to each of them. Moreover, you should also be able to recover the robot's orientation with respect to such lines: this turns out to be a very useful feature, since it lets you adjust the robot's trajectory very quickly.
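The distance and orientation recovery can be sketched as follows, assuming the vision module already provides two points of the detected line in metric robot-frame coordinates (robot at the origin, x axis pointing forward):

```python
import math

def line_distance_and_angle(p1, p2):
    """Given two points of a detected white line in the robot frame,
    return the perpendicular distance from the robot to the line and
    the angle between the robot's heading and the line direction."""
    (x1, y1), (x2, y2) = p1, p2
    dx, dy = x2 - x1, y2 - y1
    length = math.hypot(dx, dy)
    # perpendicular distance from the origin: |cross(p1, direction)| / |direction|
    dist = abs(dx * y1 - dy * x1) / length
    # orientation of the line with respect to the robot's x axis
    angle = math.atan2(dy, dx)
    return dist, angle

# a line parallel to the y axis, 1 m ahead of the robot
d, a = line_distance_and_angle((1.0, -0.5), (1.0, 0.5))
print(round(d, 3))                 # 1.0
print(round(math.degrees(a), 1))   # 90.0
```

Tracking how `dist` and `angle` change over time is what lets you correct the trajectory quickly.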

After you have developed the requested module, you will see that its output is completely different from what the IR sensor provided. This will push you to modify the robot's behavior in order to exploit the new, higher-level data. This can be either an easy or a difficult task, depending on how you designed your software in experience 2. If your design is modular, you will just need to drop in some function calls; otherwise...

An image from the omnidirectional camera

At the end of the experience...

Objectives

  • Calibrate the omnidirectional camera by means of OCamCalib
  • Exploit the omnidirectional camera: extract information using computer vision algorithms
  • Exploit calibration data to evaluate 3D information
  • Make the robot reach the goal or report if it is not reachable (as in experience 2)

Plus

  • Reuse your code as much as possible
  • Organize camera acquisition into a separate ROS module
  • Detect obstacles and measure their distance to the robot
  • Use the gyroscope to improve the robot's performance

Challenges

  • Develop your first computer vision algorithms
  • Handle calibration

What we want

  • Code (BitBucket)
  • Video (YouTube) -- including a video of the output of your computer vision algorithms
  • Report (PDF using BitBucket) containing at least
    • The algorithm overview
    • Description of all steps of your vision algorithm
    • The role of each node you developed
    • The three nodes interaction
    • The parameter file/launcher description
    • The CMakeLists.txt description

In this lab experience we'll introduce a humanoid robot, the Robovie-X, and the use of an XBOX 360 Kinect sensor to record some human movements that will be remapped onto and reproduced by the robot. On the one hand, a Kinect sensor will be used for data acquisition; on the other hand, the rviz and Gazebo tools are used to reproduce the recorded human poses through the robot.

The objective is to make the robot crouch and grasp an object in a fixed position in front of it by using the human movements captured by the Kinect sensor.

You have to pay attention to the robot's stability in order to avoid falls during the task.

The Robovie-X grasping a bottle

At the end of the experience...

Objectives

  • Make the robot grasp the object without falling down
  • Use a bag file to reproduce a recorded action
  • Understand the basic ROS concepts (tf)
  • Use the gyroscope to improve the robot's performance

Plus

  • Object Oriented (OO) approach
  • Grasp several objects

Challenges

  • Maintain the robot stable on an inclined plane
  • Some people are able to grasp a rigid rectangular box by teleoperating the robot

What we want

  • Code (BitBucket)
  • Video (YouTube)
  • bag file (BitBucket)
  • Report (PDF using BitBucket) containing
    • The algorithm overview
    • The procedures used to keep the robot stable
    • The main problems of the robot and why they arise
    • A description of the relationships between the tfs

Setup and Kinect tests

Download the necessary packages by following the instructions:

cd ~/Workspace/ros/catkin_ws/src
git clone https://<your_bb_username>@bitbucket.org/iaslab-unipd/magic_folder.git
cd magic_folder
git checkout dev_catkin
cd ..
git clone https://<your_bb_username>@bitbucket.org/iaslab-unipd/nite2tf.git
cd nite2tf
git checkout catkin_dev
cd ..
git clone https://<your_bb_username>@bitbucket.org/iaslab-unipd/robovie_x.git 
cd ..
catkin_make --force-cmake -G"Eclipse CDT4 - Unix Makefiles"
roslaunch nite2tf openni_tracker.launch

 

You should get something like:

...
frame id = camera_rgb_optical_frame
rate = 30
file = /home/stefano/projects/hydro/catkin_ws/src/nite2tf/config/tracker.xml
InitFromXml... ok
Find depth generator... ok
[ INFO] [1398727023.676153725]: Number devices connected: 1
[ INFO] [1398727023.676248024]: 1. device on bus 003:09 is a SensorV2 (2ae) from PrimeSense (45e) with serial id 'A00364820345039A'
[ INFO] [1398727023.677115461]: Searching for device with index = 1
[ INFO] [1398727023.684043104]: Opened 'SensorV2' on bus 3:9 with serial number 'A00364820345039A'
Find user generator... ok
Register to user callbacks... ok
Register to calibration start... ok
Register to calibration complete... ok
StartGenerating... ok
[ INFO] [1398727023.968492718]: rgb_frame_id = '/camera_rgb_optical_frame' 
[ INFO] [1398727023.968546920]: depth_frame_id = '/camera_depth_optical_frame' 
[ WARN] [1398727023.971202068]: Camera calibration file /home/stefano/.ros/camera_info/rgb_A00364820345039A.yaml not found.
[ WARN] [1398727023.971249470]: Using default parameters for RGB camera calibration.
[ WARN] [1398727023.971300004]: Camera calibration file /home/stefano/.ros/camera_info/depth_A00364820345039A.yaml not found.
[ WARN] [1398727023.971341276]: Using default parameters for IR camera calibration.
...

Now, if a person stands in front of the camera, a new user is identified. In order to calibrate the system and track the user with respect to the Kinect, the person has to move a little bit.

If everything runs correctly, you will see something like:

...
New User 1
Calibration started for user 1
1398727037 Calibration complete, start tracking user 1

Visualize the tracked person by starting rviz:

rosrun rviz rviz

and set up some basic properties:

  • Fixed frame: /camera_rgb_optical_frame
  • Add: pointCloud2
    • Topic: /camera/rgb/points
    • ColorTransform: RGB8
    • Style: points
  • Add: TF

The result should be similar to:

Tracking



Robovie-X

You can also visualize the Robovie-X model and teleoperate it by using:

roslaunch robovie_x_teleoperation robovie_x_teleoperation.launch

You should become familiar with the GUI and the robot model, and explore the control mechanism underlying the movements.

The Robovie-X model is controlled by the joint_state_publisher script through sensor_msgs/JointState messages:

Header header
string[] name
float64[] position
float64[] velocity
float64[] effort

The real robot can be activated and controlled by typing:

rosrun robovie_x_controller robovie_controller

rosbag

Finally, the rosbag package is a tool for recording and playing back ROS topics. You should use it to record tf and any other information you may need. The main information about this ROS package can be found in the rosbag Wiki.

Given a map composed of n×m squares, make the robot go from a Start square to a Goal square while avoiding obstacles. You can see an example map in Figure 1.

Figure 1: Map example

The robot can only move to the north, south, east and west squares; namely, it cannot move diagonally. In the first scenario the map is fixed, so you know where the obstacles are. In the second scenario the map can change, so you cannot assume you know it in advance. The videos below show different working procedures to reach the goal.
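As a starting point, the known-map scenario can be solved with a plain breadth-first search over the grid under the movement rules above. This is only a sketch: the cell and map encodings are made up, and a real solution must also drive the robot between cells.

```python
from collections import deque

def shortest_path(grid, start, goal):
    """Breadth-first search on an n×m grid with 4-connectivity
    (north/south/east/west only). grid[r][c] == 1 marks an obstacle.
    Returns the list of cells from start to goal, or None if the
    goal is not reachable."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:          # walk back to the start
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
               and not grid[nr][nc] and (nr, nc) not in prev:
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return None                              # report: goal not reachable

grid = [[0, 1, 0],
        [0, 1, 0],
        [0, 0, 0]]
print(shortest_path(grid, (0, 0), (0, 2)))
# [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2), (1, 2), (0, 2)]
```

For the second scenario the same search can be rerun whenever a cell assumed free turns out to be occupied.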

{youtube}Ym7QgUD4huo{/youtube}

{youtube}1bWRLklGF_s{/youtube}

{youtube}g7NIHNkLPb0{/youtube}

{youtube}3OjqXgH8Uz0{/youtube}


At the end of the experience...

Objectives

  • Make the robot reach the goal or report if it is not reachable
  • Use the infrared sensor
  • Use the gyroscope to improve the robot's performance
  • Create more than one node (at least three: Robot, Navigation, Mapping)
  • Read the map parameters from a file (.yaml) or the launcher (as args)
  • Understand the basic ROS concepts (node parameters, use of topics)

Plus

  • Use the third engine to move the ultrasonic sensor
  • Object Oriented (OO) approach
  • Draw the map (e.g. using OpenCV)
  • Assume that the obstacles are not fixed. That is, if the goal seems not to be reachable, try the already tested paths again.

Challenges

  • Solve a map known a priori, given the start and goal cells
  • The map is unknown (you don't know where the obstacles are, you only know the start and goal cells)

What we want

  • Code (BitBucket)
  • Video (YouTube or BitBucket)
  • Report (PDF using Moodle) containing at least
    • The algorithm overview
    • At each step, how you choose the next cell to move to, and why
    • The role of each node you developed
    • The three nodes interaction
    • The parameter file/launcher description
    • The CMakeLists.txt description

Sensors messages

nxt_msgs/Color Message

Header header
float64 intensity
float64 r
float64 g
float64 b

Given two obstacles, make the robot go around the first one, stop 3 cm behind the second one for 10 seconds, and finally come back, as in Figure 1.

Figure 1
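The stop-and-return part of the task can be sketched as a small decision function. This is an illustration only, not the course solution: it assumes the ultrasonic range reading and a stop timer are available, and it leaves out going around the first obstacle.

```python
def next_command(range_m, stopped_for):
    """Decide the robot's next command from the ultrasonic range
    (in meters) and the time already spent stopped (in seconds).
    The 3 cm threshold and the 10 s wait come from the task statement."""
    if range_m > 0.03:
        return "forward"
    if stopped_for < 10.0:
        return "stop"        # hold position 3 cm behind the obstacle
    return "backward"        # 10 seconds elapsed: come back

print(next_command(0.50, 0.0))   # forward
print(next_command(0.03, 4.2))   # stop
print(next_command(0.03, 10.0))  # backward
```

In a real node this function would run inside the subscriber callback for the ultrasonic sensor topic, with the result translated into velocity commands.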

At the end of the experience...

Objectives

  • Make the robot reach the second obstacle
  • Go back to the starting position
  • Understand the basic ROS concepts (launcher, compiler and publisher/subscriber)
  • Use the gyroscope to improve the robot's performance

Plus

  • Use the third engine to move the ultrasonic sensor
  • Object Oriented (OO) approach

Challenges

  • Unknown distance between the two obstacles

What we want

  • Code (BitBucket)
  • Video (YouTube or BitBucket)
  • Report (PDF using BitBucket) containing
    • The algorithm overview
    • The procedures to make the robot go straight and turn
    • The main problems of the robot and why they arise
    • The Launcher description
    • The CMakeLists.txt description

Explanation

Preliminary steps

cd ~/Workspace/ros/catkin_ws/src
git clone https://<your_bb_username>@bitbucket.org/iaslab-unipd/nxt.git 
cd .. 
catkin_make --force-cmake -G"Eclipse CDT4 - Unix Makefiles"
 
roslaunch nxt_unipd nxt_lab.launch
roslaunch nxt_unipd teleop_keyboard.launch

 

Robot configuration

nxt_lab.yaml

 -  type: ultrasonic
    frame_id: ultrasonic_link
    name: ultrasonic_sensor
    port: PORT_4
    spread_angle: 0.2
    min_range: 0.01
    max_range: 2.5
    desired_frequency: 5.0

Program launcher

nxt_lab.launch

  <group ns="base_parameters">
    <param name="r_wheel_joint" value="r_wheel_joint"/>
    <param name="l_wheel_joint" value="l_wheel_joint"/>
    <param name="wheel_radius" value="0.022"/>
    <param name="wheel_basis" value="0.055"/>
    <param name="vel_to_eff" value="0.2"/>
    <param name="l_vel_to_eff" value="0.1"/>
    <param name="r_vel_to_eff" value="0.1"/>
  </group>

Robot controller

advanced_control.py

 

 

\[v_{trans}^{est} \left[i+1\right] = \frac{1}{2} \left( v_{trans}^{est}\left[i\right] + \frac{1}{2} \left( v_{rot}^{reg}\left[i,j_{wheel}^{l} \right] + v_{rot}^{reg}\left[i,j_{wheel}^{r} \right] \right) r_{wheel} \right)\]

\[v_{rot}^{est} \left[i+1\right] = \frac{1}{2} \left( v_{rot}^{est}\left[i\right] + \frac{1}{2} \left( v_{rot}^{reg}\left[i,j_{wheel}^{l} \right] - v_{rot}^{reg}\left[i,j_{wheel}^{r} \right] \right) \frac{r_{wheel}}{b_{wheel}} \right)\]

where:

\(v_{trans}^{est} \left[i\right]\) is the estimated translational velocity at the instant \(i\)

\(v_{rot}^{est} \left[i\right]\) is the estimated rotational velocity at the instant \(i\)

\(v_{rot}^{reg}\left[i,j_{wheel}^{l} \right]\) is the registered rotational velocity for the joint of the left wheel at the instant \(i\)

\(v_{rot}^{reg}\left[i,j_{wheel}^{r} \right]\) is the registered rotational velocity for the joint of the right wheel at the instant \(i\)

\(r_{wheel}\) is the wheel radius

\(b_{wheel}\) is the wheel basis

 

 

\[v_{trans}^{cmd} \left[i+1\right] = v_{trans}^{des}\left[i\right] + k_{trans} \left( v_{trans}^{des}\left[i \right] - v_{trans}^{est}\left[i +1 \right] \right)\]

\[v_{rot}^{cmd} \left[i+1\right] = v_{rot}^{des}\left[i\right] + k_{rot} \left( v_{rot}^{des}\left[i \right] - v_{rot}^{est}\left[i \right] \right)\]

where:

\(v_{trans}^{cmd} \left[i\right]\) is the translational velocity applied to the joint  at the instant \(i\)

\(v_{rot}^{cmd} \left[i\right]\) is the rotational velocity applied to the joint at the instant \(i\)

\(v_{trans}^{des}\left[i \right]\) is the desired translational velocity for the joint at the instant \(i\)

\(v_{rot}^{des}\left[i \right]\) is the desired rotational velocity for the joint at the instant \(i\)

\(k_{trans}\) is the translational constant

\(k_{rot}\) is the rotational constant

 

 

\[F \left[i+1\right] = k_v^F \left( v_{trans}^{cmd}\left[i + 1\right] \frac{1}{r_{wheel}} - v_{rot}^{cmd}\left[i +1 \right] \frac{b_{wheel}}{r_{wheel}} \right)\]

where:

\(F \left[i\right]\) is the effort applied to the joint at the instant \(i\)

\(k_v^F\) is the constant to transform velocity to effort
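Put together, one iteration of the estimate/command/effort chain above reads as follows. This is a sketch: the gains and wheel velocities are illustrative, not the values used in advanced_control.py, while the wheel radius and basis come from nxt_lab.yaml.

```python
def control_step(est_t, est_r, w_l, w_r, des_t, des_r,
                 r_wheel=0.022, b_wheel=0.055,
                 k_trans=0.5, k_rot=0.5, k_v_F=0.2):
    """One control iteration. w_l and w_r are the registered rotational
    velocities of the left and right wheel joints at instant i."""
    # velocity estimates at instant i+1
    new_est_t = 0.5 * (est_t + 0.5 * (w_l + w_r) * r_wheel)
    new_est_r = 0.5 * (est_r + 0.5 * (w_l - w_r) * r_wheel / b_wheel)
    # commanded velocities: proportional correction on the tracking error
    cmd_t = des_t + k_trans * (des_t - new_est_t)
    cmd_r = des_r + k_rot * (des_r - est_r)
    # effort applied to the wheel joint
    F = k_v_F * (cmd_t / r_wheel - cmd_r * b_wheel / r_wheel)
    return new_est_t, new_est_r, cmd_t, cmd_r, F

# both wheels at 10 rad/s: pure translation, no rotation
est_t, est_r, cmd_t, cmd_r, F = control_step(
    est_t=0.0, est_r=0.0, w_l=10.0, w_r=10.0, des_t=0.11, des_r=0.0)
print(round(est_t, 3))  # 0.11  (0.5 * 0.5 * 20 rad/s * 0.022 m)
print(round(est_r, 3))  # 0.0   (no differential wheel velocity)
```

Note how the averaging in the estimate acts as a simple low-pass filter on the measured wheel velocities.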

Robot teleoperation

nxt_key.cpp

    switch(c)
    {
      case KEYCODE_L:
        ROS_DEBUG("LEFT");
        angular_ = 2.0;
        break;
      case KEYCODE_R:
        ROS_DEBUG("RIGHT");
        angular_ = -2.0;
        break;
      case KEYCODE_U:
        ROS_DEBUG("UP");
        linear_ = 0.15;
        break;
      case KEYCODE_D:
        ROS_DEBUG("DOWN");
        linear_ = -0.15;
        break;
    }

 

teleop_keyboard.launch

<node pkg="nxt_unipd" type="nxt_teleop_key" name="nxt_teleop_key"  output="screen">
  <param name="scale_linear" value="1" type="double"/>
  <param name="scale_angular" value="1" type="double"/>
</node>

Sensors messages

nxt_msgs/Range Message

Header header
float64 range
float64 range_min
float64 range_max
float64 spread_angle

1. Install Ubuntu 13.04 64bit

Download and install the ISO from here.

2. Install ROS Hydro

2.1 Setup Sources

sudo sh -c 'echo "deb http://packages.ros.org/ros/ubuntu raring main" \
> /etc/apt/sources.list.d/ros-latest.list'
wget http://packages.ros.org/ros.key -O - | sudo apt-key add -

2.2 Installation

Make sure everything is up to date.

sudo apt-get update
sudo apt-get upgrade

Install all needed packages.

sudo apt-get install \
`wget http://robotics.dei.unipd.it/images/teaching/hydro.list -O - | cat -`

Initialize rosdep to easily install system dependencies.

sudo rosdep init
rosdep update

3. System Setup

3.1 Set USB permissions

sudo groupadd lego
sudo usermod -a -G lego `id -u -n`
echo 'SUBSYSTEM=="usb", ATTRS{idVendor}=="0694", GROUP="lego", MODE="0660"
SUBSYSTEM=="usb", ATTRS{idVendor}=="1962", GROUP="lego", MODE="0660"' \
| sudo tee /etc/udev/rules.d/99-lego.rules
sudo restart udev

Now log out and log back in to finish. A more detailed version of these instructions is available at NXT-Python.

3.2 Setup OpenNI + NITE

The OpenNI framework is an open source SDK used for the development of 3D sensing middleware libraries and applications. To set up this library, type:

cd ~/Downloads
wget http://www.openni.org/wp-content/uploads/2012/12/NITE-Bin-Linux-x64-v1.5.2.21.tar.zip
unzip NITE-Bin-Linux-x64-v1.5.2.21.tar.zip
tar -xjvf NITE-Bin-Linux-x64-v1.5.2.21.tar.bz2
cd NITE-Bin-Dev-Linux-x64-v1.5.2.21
sudo ./uninstall.sh
sudo ./install.sh

3.3 Setup libhid

libhid is a user-space HID access library written in C. We use it to control a connected Robovie-X. To set up this library, install the deb packages you can find here, or type:

sudo apt-get install libusb-dev
cd ~/Downloads
wget --tries=10 http://alioth.debian.org/frs/download.php/1958/libhid-0.2.16.tar.gz
tar xzf libhid-0.2.16.tar.gz
cd libhid-0.2.16
./configure --enable-werror=no
make
sudo make install
sudo ldconfig

3.4 Setup ROS

Now we create two separate workspaces to use both catkin and rosbuild. See this tutorial for a more complete explanation.

source /opt/ros/hydro/setup.bash
mkdir -p ~/Workspace/ros/catkin_ws/src
cd ~/Workspace/ros/catkin_ws
catkin_make --force-cmake
mkdir -p ~/Workspace/ros/rosbuild_ws
rosws init ~/Workspace/ros/rosbuild_ws ~/Workspace/ros/catkin_ws/devel
echo "source ~/Workspace/ros/rosbuild_ws/setup.bash" >> ~/.bashrc

4. Programming tools

4.1 Eclipse

The Eclipse package we're going to install is an IDE for C/C++ developers with Mylyn integration.

Download Eclipse from this link or check for the latest version (currently Kepler) at www.eclipse.org/downloads.

We first need Java (either Oracle or OpenJDK works).

If it is not installed, i.e. if

java -version

raises an error, type:

sudo apt-get purge openjdk*
sudo add-apt-repository ppa:webupd8team/java
sudo apt-get update
sudo apt-get install oracle-java7-installer

if you have this problem:

Download done.
Removing outdated cached downloads...
sha256sum mismatch jdk-7u51-linux-x64.tar.gz
Oracle JDK 7 is NOT installed.
dpkg: error processing oracle-java7-installer (--configure):
 subprocess installed post-installation script returned error exit status 1
Setting up gsfonts-x11 (0.22) ...
Errors were encountered while processing:
 oracle-java7-installer
E: Sub-process /usr/bin/dpkg returned an error code (1)

you can solve it with the following procedure:

  1. Download the JDK here.
  2. Then go to /var/cache/oracle-jdk7-installer/
  3. In that dir remove jdk-7u51-linux-x64.tar.gz and paste the version downloaded from Oracle website.
  4. Try sudo apt-get install oracle-java7-installer again; this time it should work.

Then, you install Eclipse

cd ~/Downloads
tar -xzf eclipse-cpp-*
sudo mv eclipse /opt
sudo chown -R root:root /opt/eclipse

Finally, set up the Ubuntu/Unity integration.

sudo ln -s /opt/eclipse/eclipse /usr/local/bin
echo '[Desktop Entry]
Name=Eclipse
Type=Application
Exec=bash -i -c "eclipse"
Terminal=false
Icon=/opt/eclipse/icon.xpm
Comment=Integrated Development Environment
NoDisplay=false
Categories=Development;IDE' | sudo tee /usr/share/applications/eclipse.desktop

To completely remove Eclipse use the following commands.

sudo rm -rf /opt/eclipse
sudo rm -f /usr/local/bin/eclipse
sudo rm -f /usr/share/applications/eclipse.desktop

During this experience you have to plan the motion of a NAO robot in a 2D simulated environment populated by obstacles. The robot has to walk along a predefined path, avoiding collisions with the objects around it.

You can also try to challenge yourself in different new scenarios: 3D environments, dynamic maps, real robots.

At the end of the experience...

Objectives

  • Make the simulated robot walk around the environment without collision
  • Understand the 2D map, and build a new map

Plus

  • Object Oriented (OO) approach
  • Use a 3D map (see OctoMap)
  • Use a dynamic map
  • Try with a real robot and a real map

Challenges

  • Walk in a populated environment

What we want

  • Code (BitBucket)
  • Video (YouTube or BitBucket)
  • Report (PDF using Moodle) containing
    • A short description of your attempts

Step 1: Download and install the humanoid_stack

This stack contains packages for humanoid (biped) navigation:

cd ~/Workspace/Groovy/rosbuild_ws
git clone git@bitbucket.org:iaslab-unipd/humanoid_stacks.git
rosws set humanoid_stacks

Pay particular attention to the footstep_planner package of this stack: it builds a path for the robot's feet, testing the soles for collision with the ground.

Restart your terminal.

Step 2: Download and install SBPL

According to the upper and lower bounds of its DOFs, a robot can reach certain positions from its current state. Combining these motion primitives, it is possible to build a path. This is the aim of SBPL (Search-Based Planning Library).
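The primitive-chaining idea can be sketched as follows. The primitives and the plain breadth-first search are illustrative only; SBPL's actual primitive sets, heuristics and cost handling are much richer.

```python
import math
from collections import deque

# each primitive: (forward displacement in m, heading change in rad)
PRIMITIVES = [(0.1, 0.0), (0.0, math.pi / 2), (0.0, -math.pi / 2)]

def apply(state, prim):
    """Apply one motion primitive to a (x, y, heading) state,
    rounding so that equivalent states compare equal."""
    x, y, th = state
    d, dth = prim
    return (round(x + d * math.cos(th), 3),
            round(y + d * math.sin(th), 3),
            round((th + dth) % (2 * math.pi), 3))

def plan(start, goal, max_depth=6):
    """Chain primitives breadth-first until the goal state is reached."""
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        state, prims = queue.popleft()
        if state == goal:
            return prims
        if len(prims) == max_depth:
            continue
        for p in PRIMITIVES:
            nxt = apply(state, p)
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, prims + [p]))
    return None

# drive 10 cm forward, turn left, drive 10 cm: three primitives
path = plan((0.0, 0.0, 0.0), (0.1, 0.1, round(math.pi / 2, 3)))
print(len(path))  # 3
```

Replacing the breadth-first queue with a heuristic-driven priority queue (A*, ARA*) is what makes this approach practical on real robots.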

sudo apt-get install ros-groovy-sbpl
rosmake footstep_planner

humanoid_stack

Step 3: Download and install NAO robot

cd ~/Workspace/Groovy/rosbuild_ws
git clone git@bitbucket.org:iaslab-unipd/nao.git
rosws set nao

The package contains common tools for the Nao robot to run on the PC: besides the URDF of the robot, it provides joint state publishing, odometry, and so on.

Remember to restart your terminal and to rosmake the package:

rosmake nao_example

Step 4: Download and install NaoQi

It allows you to control the robot.

cd
git clone git@bitbucket.org:iaslab-unipd/naoqi.git
sudo mv naoqi /opt/

Insert the path in your .bashrc:

export NAOQIPATH=/opt/naoqi/naoqi-sdk-1.14.2-linux64
export PYTHONPATH=$PYTHONPATH:/opt/naoqi/pynaoqi-python-2.7-naoqi-1.14-linux64

Step 5: Make the NAO walk

In order to test the packages you installed, you have to run the NaoQi driver and then launch a preloaded example program.

In a terminal: 

cd $NAOQIPATH
./naoqi -b 127.0.0.1

 

Then open another terminal:

roslaunch nao_example footstep_navigation.launch

 

Move your NAO using Navigation Goal and Navigation Pose.

Step 6: Download and install Octomap

Try to have a look at the Octomap package for ROS and use it to build the 3D map of the environment. This library implements a 3D occupancy grid mapping approach, providing data structures and mapping algorithms.

The 3D map of the environment built using Octomap

In this lab experience we'll introduce a humanoid robot, the Robovie-X, and the use of an XBOX 360 Kinect sensor to record some human movements that will be remapped onto and reproduced by the robot. On the one hand, we'll use a Kinect sensor for data acquisition; on the other hand, we'll use the rviz and Gazebo tools to reproduce the recorded human poses through the robot.

Your objective is to make the robot crouch and grasp an object in a fixed position in front of it, using your movements captured by the Kinect sensor.

You have to pay attention to the robot's stability in order to avoid falls during the task.

The Robovie-X grasping a bottle

At the end of the experience...

Objectives

  • Make the robot grasp the object without falling down
  • Understand the basic ROS concepts (tf)

Plus

  • Object Oriented (OO) approach
  • Use several objects

Challenges

  • Grasp a rigid rectangular box

What we want

  • Code (BitBucket)
  • Video (YouTube or BitBucket)
  • Report (PDF using Moodle) containing
    • The algorithm overview
    • The procedures used to keep the robot stable
    • The main problems of the robot and why they arise
    • A description of the relationships between the tfs

Step 1: Setup and test Kinect and skeleton tracker

First of all, we have to download some packages.

cd ~/Workspace/Groovy/rosbuild_ws
git clone git@bitbucket.org:iaslab-unipd/action_model.git
rosws set action_model
cd ~/Workspace/Groovy/catkin_ws/src
git clone git@bitbucket.org:iaslab-unipd/magic_folder.git
cd magic_folder
git checkout dev_catkin
cd ../..
catkin_make

Action Model won't work because of a bug in the NITE libraries, so we need to download the new ones (64-bit version) from the website.

cd ~/Workspace
wget http://www.openni.org/wp-content/uploads/2012/12/NITE-Bin-Linux-x64-v1.5.2.21.tar.zip
unzip NITE-Bin-Linux-x64-v1.5.2.21.tar.zip
tar -xjvf NITE-Bin-Linux-x64-v1.5.2.21.tar.bz2
cd NITE-Bin-Dev-Linux-x64-v1.5.2.21
sudo ./uninstall.sh
sudo ./install.sh

If you have Ubuntu 32-bit, try substituting "x64" with "x86".

You can now compile and run the skeletal tracker.

source ~/.bashrc
rosmake action_model
roslaunch action_model openni_tracker.launch

If everything is correct, you should get something like the following.

...
frame id = openni_rgb_optical_frame
rate = 30
file = /opt/ros/electric/stacks/unipd-ros-pkg/ActionModel/config/tracker.xml
InitFromXml... ok
Find depth generator... ok
Find user generator... ok
StartGenerating... ok
...

Now, if a person stands in front of the camera, a new user is identified. In order to calibrate the system and track the user with respect to the Kinect, the person has to move a little bit.

If everything runs correctly, you will see:

...
New User 1
Calibration started for user 1
Calibration complete, start tracking user 1

Visualize the tracked person by starting rviz:

rosrun rviz rviz

and set up some basic properties:

  • Fixed frame: /openni_camera
  • Add: pointCloud2
    • Topic: /camera/rgb/points
    • ColorTransform: RGB8
    • Style: points
  • Add: TF

You should see something like this:

Tracking

This package provides a set of tools for recording from and playing back to ROS topics. You will use it to record tf transforms.


 

DONE!!

 


Step 2: Remapping of the human joints into the correspondent ones of the Robovie-X

tf is a package that lets the user keep track of multiple coordinate frames over time. tf maintains the relationship between coordinate frames in a tree structure buffered in time, and lets the user transform points, vectors, etc. between any two coordinate frames at any desired point in time, and allows you to ask questions like:

  • Where was the head frame relative to the world frame, 5 seconds ago?
  • What is the pose of the object in my gripper relative to my base?
  • What is the current pose of the base frame in the map frame?

You can connect your movements to the Robovie-X ones using a tf listener.
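What a tf listener computes for you is essentially transform composition along the tf tree. A plain 2D stand-in (no running ROS master needed) makes the idea concrete; the frames and numbers below are made up for illustration:

```python
import math

def compose(t_ab, t_bc):
    """Compose two 2D transforms given as (x, y, theta): the pose of
    frame C in frame A, given C expressed in B and B expressed in A.
    This chaining is what tf performs along its tree when you call a
    listener's lookupTransform."""
    xa, ya, tha = t_ab
    xb, yb, thb = t_bc
    return (xa + xb * math.cos(tha) - yb * math.sin(tha),
            ya + xb * math.sin(tha) + yb * math.cos(tha),
            tha + thb)

# camera is 2 m in front of the world origin, rotated 90 degrees left;
# the tracked head is 1 m in front of the camera
head_in_world = compose((2.0, 0.0, math.pi / 2), (1.0, 0.0, 0.0))
print([round(v, 3) for v in head_in_world])  # [2.0, 1.0, 1.571]
```

tf additionally buffers these transforms in time, which is what lets you ask where a frame was a few seconds ago.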

Setup your workspace environment by downloading and installing the robovie_x package.

cd ~/Workspace/Groovy/rosbuild_ws
git clone git@bitbucket.org:iaslab-unipd/robovie_x.git
cd robovie_x
git checkout dev_groovy
mv robovie_x_teleoperation ~/Workspace/Groovy/catkin_ws/src
mv robovie_x_model ~/Workspace/Groovy/catkin_ws/src
cd ~/Workspace/Groovy/catkin_ws
catkin_make

Then you can launch your packages.

roslaunch robovie_x_teleoperation robovie_x_teleoperation.launch

You should become familiar with the GUI and the robot model, and explore the control mechanism underlying the movements.

You have to control the Robovie-X model in RViz by using the joint_state_publisher plugin through sensor_msgs/JointState messages:

Header header
string[] name
float64[] position
float64[] velocity
float64[] effort

A node for publishing joint angles, either through a GUI or with default values. This package publishes sensor_msgs/JointState messages for a robot: it reads the robot_description parameter, finds all of the non-fixed joints, and publishes a JointState message with all those joints defined.

The joint angles published by joint_state_publisher will be received by the Robovie-X controller, which moves each non-fixed joint.
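The remapping step your node has to perform can be sketched as follows. The joint names and limits are made up for illustration; a real node would fill an actual sensor_msgs/JointState message and publish it with rospy.

```python
def remap_joints(human_angles, limits):
    """Remap tracked human joint angles onto robot joints, clamping
    each angle to the robot's joint limits. Returns a dict shaped
    like the name/position fields of a sensor_msgs/JointState."""
    msg = {"name": [], "position": []}
    for joint, angle in human_angles.items():
        lo, hi = limits[joint]
        msg["name"].append(joint)
        msg["position"].append(max(lo, min(hi, angle)))  # clamp to limits
    return msg

# hypothetical tracked angles (rad) and robot joint limits
human = {"l_shoulder_pitch": 2.4, "r_elbow_roll": -0.3}
limits = {"l_shoulder_pitch": (-1.5, 1.5), "r_elbow_roll": (-1.6, 0.0)}
print(remap_joints(human, limits))
# {'name': ['l_shoulder_pitch', 'r_elbow_roll'], 'position': [1.5, -0.3]}
```

Clamping matters here: the human arm can reach configurations the Robovie-X cannot, and sending out-of-range angles is one way the robot ends up falling.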

DONE!!!



