Robotics Class 2011/Assignment 3

Write a ROS node that subscribes to the "/face_coords" topic and uses the information provided in that topic to move the robot base by publishing messages on the "/cmd_vel" topic. The goal of the assignment is to build a face tracker that attempts to move the robot to keep faces centered in the camera frame.
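
A minimal sketch of one possible controller follows. It is not the required solution: it assumes /face_coords carries a geometry_msgs/Point whose x and y fields give the face center in pixels (check the actual message type with rostopic info /face_coords once your detector is running), it assumes your package is named face_tracker, and the image width and gain are placeholders you will need to tune. A simple proportional controller on the horizontal offset is enough to get started.

#!/usr/bin/env python
# Hypothetical face-tracking node: turns the base to keep the detected
# face horizontally centered in the camera frame.
import roslib; roslib.load_manifest('face_tracker')  # 'face_tracker' is a placeholder package name
import rospy
from geometry_msgs.msg import Point, Twist

IMAGE_WIDTH = 640.0  # assumed camera width in pixels; verify against your stream
GAIN = 1.0           # proportional gain for turning; tune experimentally

class FaceTracker(object):
    def __init__(self):
        self.pub = rospy.Publisher('/cmd_vel', Twist)
        rospy.Subscriber('/face_coords', Point, self.face_cb)

    def face_cb(self, msg):
        # Normalized horizontal offset of the face from the image center,
        # in the range [-0.5, 0.5]. This sketch only reacts when a face
        # message arrives; it does nothing when no face is detected.
        error = (msg.x - IMAGE_WIDTH / 2.0) / IMAGE_WIDTH
        cmd = Twist()
        cmd.angular.z = -GAIN * error  # sign depends on camera/axis conventions
        cmd.linear.x = 0.1             # creep forward; keep well under 0.2 m/s
        self.pub.publish(cmd)

if __name__ == '__main__':
    rospy.init_node('face_tracker')
    FaceTracker()
    rospy.spin()

Make the file executable and start it with rosrun once the simulated robot and your face detector are both up. If the robot turns away from faces rather than toward them, flip the sign on the angular term.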

First, you must install gazebo and be able to run the empty-world launch script. The command to start gazebo is:

roslaunch gazebo_worlds empty_world.launch

It may help to follow the ROS installation instructions for gazebo in order to get it installed and running.

The next step is to make sure you have the gazebo_erratic_plugins. These are extensions to gazebo that support differential-drive robots (like the iRobot Create) with two driven wheels. If you installed ROS using Synaptic, you can search Synaptic for "erratic" and you should see a package named ros-diamondback-erratic-robot; install that package through Synaptic. If you compiled ROS from source, you will want to check out, rosdep install, and rosmake the erratic_robot package.
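
If you prefer the command line to Synaptic, installing the same package should amount to the following, assuming the standard ROS Debian repositories are already configured:

sudo apt-get install ros-diamondback-erratic-robot

For the from-source route, once erratic_robot is checked out into your package path:

rosdep install erratic_robot
rosmake erratic_robot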

Once that is complete, you can proceed to check out the HacDC robot simulation:

cd to where you store your downloaded ROS packages, then:

svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/irobotron_description
rosmake irobotron_description

Once that completes, you can start the robot in the simulation (make sure you have done the roslaunch gazebo_worlds... step above before doing this):

roslaunch irobotron_description create_mobile_base.launch

You should then see the robot get inserted into the world. At this point the robot is up and running in the simulation, and you can run rostopic list to see the variety of message topics it provides. The simulation includes a camera with characteristics similar to the camera on the actual robot, and you can subscribe to its image stream the same way you have done in the past:

rosrun image_view image_view image:=/stereo/left/image_rect
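
Before writing your controller, it is also worth confirming the drive topic; rostopic info should report geometry_msgs/Twist as the message type for the velocity topic:

rostopic info /cmd_vel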

Of course this world is not terribly interesting as it is completely empty. You can add some excitement by using the floating_faces package available in the HacDC ROS repository:

cd to where you store your downloaded ROS packages, then:

svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/floating_faces
rosmake floating_faces

Once that is built, you can then launch the floating faces into the world:

roslaunch floating_faces faces.launch

Once the faces are in the simulation, it is useful to be able to drive the robot around manually before trying to control it from your own node. You can do that by checking out and building the teleop_twist_keyboard ROS package. Once it is built, start it by typing:

rosrun teleop_twist_keyboard teleop_twist_keyboard.py

Follow the on-screen instructions to drive the robot around the virtual world. It is advisable to slow the commands down by pressing the "z" key a few times until the linear speed is around 0.2, which has been found to be a reasonable maximum linear speed for this particular robot. If you are subscribed to the /stereo/left/image_rect image stream, you should be able to drive around and see the faces.
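
As a sanity check before wiring up your own node, you can also bypass the teleop and publish a single velocity command straight from the shell; depending on the drive plugin's command timeout, the robot should nudge forward briefly:

rostopic pub -1 /cmd_vel geometry_msgs/Twist -- '[0.1, 0.0, 0.0]' '[0.0, 0.0, 0.0]'

While driving around with the teleop, rostopic echo /face_coords is a convenient way to confirm that your face detector is publishing coordinates whenever a face enters the frame.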