Robotics Class 2011/Assignment 4
From HacDC Wiki
Revision as of 21:18, 16 July 2011
The rumor is that something bad is going on in the secret underground HacDC antiquities warehouse. The security guards have written a memo describing recent sightings of artifacts in the warehouse "moving by themselves". They have come to us, the robotics experts, to see if we can't provide them with a robot that can make the rounds in the warehouse keeping an eye out for mysterious paranormal activities.
The last assignment illustrated the commands required to start up gazebo and also to check out the irobotron_description ROS package from the HacDC repository. However, we will make a slight adjustment to the command line for starting up gazebo. We do this to start up in a simulated world that happens to have the same rooms and structure of the secret underground HacDC antiquities warehouse. You can start gazebo in the following way:
roslaunch gazebo_worlds simple_office2.launch
Once gazebo is up and running, you next need to insert the robot into the simulation. If you have not checked out the irobotron_description ROS package, you can do so with the following commands:
cd to where you store your downloaded ROS packages
svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/irobotron_description
rosmake irobotron_description
However, if you have already checked it out, follow these modified instructions to update the package as recent changes have been made to it:
roscd irobotron_description
svn update
rosmake irobotron_description
Once you have successfully built irobotron_description with rosmake, you can then insert the robot into the simulation:
roslaunch irobotron_description create_mobile_base_in_office2.launch
This command will insert the robot into the starting location in the map. It may be hard to see the robot, but if you use the mouse to navigate in the gazebo window, you will find the robot 10 meters forward and one meter left of the initial gazebo starting coordinates.
Once you have the robot in the simulation, you need to add the museum artifacts into the simulation. You can insert them into the simulation with the following command:
roslaunch floating_faces faces_in_office2.launch
These artifacts are in the same locations as the real artifacts in the secret HacDC antiquities warehouse.
Next, you can start the ROS navigation stack to allow the robot to navigate autonomously throughout the warehouse. A launch file has been written that customizes the ROS navigation stack for our particular robot. You can look at the launch file, named move_base.launch, for more info. To start the navigation stack, type:
roslaunch irobotron_description move_base.launch
You may see a few warnings when the navigation stack starts, but if you run rostopic list, you should see numerous topics in the "/move_base/" namespace. These are topics related to the navigation stack.
Once the navigation stack is up and running, you can start rviz to start seeing what the robot sees. Rviz is a powerful visualization tool useful for looking at TF frames, navigation stack topics and all kinds of other stuff. Anyway, you can start it by typing:
rosrun rviz rviz
Once rviz is up and running, follow this tutorial to set it up to visualize all kinds of navigation stack related topics.
After you are comfortable with rviz, you can obtain and start the simple prototype museum guard example that is provided here:
cd to where you store your downloaded ROS packages
svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/museum_guard
rosmake museum_guard
Once museum_guard is made, you can start it up by typing:
rosrun museum_guard smach_guard.py
One of the powerful features of smach is the smach_viewer utility that you can start as follows:
rosrun smach_viewer smach_viewer.py
You should see the two states illustrated in class, namely "GOTO_HALL_1" and "INSPECT_ARTIFACT_1" in the Smach Viewer window.
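Conceptually, the prototype guard is just a loop through those two states: drive to the hall, then watch the artifact. The following plain-Python sketch mimics that structure so you can see the control flow without ROS or smach installed; the state names come from the example above, but the injected callables and outcome strings are stand-ins for illustration, not the actual smach API:

```python
class GuardStateMachine:
    """Minimal stand-in for the smach_guard two-state patrol loop."""

    def __init__(self, navigate, inspect):
        # navigate() drives to the next waypoint; inspect() watches an
        # artifact. Both are injected so the logic runs without ROS.
        self.navigate = navigate
        self.inspect = inspect

    def run_once(self):
        """One patrol cycle: GOTO_HALL_1 followed by INSPECT_ARTIFACT_1."""
        outcomes = []
        outcomes.append(("GOTO_HALL_1", self.navigate()))
        outcomes.append(("INSPECT_ARTIFACT_1", self.inspect()))
        return outcomes
```

In the real smach version, the navigate step would be a SimpleActionState sending a goal to move_base, and the inspect step would subscribe to the face detector's output.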
Once that is working, remember that you can view the imagery coming from the robot's mast camera with the following command:
rosrun image_view image_view image:=/stereo/left/image_rect
However, even better than this, you can run the face_detector covered in the last class. If you did not check it out last class, you can either use your own face_detector, or use the one provided as an example. To get the one provided, simply do:
cd to where you store your downloaded ROS packages
svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/face_detection
rosmake face_detection
Assignment 2 discusses face_detection in more detail.
Once the face_detector is available, you can start it as follows:
roslaunch face_detection face_detector.launch
This will open a new image_view for the "/face_view" topic, which shows the same imagery as the normal image topic "/stereo/left/image_rect" but with boxes drawn around recognized faces. Also, when a face is detected, the face detector publishes a "/face_coords" topic that describes the point in the image plane of the face being tracked. Remember that you can monitor the "/face_coords" topic by opening a terminal and typing the following (note that the topic is published only when a face is recognized):
rostopic echo /face_coords
This will be most important when trying to uncover any paranormal activity. For example, when the ghost is rotating an artifact, the face detector will likely detect a face, but the "/face_coords" topic will undoubtedly be changing as the artifact rotates. Since it is assumed that the artifacts never move without paranormal assistance, if the robot sits long enough in front of each artifact, it should be able to closely monitor the "/face_coords" topic to determine whether there is paranormal activity in the area.
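One way to turn that observation into a detector is to buffer the most recent face coordinates and declare paranormal activity when they drift more than a small tolerance over the observation window. The snippet below is only a sketch of that idea in plain Python; the class name, window size, and pixel tolerance are all assumptions, and in the real node the add() method would be fed from a rospy subscriber on "/face_coords":

```python
from collections import deque

class RotationDetector:
    """Flags paranormal activity when tracked face coordinates drift.

    Buffers the last `window` (x, y) image-plane coordinates and reports
    motion when the spread of either coordinate exceeds `tolerance` pixels.
    """

    def __init__(self, window=20, tolerance=2.0):
        self.coords = deque(maxlen=window)
        self.tolerance = tolerance

    def add(self, x, y):
        # In the real node this would be the /face_coords callback body.
        self.coords.append((x, y))

    def is_moving(self):
        if len(self.coords) < 2:
            return False  # not enough samples to judge yet
        xs = [c[0] for c in self.coords]
        ys = [c[1] for c in self.coords]
        spread = max(max(xs) - min(xs), max(ys) - min(ys))
        return spread > self.tolerance
```

A stationary artifact produces a tight cluster of coordinates and is_moving() stays False; a slowly rotating one walks the face across the image plane and trips the tolerance.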
While the security guards would like the mystery uncovered as soon as possible, they have given us two weeks of development time for building the system. For testing, there is a specific gazebo incantation, not unlike an ancient curse, that will cause an artifact to begin rotating. To test your algorithm you'll most likely want to rotate a variety of artifacts to see how the robot responds. To start, here is an example of how to get the first artifact to rotate (very slowly):
rosservice call gazebo/apply_body_wrench '{reference_point: {x: -2, y: 0, z: 0}, body_name: "alan_model::alan_link", wrench: { torque: { x: 0, y: 0 , z: 0.01 } }, duration: 1000000000 }'
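To spin other artifacts, only the body name, torque, and reference point in that incantation change. A small helper for building the service argument string might look like the following sketch; the alan_model::alan_link name comes from the example above, while any other link names you pass in are your own assumptions about how the artifacts are named:

```python
def wrench_args(body_name, torque_z, reference_x=-2, duration=1000000000):
    """Build the YAML-style argument for gazebo/apply_body_wrench.

    Mirrors the example command: a small z-axis torque applied about a
    reference point offset from the body, for `duration` nanoseconds.
    """
    return (
        "{reference_point: {x: %g, y: 0, z: 0}, "
        'body_name: "%s", '
        "wrench: { torque: { x: 0, y: 0, z: %g } }, "
        "duration: %d }" % (reference_x, body_name, torque_z, duration)
    )
```

Passing the result (quoted) to rosservice call gazebo/apply_body_wrench with body_name "alan_model::alan_link" and torque_z 0.01 reproduces the example command above.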