15-494/694 Cognitive Robotics: Lab 4
Learning Goal: This lab will introduce you to the Tekkotsu
Pilot and the use of a particle filter for localization.
Part 1: TagCourse Demo
- Read the wiki page on the TagCourse Maze.
- Start Mirage using TagCourse.mirage and explore the world.
Notice the green dot at the origin, which is the robot's starting
point, and the red dot that marks the intended stopping point.
- Run Tekkotsu in Mirage.
- Park the simulated robot's arm by going to Root Control > File
Access > Load Posture > calliope2sp and loading park1.pos, then
park2.pos.
- Run Root Control > Framework Demos > Navigation >
TagCourse. Display the world map and watch how the robot adds
AprilTags to the map as it goes. Also notice how the particle cloud
slowly disperses as the robot moves around the world.
- Study the source code at
/usr/local/Tekkotsu/Behaviors/Demos/Navigation/TagCourse.cc.fsm
to understand how the behavior is implemented.
- TagCourse is built on top of PilotDemo, so you can use
PilotDemo commands to control the robot. Now that the robot has found
all the AprilTags, let's use the tags as landmarks to demonstrate
localization. Take screenshots as you go.
- Type "msg rand" to randomize the particles, and refresh the world map.
- Drive the robot to a new position somewhere within the course by
using PilotDemo commands such as "F" to go forward and "L" to turn
left. Refresh the world map as the robot moves. Note that both the
robot's position and the particles' positions continue to update via
odometry.
- Use the head controller to move the camera so that no AprilTags
are visible.
- Type "msg loc" to tell the Pilot to localize. The resulting
particle cloud may still be somewhat diffuse. Type "msg loc" again
and see what happens.
- The Pilot needs to see at least two landmarks to localize, but it
can get better results if it sees more landmarks. Use the head
controller to point the camera at a different pair of AprilTags and
try "msg loc" again. What is the effect on the particle cloud?
Part 2: Improvements to TagCourse
- If the robot gets slightly off course, either due to inaccurate
motion or a minor collision, it might see more than two AprilTags the
next time it examines its camera image. The tags it's "supposed" to
see will be near the center of the image, but there may be other tags
visible at the edges as a result of the heading being off. Currently,
the TagCourse behavior refuses to move if there aren't exactly two
tags visible.
- Verify this behavior in Mirage. Start with the robot at the
origin (press "r" in the Mirage window to reload). Run Root Control
> Framework Demos > Navigation > PilotDemo and give the
command "msg l" to turn 10 degrees to the left. Then run the
TagCourse demo. Examine the representations in both the world and
camera shape spaces.
- Modify TagCourse to make it more robust: it should not be confused
by tags at the edges of the camera image as long as it can see two
good tags near the center. One possible filter is sketched below.
Note: to test your modified code you must change the name of the
behavior and edit the REGISTER_BEHAVIOR call so that you don't clash
with the built-in TagCourse demo.
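One possible approach is sketched below. This is a sketch only: it
assumes the function lives in a class derived from
VisualRoutinesStateNode (as TagCourse's nodes do) so that camShS is
in scope, that camera-space centroids are in pixel coordinates, and
that CameraResolutionX gives the image width; the name centralTags
and the centerFraction threshold are made up for illustration.

    // Sketch: keep only the AprilTags whose centroids lie near the
    // horizontal center of the camera image, ignoring edge tags.
    #include <cmath>
    #include <vector>
    #include "Shared/RobotInfo.h"
    #include "DualCoding/ShapeAprilTag.h"
    #include "DualCoding/ShapeFuns.h"
    using namespace DualCoding;

    std::vector<Shape<AprilTagData> > centralTags(float centerFraction=0.5f) {
      std::vector<Shape<AprilTagData> > tags =
        select_type<AprilTagData>(camShS);
      std::vector<Shape<AprilTagData> > keep;
      const float halfWidth = CameraResolutionX / 2.0f;
      for ( unsigned int i = 0; i < tags.size(); i++ ) {
        // Offset of this tag's centroid from the image's vertical midline
        float offset = std::fabs(tags[i]->getCentroid().coordX() - halfWidth);
        if ( offset < centerFraction * halfWidth )
          keep.push_back(tags[i]);
      }
      return keep;  // proceed only when keep.size() == 2
    }

The modified behavior would then move whenever exactly two central
tags are visible, regardless of extra tags at the edges.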
Part 3: Navigation on the Real Robot
- Implement a navigation task where the robot looks for some
landmarks (use the red and green cylinders) and then uses them to
define a destination. Specifically, if it sees a red cylinder ahead
and to the left, and a green cylinder ahead and to the right, it can
calculate an angle to turn and a distance to travel in order to put
itself on the line joining the two cylinders, equidistant from both.
It should use the Pilot to execute this two-step trajectory; the
geometry is sketched after this item. Use the cylinder shape, not the
blob shape, so that the MapBuilder can calculate distance information
(it assumes the cylinder is standing on the ground plane).
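The geometry reduces to driving to the midpoint of the segment
joining the two cylinder centroids. Here is a minimal sketch,
assuming the centroids are taken from the cylinder shapes in
egocentric coordinates in millimeters, with x pointing ahead and y to
the left as in Tekkotsu's local shape space; the struct and function
names are illustrative:

    #include <cmath>

    struct Trajectory { float turnAngle, travelDist; };

    // The midpoint of the segment joining the two cylinders is the
    // destination: turn to face it, then drive straight to it.
    Trajectory midpointTrajectory(float redX, float redY,
                                  float greenX, float greenY) {
      float mx = (redX + greenX) / 2;
      float my = (redY + greenY) / 2;
      Trajectory t;
      t.turnAngle  = std::atan2(my, mx);        // radians to face midpoint
      t.travelDist = std::sqrt(mx*mx + my*my);  // mm from robot to midpoint
      return t;
    }

The two Pilot steps would then be a walk request that turns by
turnAngle followed by one that travels travelDist (e.g., via the da
and dx fields of a PilotRequest, if that is how your Tekkotsu version
expresses walk displacements).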
- Note: to see the green cylinder you may need to change the camera
settings. Specifically, try loading domino.plist, and also type "set
Drivers.Camera.Options.Saturation=255" at the Tekkotsu command
line. To reset the camera to its
default settings you will have to unplug it and then plug it in again.
- Now have the robot head away from the cylinders, and then turn
around so it can see them both. Print out its estimated position.
Then have the Pilot localize (it will use the cylinders as landmarks),
and print out the updated position. One way to structure this is
sketched below.
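The sketch below uses the .fsm shorthand the demos are written in.
It is a sketch, not a definitive implementation: it assumes theAgent
holds the robot's pose estimate in the world shape space, that
PilotTypes::localize is the request type behind PilotDemo's "msg
loc", and that PilotNode posts a completion event when the request
finishes; the node names are illustrative.

    #include <iostream>
    #include "Behaviors/StateMachine.h"

    $nodeclass LocalizeTest : VisualRoutinesStateNode {

      // Print the current pose estimate (position and heading).
      $nodeclass ShowPose : VisualRoutinesStateNode : doStart {
        std::cout << "Pose estimate: " << theAgent->getCentroid()
                  << " heading " << theAgent->getOrientation()
                  << std::endl;
        postStateCompletion();
      }

      $setupmachine{
        before: ShowPose =C=>
        loc: PilotNode(PilotTypes::localize) =C=>
        after: ShowPose
      }

    }

    REGISTER_BEHAVIOR(LocalizeTest);

Comparing the two printouts shows how much the particle filter
corrected the odometry-based estimate.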
What to Hand In
Finish the lab for homework.
- For Part 1, hand in your screenshots.
- For Part 2, hand in your modified TagCourse source code.
- For Part 3, hand in your source code and a screenshot showing
the output (robot's position estimate before and after localization).
Due Friday, February 13, 2015.