In this lab you will explore the Calliope's kinematic structure and
Tekkotsu's forward and inverse kinematics functions.
Part I: CameraTrackGripper Demo
- Do this step using Mirage. Go to Root Control > Framework
Demos > Kinematics Demos > CameraTrackGripper. Turn on the
RawCam viewer and the Arm Controller. Move the arm and note that the
left finger remains centered in the camera image. Type "msg" to the
Tekkotsu console a few times to display the finger-to-base
transformation matrix and observe how it changes as you move the arm.
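If you want to query this transformation from your own code, Tekkotsu's
global Kinematics object provides the linkToBase method. Here is a
minimal sketch, assuming the GripperFrameOffset reference frame (the
demo itself tracks a finger, whose frame name depends on the model):

#include <iostream>
#include "Behaviors/BehaviorBase.h"
#include "Motion/Kinematics.h"

// Minimal sketch: report where the gripper frame sits relative to the
// base frame.  GripperFrameOffset is an assumption; the demo tracks a
// finger frame, whose name depends on the robot model.
class PrintGripperFrame : public BehaviorBase {
public:
  PrintGripperFrame() : BehaviorBase("PrintGripperFrame") {}
  virtual void doStart() {
    fmat::Transform t = kine->linkToBase(GripperFrameOffset);
    fmat::Column<3> pos = t.translation();
    std::cout << getName() << ": gripper at " << pos[0] << " " << pos[1]
              << " " << pos[2] << " (mm, base frame)" << std::endl;
  }
};
REGISTER_BEHAVIOR(PrintGripperFrame);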
- Now run CameraTrackGripper on the robot, but turn off the Arm
Controller. Instead, press the Play button on the Create and the
robot will say "Arm relaxed". (If it doesn't say this, press the
button more firmly; make sure the sound is turned up.) Now take hold
of the arm and move it around; the camera will follow.
- Notice that the motion of the head is jerky when the RawCam
viewer is active. What happens when you turn off the RawCam viewer?
Part II: GripperTrackCamera Demo
- Do this step in Mirage. Run the GripperTrackCamera demo. Make
sure the Arm Controller is not running or it will fight with the demo.
Use the Head Controller to move the head and notice that the arm moves
appropriately to keep the left finger 100 mm from the camera.
- You should also notice that when the demo starts up, the arm
moves to its initial position at very high speed. This is dangerous:
the robot could hurt itself or a person. To experiment with this,
stop the behavior, use the Arm Controller to move the gripper to a
position far from the camera, shut down the Arm Controller, and run
the behavior again. Think about what could happen if someone's face
were close to the arm when the demo started up.
- Copy the demo to your project directory, and rename it
GripperTrackCamera2.cc.fsm. Also edit the class name where it
appears twice in the file: once in the $nodeclass and once in the
REGISTER_BEHAVIOR.
- We're going to make the demo safe by checking the distance that
the arm needs to travel to reach its target position. If the distance
is large, we'll use the setMaxSpeed method of PostureMC to slow down
the arm. Thus, on startup, the arm will move gently to its initial
position. Once there, it can move at high speed as long as it's only
making small corrections. Implement this change and test it in Mirage
to make sure the arm behaves safely.
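One way to shape the check is a helper called whenever a new arm
target is computed, as in the rough sketch below. Here poseID,
ArmOffset, NumArmJoints, and the per-joint setMaxSpeed(index, speed)
signature are assumptions to verify against your model's Info header
and PostureMC.h:

// Hypothetical helper: slow the arm down when the target is far away.
// poseID names the arm's PostureMC; ArmOffset/NumArmJoints and the
// per-joint setMaxSpeed(index, speed) signature are assumptions.
void applySpeedLimit(MotionManager::MC_ID poseID, const fmat::Column<3> &target) {
  const float SAFE_DIST  = 50.f;   // mm: anything farther counts as a big move
  const float SLOW_SPEED = 0.25f;  // gentle joint speed for big moves
  const float FAST_SPEED = 5.f;    // effectively unrestricted, for small corrections
  fmat::Column<3> current = kine->linkToBase(GripperFrameOffset).translation();
  float dist = (target - current).norm();
  float limit = (dist > SAFE_DIST) ? SLOW_SPEED : FAST_SPEED;
  MMAccessor<PostureMC> pose_acc(poseID);
  for ( unsigned int j = ArmOffset; j < ArmOffset + NumArmJoints; j++ )
    pose_acc->setMaxSpeed(j, limit);
}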
- Once you're convinced your code is safe, test it on the robot.
Part III: Holding Hands
Write a behavior that allows you to lead the robot around by the hand.
- Start by moving the arm to a posture where the gripper is sticking
straight out. You can use a PostureNode for this, with the setOutputCmd
method to set joint angles directly rather than loading a posture file,
as in the sketch below.
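A minimal sketch of such a node, zeroing every arm joint as a
placeholder posture (substitute angles that actually extend the
gripper straight out on your robot):

$nodeclass HoldArmOut : PostureNode : doStart {
  // Placeholder posture: zero every arm joint.  Replace the angles
  // with values that extend the gripper straight out on your model.
  for ( unsigned int j = ArmOffset; j < ArmOffset + NumArmJoints; j++ )
    getMC()->setOutputCmd(j, 0.f);
}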
- In your state machine, leave the posture node active so it holds
the arm in this fixed position.
- Even with the posture node active, you can displace the arm
slightly by grasping it and gently pulling or pushing. This is how we
will guide the robot.
- Write code to calculate the position of the gripper frame and
notice when it is displaced from its intended position. For
displacements above a threshold, have the robot walk forward or back,
or turn left or right, for as long as the displacement persists. Make
the speed proportional to the magnitude of the displacement. You'll
want to use a WalkNode and the setTargetVelocity method for this,
rather than a PilotNode, so you can continually change the velocity in
a tight sensor loop, like the other kinematics demos we've examined in
this lab. You need to keep the WalkNode active, though; you can't
keep exiting and reentering it, because then the walk will never stay
active long enough to produce any motion. Instead you should define a
subclass of WalkNode that does:
virtual void doStart() {
  // run doEvent on every sensor update
  erouter->addListener(this, EventBase::sensorEGID);
}

virtual void doEvent() {
  // read the gripper position and call setTargetVelocity to
  // change the walk's velocity while it's running (sketch below)
}
Part IV: Continuous Trajectories
- Write a behavior that moves the gripper to a point slightly in
front of the robot, and then moves it, at a low speed, to a point a
good bit farther ahead, i.e., farther in the x direction but
with the y and z coordinates the same. You can do this with a series
of two PostureNodes.
- Run your behavior, and look at the trajectory of the gripper.
Although the start and end points should have the same y and z values, the
gripper's height above the ground will not remain constant throughout
the trajectory. Why not?
- Let's try a different approach: we can divide the trajectory into
small steps and do a separate IK calculation for each step. This way
we can get the trajectory to approximate a straight line. Use a
DynamicMotionSequenceNode for this. Inside the doStart, create a
PostureEngine instance. You can call its solveLinkPosition method to
calculate a set of arm joint angles, and then call the
MotionSequenceEngine's setPos method to copy this posture into the
motion sequence. Do this in a loop as you interpolate the target
position from the start to the end point. When you've calculated the
complete motion sequence and returned from doStart, the
DynamicMotionSequenceNode will play the sequence. What does the
trajectory look like?
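The interpolation loop inside doStart might be shaped like the sketch
below. The endpoints, step count, and frame spacing are placeholders;
setPos is the copy method named above, and advanceTime is assumed to
space successive frames out in time:

// Sketch of the doStart interpolation loop.  'start' and 'end' are
// example gripper targets in base coordinates (mm); NSTEPS and
// STEP_MS control resolution and playback speed, and need tuning.
const unsigned int NSTEPS = 20;
const unsigned int STEP_MS = 250;    // ms between motion sequence frames
PostureEngine pose;
pose.takeSnapshot();                 // seed IK with the current arm configuration
fmat::Column<3> start = fmat::pack(200, 0, 100);
fmat::Column<3> end = fmat::pack(350, 0, 100);
for ( unsigned int i = 0; i <= NSTEPS; i++ ) {
  float s = float(i) / NSTEPS;       // interpolation parameter, 0 to 1
  fmat::Column<3> target = start + (end - start) * s;
  pose.solveLinkPosition(target, GripperFrameOffset, fmat::pack(0,0,0));
  getMC()->setPos(pose);             // copy this posture into the sequence
  getMC()->advanceTime(STEP_MS);     // then move the insertion point forward
}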
Part V: Solving for Orientation
- In your ~/project directory, type "make tekkotsu-HANDEYE" to
build Tekkotsu for the Planar Hand/Eye system. Then run it using
Mirage and play with the arm controller. Notice that with the current
interface you can click and drag on the red dot to control the
location of the end-effector, but not its orientation.
- The PostureEngine class includes several IK solver methods. So
far we've only used solveLinkPosition. A more general method is
solveLink, which can solve for a position and orientation
simultaneously. Orientations are described using quaternions, a
four-dimensional generalization of complex numbers, but you don't have
to worry about the math because there is a convenient built-in function
for computing quaternions that rotate about the z-axis. The demo
program LinkOrient.cc.fsm illustrates this. Copy it to your ~/project
directory, compile it, and run it. Note: turn off the Arm Controller
first, or the demo won't work.
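For reference, the combined position-and-orientation call might look
like the sketch below. fmat::Quaternion::aboutZ constructs the
rotation; the solveLink argument order shown here is an assumption, so
check PostureEngine.h (LinkOrient.cc.fsm shows the real usage):

// Sketch: solve for gripper position and orientation together.
// 'theta' is the desired rotation about the z-axis; the solveLink
// parameter list is an assumption -- consult PostureEngine.h.
PostureEngine pose;
pose.takeSnapshot();                        // start from the current arm pose
float theta = M_PI/4;                       // example: 45 degrees
fmat::Column<3> target = fmat::pack(250, 0, 50);
fmat::Quaternion orient = fmat::Quaternion::aboutZ(theta);
pose.solveLink(target, orient, GripperFrameOffset,
               fmat::pack(0,0,0), fmat::Quaternion::aboutZ(0));  // no effector offset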
- Create a behavior for the Hand/Eye robot to demonstrate control
of gripper orientation. Your behavior should start by tilting the
camera down and looking for a small ellipse in the Mirage world. Once
it knows the location of the ellipse, it should move the gripper to a
position 100 mm from the ellipse, with the fingers oriented toward the
center of the ellipse. If you then start up the Head Controller and
move the head pan joint back and forth, the gripper should rotate
around the ellipse, keeping the fingers pointed toward the ellipse
center.
- The arm won't be able to go all the way around the ellipse, but
180° of travel will be plenty. The ability of the arm to do this
will depend on the ellipse location; pick an initial location for the
ellipse that permits your demo to work.
- Demonstrate the generality of your approach by stopping your
behavior, starting the Arm Controller, and using the arm to move the
ellipse to a different location. Then stop the Arm Controller and run
your behavior again.
What to Hand In
Hand in the source code for each part.
Due Friday, March 15 (extra time because you have the Visual Search Project due on the 8th).