ALMath


This section contains examples showing how to use the Python wrapping of libalmath. The C++ documentation of this library can be found here. The wrapping exposes all the functions of the library, which is very useful for computations related to motion (effector positions, for example).

Python wrapping

libalmath is wrapped in Python. This makes it possible to use the library from Choregraphe, or from any Python script.

To import almath, use the following line (after having correctly configured the Python SDK path if you are coding outside Choregraphe; see the Python SDK Install Guide):

import almath

You can then call any method from libalmath as follows:

almath.methodName(arg0, arg1, ...)

Since libalmath uses its own types, you have to be careful to use them correctly. This can cause some difficulties when interfacing with other modules, such as ALMotion, which do not return these types directly.
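As a quick, robot-free illustration of the data layout involved: ALMotionProxy::getTransform returns a transform as a flat list of 16 floats (a row-major 4x4 homogeneous matrix), which is what almath.vectorFloat and almath.Transform consume. The flat list below is a made-up identity-plus-translation example:

```python
# Transform as returned by ALMotionProxy::getTransform: 16 floats, row-major.
# This example is an identity rotation with a translation of (0.1, 0.0, 0.2).
flat = [1.0, 0.0, 0.0, 0.1,
        0.0, 1.0, 0.0, 0.0,
        0.0, 0.0, 1.0, 0.2,
        0.0, 0.0, 0.0, 1.0]

# Reshape into 4 rows of 4 to inspect the matrix by hand.
rows = [flat[i:i + 4] for i in range(0, 16, 4)]
for row in rows:
    print(row)
```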

Using ALMath with ALMotion

This example shows how to retrieve transforms from ALMotion using ALMotionProxy::getTransform, and how to send transforms computed with ALMath back to ALMotion using, for example, ALMotionProxy::setTransform (the principle is the same for other methods that use transforms).

almath_transform.py

''' Example showing how to use almath with python and send the results to
    the robot by using a proxy to ALMotion '''


import time

from naoqi import ALProxy

# Import almath
import almath

IP = "127.0.0.1" # Set your robot's IP address here.
PORT = 9559

# Create a proxy to ALMotion.
try:
  motionProxy = ALProxy("ALMotion", IP, PORT)
except Exception, e:
  print "Could not create proxy to ALMotion"
  print "Error was: ", e
  exit(1)

chainName = "RArm"
space = 1
useSensors = True

# Set stiffness on.
# motionProxy.stiffnessInterpolation("Body", 1.0, 0.5)

# Stand up.
motionProxy.walkInit()

##############################################
# Retrieve a transform matrix using ALMotion #
##############################################

# Retrieve current transform from ALMotion.
transformList = motionProxy.getTransform(chainName, space, useSensors)

# Convert it to vector format of ALMath.
transformVector = almath.vectorFloat(transformList)

# Convert it to a transform matrix for ALMath.
origTransform = almath.Transform(transformVector)

# Visualize the transform using the overridden print from ALMath.
print "Original transform"
print origTransform

##########################################################
# Use almath to do some computations on transform matrix #
##########################################################

# Compute a transform corresponding to the desired move
# (here, move the chain for 5cm along the Z axis and the X axis).
desiredPosition = almath.Position3D(0.05, 0.0, 0.05)
moveTransform = almath.transformFromPosition3D(desiredPosition)

# Compute the corresponding target transform.
targetTransform = moveTransform * origTransform

# Visualize it.
print "Target transform"
print targetTransform

##############################################
# Send a transform to the robot via ALMotion #
##############################################

# Convert it to an almath vector.
targetTransformVector = targetTransform.toVector()

# Convert it to a list.
targetTransformList = list(targetTransformVector)

# Send the target transform to NAO through ALMotion.
motionProxy.setTransform(chainName, space, targetTransformList, 0.5, 7)

time.sleep(2.0)
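The matrix algebra behind the script above can be checked without a robot or almath. The sketch below re-implements, in plain Python, what transformFromPosition3D and the Transform multiplication do: build a homogeneous 4x4 translation matrix and left-multiply it onto the original transform. The starting transform is a made-up value for illustration:

```python
def transform_from_position(x, y, z):
    # 4x4 homogeneous transform: identity rotation plus a translation column,
    # the same structure almath.transformFromPosition3D produces.
    return [[1.0, 0.0, 0.0, x],
            [0.0, 1.0, 0.0, y],
            [0.0, 0.0, 1.0, z],
            [0.0, 0.0, 0.0, 1.0]]

def mat_mul(a, b):
    # Plain 4x4 matrix product.
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

orig = transform_from_position(0.2, -0.1, 0.0)   # made-up arm transform
move = transform_from_position(0.05, 0.0, 0.05)  # 5 cm along X and Z

# Left-multiplying applies the move in the torso frame.
target = mat_mul(move, orig)
print(target[0][3], target[2][3])  # translation shifted by 5 cm on X and Z
```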

Using ALMath to generate footsteps

ALMath is widely used inside ALMotion. Using ALMath, you can reproduce some of the features included in ALMotion, in particular concerning footsteps. If you send arbitrary footsteps to ALMotion, you will get a lot of warnings because they may be dangerous for the robot: they might cause a foot collision or be too long for NAO. To generate footsteps without any warnings, you have to clip them, using ALMath functionalities.

The following example allows you to lead NAO by its arm. It generates footsteps according to the left arm position, then clips them to make sure they are feasible for NAO.

To use the example, launch the script with your NAO's IP address as an argument. NAO will stand up. When you are ready, take NAO's left arm and press the front tactile sensor. You can now guide NAO by inclining its arm forward and backward, and make it turn by turning its left wrist. NAO's eyes turn green when its arm position gives it a target, and blue when the arm position is neutral. To end the example, press the rear tactile sensor: NAO will crouch and remove its stiffness.

The code for the robot guide is the following: almath_robot_guide.py. Do not forget to put the code for the foot clipping in the same folder: almath_foot_clip.py.

''' This example shows how to guide NAO by the hand, while computing its
    moves with footsteps only, using the ALMath library. Footstep clipping is
    described in almath_foot_clip.py. '''

import sys
import time
import math

from naoqi import ALProxy
import almath

import almath_foot_clip

armName = "LArm"
lFootOffset = almath.Pose2D(0.0, 0.09, 0.0)
rFootOffset = almath.Pose2D(0.0, -0.09, 0.0)
stepSpeed = 1.0
stepLength = 0.05

def initRobotPosition(motionProxy):
  ''' Inits NAO's position and stiffnesses to make the guiding possible.'''

  motionProxy.stiffnessInterpolation("Body", 1.0, 0.5)
  motionProxy.walkInit()
  time.sleep(1.0)
  # Center the left wrist and orient the head, then make the left arm loose.
  motionProxy.angleInterpolation("LWristYaw", 0.0, 1.0, True)
  motionProxy.setAngles("Head", [0.44, -0.44], 0.5)
  motionProxy.setStiffnesses("LArm", 0.0)
  motionProxy.setStiffnesses("LWristYaw", 0.2)

  # Disable arm moves while walking, for the left arm only.
  motionProxy.setWalkArmsEnable(False, True)
  time.sleep(1.0)



def interpretJointsPose(motionProxy, memoryProxy):
  ''' Translates the current left arm pose into a target position for NAO's
      foot. '''

  # Retrieve current arm position.
  armPose = motionProxy.getAngles(armName, True)

  targetX = 0.0
  targetY = 0.0
  targetTheta = 0.0
  gaitConfig = motionProxy.getFootGaitConfig("Default")

  # Filter Shoulder Pitch.
  if (armPose[0] > -0.9 and armPose[0] < -0.20):
    targetX = stepLength
  elif (armPose[0] > -2.5 and armPose[0] < -1.5):
    targetX = -stepLength - 0.02

  # Filter Wrist Yaw.
  if (armPose[4] > 0.2):
    targetTheta = gaitConfig[2][1]
  elif (armPose[4] < -0.2):
    targetTheta = - gaitConfig[2][1]

  # Return corresponding pose.
  return almath.Pose2D(targetX, targetY, targetTheta)


def crouch(motionProxy):
  ''' Makes NAO crouch. '''
  # One target angle per joint, all reached at t = 1.0 s.
  jointKeys = [
    ("HeadYaw", -0.03379), ("HeadPitch", 0.32823),
    ("LShoulderPitch", 1.62600), ("LShoulderRoll", -0.01998),
    ("LElbowYaw", -0.98640), ("LElbowRoll", -1.24250),
    ("LWristYaw", 0.08433), ("LHand", 0.00015),
    ("RShoulderPitch", 1.51870), ("RShoulderRoll", -0.05680),
    ("RElbowYaw", 0.72554), ("RElbowRoll", 1.26099),
    ("RWristYaw", 0.72094), ("RHand", 0.00018),
    ("LHipYawPitch", 0.01538), ("LHipRoll", -0.00456),
    ("LHipPitch", -0.90962), ("LKneePitch", 2.11255),
    ("LAnklePitch", -1.18889), ("LAnkleRoll", 0.00464),
    ("RHipRoll", 0.10589), ("RHipPitch", -0.90203),
    ("RKneePitch", 2.11255), ("RAnklePitch", -1.18630),
    ("RAnkleRoll", -0.10427)]

  names = [joint for (joint, key) in jointKeys]
  keys = [[key] for (joint, key) in jointKeys]
  times = [[1.00000] for joint in jointKeys]

  motionProxy.angleInterpolation(names, keys, times, True)



def moveToTargetPose(targetPose, motionProxy, isLeftSupport):
  ''' Move to the desired target with the current foot. '''

  targetTf = almath.transformFromPose2D(targetPose)

  # Compute foot position with the offset in NAOSpace.
  if (isLeftSupport):
    footTargetTf = targetTf * almath.transformFromPose2D(rFootOffset)
    footTargetPose = almath.pose2DFromTransform(footTargetTf)
    name = ["RLeg"]
  else:
    footTargetTf = targetTf * almath.transformFromPose2D(lFootOffset)
    footTargetPose = almath.pose2DFromTransform(footTargetTf)
    name = ["LLeg"]

  # Clip the footstep to avoid collisions and too wide steps.
  almath_foot_clip.clipFootStep(footTargetPose, isLeftSupport)

  step = [[footTargetPose.x, footTargetPose.y, footTargetPose.theta]]
  speed = [stepSpeed]

  # Send the footstep to NAO.
  motionProxy.setFootStepsWithSpeed(name, step, speed, False)


def main(robotIP):

  # Init proxies.
  motionProxy = ALProxy("ALMotion", robotIP, 9559)
  memoryProxy = ALProxy("ALMemory", robotIP, 9559)
  ledsProxy = ALProxy("ALLeds", robotIP, 9559)

  # Init robot position.
  initRobotPosition(motionProxy)

  # Wait for the user to press the front tactile sensor.
  while (not memoryProxy.getData("FrontTactilTouched")):
    pass

  # Start by moving left foot.
  isLeftSupport = False
  isMoving = False
  # Set LEDs to blue.
  ledsProxy.fadeRGB("FaceLeds", 255, 0.1)

  while (not memoryProxy.getData("RearTactilTouched")):
    targetPose = interpretJointsPose(motionProxy, memoryProxy)

    # Filter the pose to avoid too small steps.
    if (math.fabs(targetPose.x) > 0.01
        or math.fabs(targetPose.y) > 0.01
        or math.fabs(targetPose.theta) > 0.08):
      moveToTargetPose(targetPose, motionProxy, isLeftSupport)
      isLeftSupport = not isLeftSupport
      isMoving = True
      # Set LEDs to green.
      ledsProxy.fadeRGB("FaceLeds", 256 * 255, 0.1)

    elif (isMoving):
      # Stop the robot.
      motionProxy.stopWalk()
      isMoving = False
      # Set LEDs to blue.
      ledsProxy.fadeRGB("FaceLeds", 255, 0.1)

  # Set LEDs to white.
  ledsProxy.fadeRGB("FaceLeds", 256 * 256 * 255 + 256 * 255 + 255, 0.1)

  # Crouch.
  motionProxy.stiffnessInterpolation("Body", 1.0, 0.5)
  crouch(motionProxy)
  time.sleep(1.0)
  # Set stiffness off.
  motionProxy.stiffnessInterpolation("Body", 0.0, 0.5)


if __name__ == "__main__":
    if (len(sys.argv) < 2):
      print "Usage: python almath_robot_guide.py <robotIP>"
      sys.exit(1)

    main(sys.argv[1])
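moveToTargetPose above composes 2D poses (targetTf * offsetTf) to express the foot offset in the target's frame. Below is a plain-Python sketch of that 2D rigid-transform composition, the same math almath.Pose2D multiplication performs, re-implemented here for illustration:

```python
import math

def compose_pose2d(p, q):
    # Compose two 2D rigid transforms (x, y, theta): apply q in p's frame.
    x, y, th = p
    qx, qy, qth = q
    return (x + qx * math.cos(th) - qy * math.sin(th),
            y + qx * math.sin(th) + qy * math.cos(th),
            th + qth)

# Placing the right-foot offset (0, -0.09, 0) relative to a target pose that
# is 10 cm ahead and rotated 90 degrees: the offset is rotated into the
# target's frame, giving approximately (0.19, 0.0, pi/2).
target = (0.10, 0.0, math.pi / 2)
print(compose_pose2d(target, (0.0, -0.09, 0.0)))
```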

The code for the foot clipping is the following:

''' This file shows how to use the ALMath library in Python to clip
    NAO's footsteps into feasible moves (avoiding overly large steps and
    collisions). Use clipFootStep to clip your desired pose.
'''

import almath

# Various parameters describing step characteristics.
minFootSeparation = 0.088
minStepX = -0.04
maxStepX = 0.08
maxStepY = 0.16
maxStepTheta = 0.35

# Bounding boxes of NAO's feet.
rFootBoxFL = almath.Pose2D(0.11, 0.038, 0.0)
rFootBoxFR = almath.Pose2D(0.11, -0.050, 0.0)
rFootBoxRR = almath.Pose2D(-0.047, -0.050, 0.0)
rFootBoxRL = almath.Pose2D(-0.047, 0.038, 0.0)
rFootBox = almath.vectorPose2D([rFootBoxFL, rFootBoxFR, rFootBoxRR, rFootBoxRL])

lFootBoxFL = almath.Pose2D(0.11, 0.050, 0.0)
lFootBoxFR = almath.Pose2D(0.11, -0.038, 0.0)
lFootBoxRR = almath.Pose2D(-0.047, -0.038, 0.0)
lFootBoxRL = almath.Pose2D(-0.047,  0.050, 0.0)
lFootBox = almath.vectorPose2D([lFootBoxFL, lFootBoxFR, lFootBoxRR, lFootBoxRL])


def clipData(minValue, maxValue, value):
  ''' Clip value between two extremes. '''
  clipped = value

  if (clipped < minValue):
    clipped = minValue
  if (clipped > maxValue):
    clipped = maxValue

  return clipped
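clipData is a plain scalar clamp; an equivalent one-liner in plain Python, shown here only as a sanity check:

```python
def clip_data(min_value, max_value, value):
    # Same behavior as clipData above: clamp value into [min_value, max_value].
    return min(max(value, min_value), max_value)

# A step of 10 cm forward is clamped to the 8 cm maximum:
print(clip_data(-0.04, 0.08, 0.10))  # 0.08
```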


def clipFootStepOnGaitConfig(footMove, isLeftSupport):
  ''' Clip the foot move so that it does not exceed the maximum
      size of steps.
      footMove is an almath.Pose2D (x, y, theta position).
      isLeftSupport must be set to True if the move is on the right leg
      (the robot is supporting itself on the left leg).
  '''

  # Clip X.
  clippedX = clipData(minStepX, maxStepX, footMove.x)
  footMove.x = clippedX

  # Clip Y.
  if not isLeftSupport:
    clippedY = clipData(minFootSeparation, maxStepY, footMove.y)
  else:
    clippedY = clipData(-maxStepY, -minFootSeparation, footMove.y)
  footMove.y = clippedY


  # Clip Theta.
  clippedTheta = clipData(-maxStepTheta, maxStepTheta, footMove.theta)
  footMove.theta = clippedTheta


def clipFootStepWithEllipse(footMove):
  ''' Clip the foot move inside an ellipse defined by the maximum step sizes.
      footMove is an almath.Pose2D (x, y, theta position).
  '''

  # Apply an offset to have Y component of foot move centered on 0.
  if (footMove.y < -minFootSeparation):
    footMove.y = footMove.y + minFootSeparation
  elif (footMove.y > minFootSeparation):
    footMove.y = footMove.y - minFootSeparation
  else:
    return

  # Clip the foot move to an ellipse using ALMath method.
  if footMove.x >= 0:
    almath.clipFootWithEllipse(maxStepX, maxStepY - minFootSeparation, footMove)
  else:
    almath.clipFootWithEllipse(minStepX, maxStepY - minFootSeparation, footMove)

  # Correct the previous offset on the Y component.
  if footMove.y >= 0:
    footMove.y = footMove.y + minFootSeparation
  else:
    footMove.y = footMove.y - minFootSeparation

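almath.clipFootWithEllipse keeps the (x, y) move inside an ellipse whose semi-axes are the maximum step sizes. One simple way to express that geometry in plain Python (an illustration of the idea, not necessarily the library's exact implementation):

```python
import math

def clip_to_ellipse(a, b, x, y):
    # If (x/a)^2 + (y/b)^2 > 1 the point lies outside the ellipse of
    # semi-axes a and b: scale it back toward the origin onto the boundary.
    k = math.sqrt((x / a) ** 2 + (y / b) ** 2)
    if k <= 1.0:
        return x, y
    return x / k, y / k

# A purely forward move of 16 cm, with semi-axes maxStepX = 0.08 and
# maxStepY - minFootSeparation = 0.072, is pulled back to x = 0.08:
print(clip_to_ellipse(0.08, 0.16 - 0.088, 0.16, 0.0))
```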

def clipFootStepToAvoidCollision(footMove, isLeftSupport):
  ''' Clip the foot move to avoid collision with the other foot.
      footMove is an almath.Pose2D (x, y, theta position).
      isLeftSupport must be set to True if the move is on the right leg
      (the robot is supporting itself on the left leg).
  '''

  # Use ALMath method.
  almath.avoidFootCollision(lFootBox, rFootBox, isLeftSupport, footMove)

def clipFootStep(footMove, isLeftSupport):
  ''' Apply all clipping functions to avoid warnings and undesired effects
      when sending the footstep to ALMotion.
      footMove is an almath.Pose2D (x, y, theta position).
      isLeftSupport must be set to True if the move is on the right leg
      (the robot is supporting itself on the left leg).
  '''

  # First clip the foot move with gait config.
  clipFootStepOnGaitConfig(footMove, isLeftSupport)
  # Then clip it on an ellipse.
  clipFootStepWithEllipse(footMove)
  # Then make sure you do not have any collision.
  clipFootStepToAvoidCollision(footMove, isLeftSupport)