
Get Pose Transform

SUMMARY

Get Pose Transform applies a relative transformation (expressed in the source pose's own coordinate frame) to a source pose and returns the resulting pose in the world (base) frame.

SUPPORTED ROBOTS

This skill is currently supported on Universal Robots only. Calling this method on other robot brands will raise a NotImplementedError.

The Skill

```python
result_pose = robot.get_pose_transform(source_pose=p_from, relative_transform=p_from_to)
```

The Code

Example: Apply a Relative Offset in the Tool Frame

Read the current TCP pose and compute a new pose shifted 10 cm along the local X axis.

```python
from loguru import logger
from telekinesis.synapse.robots.manipulators import universal_robots

robot_ip = "192.168.1.2"  # replace with your robot's IP

# Connect to the robot
robot = universal_robots.UniversalRobotsUR10E()
robot.connect(ip=robot_ip)

# Read the current TCP pose in the base frame
p_from = robot.get_cartesian_pose()
logger.info(f"Current TCP pose: {p_from}")

# 10 cm offset along the local X axis of the TCP frame
p_from_to = [0.1, 0.0, 0.0, 0.0, 0.0, 0.0]
result = robot.get_pose_transform(source_pose=p_from, relative_transform=p_from_to)
logger.success(f"Transformed pose: {result}")

# Disconnect
robot.disconnect()
```

The Explanation of the Code

get_pose_transform delegates the computation to the robot controller. source_pose is the reference frame expressed as [x, y, z, rx, ry, rz] in meters and degrees. relative_transform is a pose offset expressed in the coordinate frame of source_pose. The returned pose is the result of applying relative_transform in the source_pose frame, expressed in the robot base frame. This is equivalent to: result = source_pose * relative_transform (as a homogeneous matrix product).
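The matrix-product identity can be checked offline. Below is a minimal sketch, assuming the orientation components are a rotation vector expressed in degrees (as described above) and using numpy/scipy; the helper names are illustrative and not part of the skill's API:

```python
import numpy as np
from scipy.spatial.transform import Rotation


def pose_to_matrix(pose):
    """Convert [x, y, z, rx, ry, rz] (rotation vector in degrees) to a 4x4 matrix."""
    T = np.eye(4)
    T[:3, :3] = Rotation.from_rotvec(pose[3:], degrees=True).as_matrix()
    T[:3, 3] = pose[:3]
    return T


def matrix_to_pose(T):
    """Convert a 4x4 homogeneous matrix back to [x, y, z, rx, ry, rz]."""
    rotvec = Rotation.from_matrix(T[:3, :3]).as_rotvec(degrees=True)
    return list(T[:3, 3]) + list(rotvec)


def pose_transform(source_pose, relative_transform):
    # result = source_pose * relative_transform as a homogeneous matrix product
    return matrix_to_pose(pose_to_matrix(source_pose) @ pose_to_matrix(relative_transform))


# A +10 cm offset along the local X axis of an unrotated source pose
print(pose_transform([0.5, 0.0, 0.3, 0.0, 0.0, 0.0], [0.1, 0.0, 0.0, 0.0, 0.0, 0.0]))
```

With an unrotated source pose the offset lands directly on X; if the source pose is rotated, the same local offset is rotated into the base frame before being added, which is exactly why `get_pose_transform` differs from naive element-wise addition.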

How to Tune the Parameters

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| source_pose | list[float] | required | Reference pose [x, y, z, rx, ry, rz]. Position in meters, orientation in degrees. |
| relative_transform | list[float] | required | Offset pose expressed in the source_pose frame. Position in meters, orientation in degrees. |

Return Value

| Return Type | Description |
| --- | --- |
| list[float] | Resulting pose [x, y, z, rx, ry, rz] in the robot base frame. |

Where to Use the Skill

  • Computing an approach pose that is offset from a detected object pose along the tool axis.
  • Constructing Cartesian waypoints relative to a known reference frame.

When Not to Use the Skill

Skip this skill when the computation does not need to run on the robot controller:

  • For batch pose-chain computations, use a math library such as Python's numpy or scipy directly; this avoids a controller round trip for every pose.
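As a sketch of that offline alternative, assuming the same [x, y, z, rx, ry, rz] rotation-vector-in-degrees convention, one relative offset can be composed onto many source poses at once with scipy (function and variable names here are illustrative, not part of the skill's API):

```python
import numpy as np
from scipy.spatial.transform import Rotation


def batch_pose_transform(source_poses, relative_transform):
    """Apply one relative offset (in each source's own frame) to N source poses."""
    src = np.asarray(source_poses, dtype=float)        # shape (N, 6)
    rel = np.asarray(relative_transform, dtype=float)  # shape (6,)
    R_src = Rotation.from_rotvec(src[:, 3:], degrees=True)  # N source rotations
    R_rel = Rotation.from_rotvec(rel[3:], degrees=True)
    # Rotate the local offset into each source frame, then add the source position
    positions = src[:, :3] + R_src.apply(rel[:3])
    # Compose orientations: source rotation followed by the relative rotation
    orientations = (R_src * R_rel).as_rotvec(degrees=True)
    return np.hstack([positions, orientations])


poses = [
    [0.5, 0.0, 0.3, 0.0, 0.0, 0.0],   # unrotated pose
    [0.5, 0.2, 0.3, 0.0, 0.0, 90.0],  # rotated 90 degrees about Z
]
print(batch_pose_transform(poses, [0.1, 0.0, 0.0, 0.0, 0.0, 0.0]))
```

For the rotated pose, the local +X offset becomes a +Y offset in the base frame, matching the single-pose behavior of the skill without any controller calls.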