Here is my solution. Because I had to restructure the gripper (another story), I couldn't get either of the two finger links through the /gazebo/link_states topic, only another link belonging to the gripper (arm_wrist_flex_link). With this position and the tf-tree transformation from arm_wrist_flex_link to gripper_active_link, I could calculate the position of the right gripper finger link (gripper_active_link) in the /world coordinate system. I then calculated the unit vectors of the PickMeUp object (described in the /world coordinate system). With these vectors I was able to define a cuboid in the /world coordinate system in which the right gripper finger has to be in order to pick up the object. Finally, I check whether the position of the gripper is within this bounding box. If it is, the gripper is in the correct position.

This is how I ended up implementing it (shortened):

import numpy as np
import rospy
import tf
from gazebo_msgs.msg import LinkStates

# (this snippet assumes a ROS node has already been initialized, e.g. via rospy.init_node)
object_positions = rospy.wait_for_message('/gazebo/link_states', LinkStates, timeout=5) # array of poses (= position & orientation of some objects/links)
indices = [i for i, s in enumerate(object_positions.name) if 'pickMeUp' in s]
indexOfPickMeUp = indices[0]
pickup_pose = object_positions.pose[indexOfPickMeUp]
pickup_position = np.matrix([[pickup_pose.position.x], [pickup_pose.position.y], [pickup_pose.position.z]]) # as a column vector, so it can be combined with the unit vectors below

# We can only get the arm_wrist_flex_link via the /gazebo/link_states topic, so we get the pose of this link (which doesn't move relative to the finger links) and use this arm_wrist_flex_link pose to calculate the position of the two finger links
indices = [i for i, s in enumerate(object_positions.name) if 'arm_wrist_flex_link' in s]
indexOfGripper = indices[0]
arm_wrist_pose = object_positions.pose[indexOfGripper]
# Getting the translation from arm_wrist_flex_link to gripper_active_link (we don't care about the rotation, since we are only interested in the position of gripper_active_link, not its orientation). As mentioned above, these two frames don't move relative to each other.
listener = tf.TransformListener()
try:
   listener.waitForTransform("/arm_wrist_flex_link", "/gripper_active_link", rospy.Time(), rospy.Duration(4.0))
   (trans, rot) = listener.lookupTransform("/arm_wrist_flex_link", "/gripper_active_link", rospy.Time())
except (tf.LookupException, tf.ConnectivityException, tf.ExtrapolationException) as e:
   rospy.logerr("tf lookup failed: %s", e)
   raise  # trans is needed below, so there is no point in continuing
# calculating the unit vectors (described in the /world coordinate system) of the arm_wrist_flex_link
quaternion = arm_wrist_pose.orientation
explicit_quat = [quaternion.x, quaternion.y, quaternion.z, quaternion.w]
roll, pitch, yaw = tf.transformations.euler_from_quaternion(explicit_quat)
rot_mat = tf.transformations.euler_matrix(roll, pitch, yaw)
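# (equivalently, rot_mat = tf.transformations.quaternion_matrix(explicit_quat) gives the same 4x4 matrix directly)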
ex = np.dot(rot_mat, np.matrix([[1], [0], [0], [1]]))
ex = ex[:3] / ex[3]
ey = np.dot(rot_mat, np.matrix([[0], [1], [0], [1]]))
ey = ey[:3] / ey[3]
ez = np.dot(rot_mat, np.matrix([[0], [0], [1], [1]]))
ez = ez[:3] / ez[3]

# calculating the position of gripper_active_link (gripper_right_position, described in the world coordinate system)
gripper_right_position = np.matrix([[arm_wrist_pose.position.x],[arm_wrist_pose.position.y],[arm_wrist_pose.position.z]])
gripper_right_position = gripper_right_position + trans[0]*ex + trans[1]*ey + trans[2]*ez
print(gripper_right_position) # e.g. [[0.09] [0.013] [0.058]]

# calculating the unit vectors (described in the /world coordinate system) of the pickMeUp::link
quaternion = pickup_pose.orientation
explicit_quat = [quaternion.x, quaternion.y, quaternion.z, quaternion.w]
roll, pitch, yaw = tf.transformations.euler_from_quaternion(explicit_quat)
rot_mat = tf.transformations.euler_matrix(roll, pitch, yaw)
ex = np.dot(rot_mat, np.matrix([[1], [0], [0], [1]]))
ex = ex[:3] / ex[3]
ey = np.dot(rot_mat, np.matrix([[0], [1], [0], [1]]))
ey = ey[:3] / ey[3]
ez = np.dot(rot_mat, np.matrix([[0], [0], [1], [1]]))
ez = ez[:3] / ez[3]

# corners of the bounding box in which gripper_right_position has to be
# (pickup_xdim, pickup_ydim, pickup_zdim and gripper_width are set elsewhere in the full script)
p1 = pickup_position + pickup_xdim/2 * ex + pickup_ydim/2 * ey
p2 = p1 - pickup_xdim * ex
p4 = p1 + ey * (pickup_ydim/2 + gripper_width - pickup_ydim)
p5 = p1 + ez * pickup_zdim

# the vectors of the bounding box where gripper_right_position has to be
# transpose is necessary because np.dot can't handle two (3,1) matrices, so one of them has to be a (1,3) matrix
u = np.transpose(p2 - p1)
v = np.transpose(p4 - p1)
w = np.transpose(p5 - p1)

# checking if gripper_right_position is in the correct location (i.e. within the cuboid described by u, v & w) (see https://math.stackexchange.com/questions/1472049/check-if-a-point-is-inside-a-rectangular-shaped-area-3d)
grasp_possible = False
if np.dot(u, gripper_right_position) > np.dot(u, p1) and np.dot(u, gripper_right_position) < np.dot(u, p2):
    print("gripper is in correct position (x-axis)")
    if np.dot(v, gripper_right_position) > np.dot(v, p1) and np.dot(v, gripper_right_position) < np.dot(v, p4):
        print("gripper is in correct position (y-axis)")
        if np.dot(w, gripper_right_position) > np.dot(w, p1) and np.dot(w, gripper_right_position) < np.dot(w, p5):
            print("gripper is in correct position (z-axis) --> grasping would be successful")
            grasp_possible = True
if not grasp_possible:
    print("grasping would fail")
