MoveIt! move_group_python_interface_tutorial.py fails to produce cartesian path

asked 2023-05-15 06:48:11 -0500 by Jaú Gretler

updated 2023-05-16 08:04:31 -0500 by Mike Scheutzow

Hi

I have the following setup:

  • OS: Ubuntu 20.04

  • ROS Distro: Noetic

One of my ultimate goals for the robot is to have its end-effector move through a series of waypoints, using the blend_radius feature of the sequence capability provided by the Pilz Industrial Motion Planner plugin for MoveIt!:

https://ros-planning.github.io/moveit...

To see how movement through waypoints looks without blending, and to get familiar with the move_group node, I started following this tutorial:

https://ros-planning.github.io/moveit...

I am currently altering the move_group_python_interface_tutorial.py code to fit it to my robot.

Planning to a joint_goal or a pose_goal works fine with the above-mentioned Python script; at least it shows the desired robot movements in RViz (I have not tested it on the real robot yet, but that is irrelevant for my question here).

I prepared two poses (w1, w2) and checked whether MoveIt could find a trajectory for (w0, w1) and for (w0, w2), with w0 being the starting pose of the end-effector for the desired planning group. In both cases MoveIt planned and displayed a correct trajectory in RViz.

This led me to believe that it would also be possible to create an array of waypoints [w0, w1, w2] and to plan a Cartesian path going through them. But when I ask the move_group node to plan this Cartesian path, it does not find one.

Here is the relevant code snippet, which specifies the waypoints and tries to compute a Cartesian path:

 # Create waypoints
 current_pose = tutorial.move_group.get_current_pose().pose
 w0 = copy.deepcopy(current_pose)
 w1 = copy.deepcopy(w0)
 w1.position.z += 0.05
 w1.position.y += 0.07
 w2 = copy.deepcopy(w0)
 w2.position.z += 0.09
 w2.position.y += 0.08
 w3 = copy.deepcopy(current_pose)  # note: w3 is not used below
 waypoints = [w0, w1, w2]

 # Plan (arguments: waypoints, eef_step = 0.0005 m, jump_threshold = 0.0)
 (plan, fraction) = tutorial.move_group.compute_cartesian_path(waypoints, 0.0005, 0.0)
 # Display
 display_trajectory = moveit_msgs.msg.DisplayTrajectory()
 display_trajectory.trajectory_start = tutorial.robot.get_current_state()
 display_trajectory.trajectory.append(plan)
 # Publish
 tutorial.display_trajectory_publisher.publish(display_trajectory)
 # Execute
 tutorial.move_group.execute(plan, wait=True)

This snippet is an adapted version of the code from move_group_python_interface_tutorial.py using the exact same functions in the exact same order.

When the original move_group_python_interface_tutorial.py is executed with the Panda robot, it prints in the terminal:

[ INFO] [1684139671.501745044]: Attempting to follow 3 waypoints for link 'panda_link8' using a step of 0.010000 m and jump threshold 0.000000 (in global reference frame)

[ INFO] [1684139671.507262405]: Computed Cartesian path with 44 points (followed 100.000000% of requested trajectory)

When I run move_group_python_interface_tutorial.py with the only modification being that my robot's end-effector, planning group, and a smaller step are used, the terminal says:

[ INFO] [1684140333.395828072]: Attempting to follow 3 waypoints for link 'L_Hand_Link' using a step of 0.001000 m and jump threshold 0.000000 (in global reference frame)

[ INFO] [1684140333.401456556]: Computed Cartesian path with 1 points (followed 0.000000% of requested trajectory)

And in RViz there is no movement, just a constant flickering of the robot (but only if ... (more)


Comments

The notification "This post is awaiting moderation, please do not repost the same content!" appeared above my post; what exactly did I do wrong? Did someone else already post about this problem?

Jaú Gretler ( 2023-05-15 08:10:02 -0500 )

The notification "This post is awaiting moderation, please do not repost the same content!" appeared above my post; what exactly did I do wrong?

Spam is a huge problem for this site. New users whose posts include URLs have to wait for a human to verify the post is not spam. This rule applies to everyone.

Mike Scheutzow ( 2023-05-16 08:08:22 -0500 )

Thanks for the info and for the beautification of my post!

Jaú Gretler ( 2023-05-16 08:38:28 -0500 )
  • You typically should not specify the current_pose as the first waypoint (the planner will pick that up from the robot_state you pass.)

  • In my opinion, 0.5 mm is an insanely small step length for a plan for a normal-sized arm (and the info message tells you it got rounded up to 1.0 mm anyway.) Try a bigger step (see the sketch below).

followed 0.000000% of requested trajectory

If this message shows any value less than 100%, the path planning has failed. Usually it means that the inverse kinematics (IK) solver was unable to find joint values within the specified Cartesian tolerance of a step's goal. Try increasing the tolerance.
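For example, putting the two suggestions above together, a minimal sketch of the adjusted call, assuming the same tutorial object from the script (the 1 cm step is only a starting value to tune):

 # Sketch only: drop the current pose from the waypoint list and use a coarser step.
 waypoints = [w1, w2]   # w0 (the current pose) removed; the planner starts from the robot state
 eef_step = 0.01        # 1 cm; tune for your robot's size
 jump_threshold = 0.0   # disabled, as in the tutorial
 (plan, fraction) = tutorial.move_group.compute_cartesian_path(waypoints, eef_step, jump_threshold)
 print("Planned %.1f%% of the requested path" % (fraction * 100.0))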

Mike Scheutzow ( 2023-05-16 08:39:26 -0500 )

Thanks a lot for your advice! I removed the first waypoint (current_pose) and played around with the step size. A Cartesian plan was successfully computed after setting the step size to 9 cm. I must admit I don't understand what this step size means exactly. My robot is only about 20 cm high and its arms are about 10 cm long. I thought that if I set two waypoints at a vertical distance of, let's say, 5 cm, then the Cartesian plan would interpolate this distance with maybe 10 points, so each step between interpolated points being around 5 mm seemed reasonable to me at the time. But that understanding is obviously wrong, since now a path between two waypoints which are approx. 5 cm apart is successfully computed using an eef_step_size of 9 cm. So what does that eef_step_size actually refer to?

Jaú Gretler ( 2023-05-17 07:45:02 -0500 )

compute_cartesian_path() does linear interpolation of the eef pose between the waypoint poses that you passed into the function. This method creates additional waypoints with the distance set by the eef_step_size.
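A rough, position-only illustration of that idea (not the actual MoveIt implementation; it ignores orientation and IK, and the function name is made up):

 import math

 def interpolate_segment(p_start, p_end, eef_step):
     # Linearly interpolate between two positions so that consecutive
     # samples are at most eef_step apart (illustration only).
     dist = math.dist(p_start, p_end)             # Python 3.8+
     n = max(int(math.ceil(dist / eef_step)), 1)  # at least one step
     return [tuple(s + (e - s) * i / n for s, e in zip(p_start, p_end))
             for i in range(1, n + 1)]

 # Two waypoints ~5 cm apart with eef_step = 0.09 m give n = 1, so only the
 # end point is added and IK only has to succeed once for that segment.
 print(interpolate_segment((0.0, 0.0, 0.0), (0.0, 0.05, 0.0), 0.09))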

Mike Scheutzow ( 2023-05-18 06:30:04 -0500 )

I am aware of that. But isn't it weird that when I set eef_step_size = 9 cm it manages to interpolate between points which are less than 9 cm apart?

Jaú Gretler ( 2023-05-18 08:29:36 -0500 )

But anyway, that does not bother me too much now that it is working. I do have a different question now (I wasn't sure whether I should open a new forum entry for it, but since I have described my whole setup here I will just continue in this thread).

So as stated above, I wish to set waypoints which the robot eef moves through. It's cool that this works by setting the waypoints in code now, but it would be a lot better if I could set the points visually in RViz and then let them be interpolated. I have found a plugin which promises to do pretty much that, but I can't get it to work. I think it was made for ROS Hydro.

Another idea is to make use of the fact that I can move the eef via the MoveIt MotionPlanning interface. Is there ...(more)

Jaú Gretler ( 2023-05-18 08:34:59 -0500 )

1 Answer


answered 2023-05-16 09:04:02 -0500 by Mike Scheutzow

updated 2023-05-18 06:53:01 -0500

My robot has less than 3 DOF. It is roughly humanoid and I am working with the left arm at the moment, which has certain joint limits that dictate a certain reachable space.

I missed this comment at first. Yes, it's going to be a problem, especially if you have to rely on Cartesian planning. Increasing the Cartesian tolerance will get you started, but will likely give imprecise results. If you are seeing IK failures during planning, try increasing the values of GoalPositionTolerance and GoalOrientationTolerance.

If you have to support "random" goals (goals you can't pre-plan for) and need to rapidly calculate high-quality paths, you may have to create a custom OMPL planner that understands the physical limitations of your robot.
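If you are using the Python moveit_commander interface, a sketch of one way to loosen these tolerances (the numeric values are placeholders to tune, and whether they affect the Cartesian IK step depends on your setup):

 # Sketch: loosen goal tolerances on the MoveGroupCommander.
 move_group = tutorial.move_group
 move_group.set_goal_position_tolerance(0.01)     # meters
 move_group.set_goal_orientation_tolerance(0.1)   # radians
 move_group.set_goal_joint_tolerance(0.01)        # radians, for joint-space goals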


Comments

Thank you for this answer as well. So far the planning has worked well, but I will keep the option of creating a custom OMPL planner in the back of my mind. Thanks!

Jaú Gretler ( 2023-05-31 11:56:16 -0500 )
