tf synchronization

asked 2011-05-19 05:59:57 -0600 by miltos

Hi! I am trying to synchronize a camera input with a servo that has no feedback, using the tf library. More specifically:

1) There is a camera node (cameraNode) that asynchronously (with a ros::Timer) takes pictures and publishes them on a ROS topic with a timestamp.

2) There is a node (servoNode) that controls a servo and some other hardware that has no feedback.

3) There is a node (armNode) that controls a set of motors (and the servo, through servoNode) and publishes tf transforms with their positions.

4) There is a node (visionNode) that processes the images and publishes results about them (with the image timestamp).

5) There is a node (dataNode) that receives the results and time-travels (with tf) to obtain actual points in a coordinate system.

All of these seem to work fine individually, but sometimes the camera takes a picture just before the servoNode returns and the armNode publishes the new tf transform. Then, when the dataNode calls lookupTransform (with the camera timestamp), it gets a transform that is not correct. How can these be synchronized? Should the armNode stop broadcasting tf transforms (and if so, how would that affect the dataNode's ability to transform correctly)?

Using a tf::MessageFilter doesn't seem right, because tf transforms might be broadcast just before the servo starts moving, and the camera may take up to 200 ms to take a picture. Any suggestions?
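For reference, this is roughly how the dataNode does the lookup (a minimal sketch; the frame names "base_link" and "camera_link" and the topic "vision/point" are placeholders, not my real names):

    #include <ros/ros.h>
    #include <tf/transform_listener.h>
    #include <geometry_msgs/PointStamped.h>

    tf::TransformListener* listener;

    void resultCallback(const geometry_msgs::PointStamped::ConstPtr& pt_in_camera)
    {
        geometry_msgs::PointStamped pt_in_base;
        try
        {
            // Wait for a transform valid at the image timestamp, then transform
            // the detected point into the base frame ("time travel").
            listener->waitForTransform("base_link", pt_in_camera->header.frame_id,
                                       pt_in_camera->header.stamp, ros::Duration(0.5));
            listener->transformPoint("base_link", *pt_in_camera, pt_in_base);
        }
        catch (tf::TransformException& ex)
        {
            ROS_WARN("tf lookup failed: %s", ex.what());
        }
    }

    int main(int argc, char** argv)
    {
        ros::init(argc, argv, "dataNode");
        ros::NodeHandle nh;
        listener = new tf::TransformListener();
        ros::Subscriber sub = nh.subscribe("vision/point", 10, resultCallback);
        ros::spin();
        return 0;
    }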


2 Answers


answered 2011-05-19 18:31:45 -0600 by tfoote

So what I understand is that you have a somewhat open-loop system, where you cannot know where the servo is between commanding a movement and the command returning. If that is the case, you should probably set a flag and only process camera data from periods when you know where the servo positions are. There are many different ways to do this, for example:

  1. You could have a flag to start and stop the camera timer.
  2. Or you could start and stop the image processing based on a similar flag, as sketched below.
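A minimal sketch of option 2, assuming a hypothetical std_msgs/Bool topic "servo_moving" that servoNode publishes before and after each move (the topic and node names are only placeholders):

    #include <ros/ros.h>
    #include <std_msgs/Bool.h>
    #include <sensor_msgs/Image.h>

    bool servo_moving = false;

    void movingCallback(const std_msgs::Bool::ConstPtr& msg)
    {
        servo_moving = msg->data;   // set by servoNode before a move, cleared after
    }

    void imageCallback(const sensor_msgs::ImageConstPtr& img)
    {
        if (servo_moving)
            return;                 // skip frames taken while the pose is unknown
        // ... normal image processing ...
    }

    int main(int argc, char** argv)
    {
        ros::init(argc, argv, "visionNode");
        ros::NodeHandle nh;
        ros::Subscriber flag_sub  = nh.subscribe("servo_moving", 1, movingCallback);
        ros::Subscriber image_sub = nh.subscribe("camera/image", 1, imageCallback);
        ros::spin();
        return 0;
    }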

Comments

Thanks for that. That's what I did: I created a "RemoteMutex", and the camera is now working fine. Thanks!
miltos ( 2011-05-21 00:04:38 -0600 )

answered 2011-05-19 06:35:47 -0600

Hi!

This is just a suggestion, but you could take the picture and read the camera position inside the same node (maybe the cameraNode), so that it publishes both the picture and the tf with the same timestamp. The dataNode would work the same way as now. The idea is to grab a picture only when a camera position reading is available, thus synchronizing the two.
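A minimal sketch of that idea, with hypothetical frame names, topic names, and pose values; the only important part is that the image and the transform share the same ros::Time stamp:

    #include <ros/ros.h>
    #include <tf/transform_broadcaster.h>
    #include <sensor_msgs/Image.h>

    int main(int argc, char** argv)
    {
        ros::init(argc, argv, "cameraNode");
        ros::NodeHandle nh;
        ros::Publisher image_pub = nh.advertise<sensor_msgs::Image>("camera/image", 1);
        tf::TransformBroadcaster br;

        ros::Rate rate(5.0);                     // capture rate, placeholder value
        while (ros::ok())
        {
            ros::Time stamp = ros::Time::now();  // one stamp for both messages

            sensor_msgs::Image img;              // filled in by the real capture code
            img.header.stamp = stamp;
            image_pub.publish(img);

            // Pose of the camera at the moment of capture (placeholder values).
            tf::Transform t;
            t.setOrigin(tf::Vector3(0.1, 0.0, 0.3));
            t.setRotation(tf::Quaternion(0, 0, 0, 1));
            br.sendTransform(tf::StampedTransform(t, stamp, "base_link", "camera_link"));

            ros::spinOnce();
            rate.sleep();
        }
        return 0;
    }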

You could also increase the frequency at which armNode publishes tf data, in order to reduce your 200 ms gap to something you can live with (if possible; that probably depends on the hardware).

Hope this helps,

Gonçalo Cabrita


Comments

Thanks for the answer. I have thought of that, and it's probably what I am going to do. I was looking for a more "elegant" solution, though. Thanks anyway!
miltos ( 2011-05-19 09:52:34 -0600 )

Stats

Asked: 2011-05-19 05:59:57 -0600

Seen: 1,953 times

Last updated: May 19 '11