
tf transform rotation issue

asked 2011-05-24 06:13:06 -0600

MikeSands

Hello all,

We've written a gyro-based odometry node for our robot. Two encoders provide the base odometry, and a gyro corrects the yaw (e.g., when the wheels slip). The node broadcasts an odometry message as well as a stamped transform. The messages are correct as far as position and yaw are concerned; it is the transform that doesn't seem to be working correctly.

The transform tree is

/gyro_odom -> /base_link -> /UTM

When I look at our Hokuyo scans in rviz, I have the fixed frame and target frame set up so that the scans should stay in basically the same position while the robot moves around inside them. This works for translation but fails miserably for rotation: the scans get smeared rather than the tf tree simply rotating inside them.

Here is the code for broadcasting the transform:

// Now that all the calculations are done, we can pack all the data into
// odometry messages and tf.

// make the quaternion
geometry_msgs::Quaternion odom_quat = tf::createQuaternionMsgFromYaw(_yaw);

// make the tf
geometry_msgs::TransformStamped odom_tf;
odom_tf.header.stamp = _current_time;
odom_tf.header.frame_id = _odom_frame_id;
odom_tf.child_frame_id = _base_link_frame_id;
odom_tf.transform.translation.x = _x;
odom_tf.transform.translation.y = _y;
odom_tf.transform.translation.z = 0.0;
odom_tf.transform.rotation = odom_quat;

// broadcast the tf
_tf_bc.sendTransform(odom_tf);

Does anyone have any ideas as to what might be causing this to fail?


1 Answer


answered 2011-05-24 06:46:36 -0600

updated 2011-05-24 06:47:07 -0600

If you haven't already, you might want to check out robot_pose_ekf which does pretty much what you're describing: combining odom and imu/gyro data to output a transform that is better than either sensor taken separately.

If you're set on making your own version of robot_pose_ekf, the first thing I would do is visualize the odom and IMU rotation vectors separately (and simultaneously) in rviz, to make sure your IMU is outputting rotations consistent with the odometry.

Here's a simple node that takes an Imu message as input and outputs a Pose message (for visual debugging).

edit flag offensive delete link more
