
Calculate position using robot_localization

asked 2017-01-23 15:22:22 -0600

b-sriram

updated 2017-01-26 07:27:11 -0600

spmaniato

Hi,

I have a cart with a four-wheeled base that is front-wheel driven by two motors, with an encoder fitted to each front wheel.

I calculate the distance traveled by counting encoder ticks, together with the heading angle. Hence the distance covered and the angle are in the global coordinate frame.

I also calculate the instantaneous X velocity and angular rate, which are in the cart coordinate frame (forward and right are positive).

I also have a razor_imu IMU which gives me the orientation, linear velocity and linear acceleration in the cart frame (I think?).

When I use the robot_localization package to calculate odom_combined with the following launch file (robot_localization.png), the orientation seems to be right, but my position keeps accumulating even when my cart has turned 180 degrees. The problem is that the node only accepts the velocity values, and not the position itself, as input from the odom message. So if the cart turns 180 degrees, my instantaneous velocity is still positive.
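Since the launch file is attached only as an image, here is a rough sketch of the kind of configuration it contains; the topic names and exact parameter values below are placeholders, not necessarily my real ones:

    <launch>
      <node pkg="robot_localization" type="ekf_localization_node"
            name="ekf_localization" clear_params="true">
        <param name="frequency" value="30"/>
        <param name="two_d_mode" value="true"/>

        <param name="odom_frame" value="odom"/>
        <param name="base_link_frame" value="base_link"/>
        <param name="world_frame" value="odom"/>

        <!-- Wheel-encoder odometry: both the pose and the velocity are fused here -->
        <param name="odom0" value="/wheel_odom"/>
        <!-- order: x, y, z, roll, pitch, yaw, vx, vy, vz, vroll, vpitch, vyaw, ax, ay, az -->
        <rosparam param="odom0_config">[true,  true,  false,
                                        false, false, true,
                                        true,  false, false,
                                        false, false, true,
                                        false, false, false]</rosparam>

        <!-- IMU: orientation, yaw rate and linear acceleration -->
        <param name="imu0" value="/imu/data"/>
        <rosparam param="imu0_config">[false, false, false,
                                       true,  true,  true,
                                       false, false, false,
                                       false, false, true,
                                       true,  false, false]</rosparam>
        <param name="imu0_remove_gravitational_acceleration" value="true"/>
      </node>
    </launch>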

Can someone please suggest a solution?

The measurement covariance of the IMU is imu_covariance.png, and for the encoders, encoder_covariance.png.

Thanks, Sriram


Comments

I see at least a few areas for improvement. I would start here: http://docs.ros.org/indigo/api/robot_...

spmaniato  ( 2017-01-23 15:31:15 -0600 )

I already read that. That's why I've set the initial covariances and the other params the way I did.

b-sriram  ( 2017-01-23 15:34:48 -0600 )

And also take a look at this example: https://github.com/cra-ros-pkg/robot_...

spmaniato  ( 2017-01-23 15:34:55 -0600 )

Hi, thanks for the help, but I did not post the question here without trying every possible solution first.

b-sriram  ( 2017-01-23 15:37:12 -0600 )

The documentation I linked to suggests that (in most cases) you shouldn't be fusing both (x, y) and (vx, vy) from odometry. Rather, fuse only the linear velocities, not the position. I would personally do the same for the orientation and angular velocity from odom.
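As a sketch (the parameter name is real, the topic and values are just for illustration), a velocity-only odometry config would look something like:

    <!-- fuse only vx, vy and vyaw from wheel odometry; leave x, y and yaw false -->
    <rosparam param="odom0_config">[false, false, false,
                                    false, false, false,
                                    true,  true,  false,
                                    false, false, true,
                                    false, false, false]</rosparam>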

spmaniato  ( 2017-01-23 15:45:04 -0600 )

I tried exactly that, but it didn't help. So I tried with only (x, y), but the node for some reason doesn't accept it. So I included both.

b-sriram  ( 2017-01-23 15:47:30 -0600 )

OK. The other thing that jumped out at me from your question is: "IMU ... gives me the orientation, linear velocity and linear acceleration in the cart frame". The IMU is most likely reporting data in its own frame. Whether that happens to coincide with your vehicle's frame is a matter of placement.

spmaniato  ( 2017-01-23 15:54:11 -0600 )

Yes, what I meant was that it's not in the global frame. I have a transform which tells me where the IMU is located relative to the base.
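I publish it with a static_transform_publisher along these lines (the offsets and frame names here are placeholders, not my real values):

    <!-- base_link -> imu: args are x y z yaw pitch roll parent child period_in_ms -->
    <node pkg="tf" type="static_transform_publisher" name="base_to_imu"
          args="0.1 0.0 0.2 0 0 0 base_link imu 100"/>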

b-sriram  ( 2017-01-23 15:56:47 -0600 )

1 Answer


answered 2017-01-26 02:53:06 -0600

Tom Moore

This started out as a comment, but I'll add a few things:

  1. Can you please post (a) your launch files and (b) some sample input messages (as text)?
  2. You said your coordinate frame has forward and right as positive, which is incorrect. REP-103 specifies +X as forward and +Y as left.
  3. Instantaneous velocity, if driving forward, will always be positive, no matter which way your robot is facing. This is correct. The filter converts velocities from the robot's body frame to the world frame internally. As far as the EKF output goes, the ROS nav_msgs/Odometry message specifies that the output pose data is in the world frame (header.frame_id in the message) and the velocities are in the body frame (child_frame_id in the message).
  4. Looking at your configuration image, you are fusing both the calculated pose and the velocity from your wheel encoders, but those both come from the same source. You're effectively feeding the filter the same information twice. It doesn't necessarily hurt, but I'd give it one or the other (see the sketch below this list).
  5. You are fusing yaw from both your IMU and odometry. Those are going to diverge, so you'll have to turn one of them off.
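
For points 4 and 5, a sketch of what I mean (a sketch only; adapt the topic names and the rest of the launch file to your own setup): take just velocities from the wheel odometry, and let the IMU own the orientation:

    <!-- wheel odometry: velocities only -->
    <param name="odom0" value="/wheel_odom"/>
    <rosparam param="odom0_config">[false, false, false,
                                    false, false, false,
                                    true,  true,  false,
                                    false, false, true,
                                    false, false, false]</rosparam>

    <!-- IMU: orientation (plus yaw velocity) only -->
    <param name="imu0" value="/imu/data"/>
    <rosparam param="imu0_config">[false, false, false,
                                   true,  true,  true,
                                   false, false, false,
                                   false, false, true,
                                   false, false, false]</rosparam>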

Comments

Hi Tom,

Sorry for the late response. First, thanks a ton. My coordinate system was defined wrong. I corrected it as you said (+X forward and +Y left) and now it seems much better. I'm working on the covariance matrices to get a better result.

b-sriram  ( 2017-02-12 11:58:36 -0600 )
