
Confused with setup for robot_localization

asked 2018-08-13 13:45:26 -0600

nprof50

Hello,

I've just recently started trying to implement the robot_localization node, and I'm having trouble understanding what all to enable and what I should expect back. Here is my current setup:

  • I have a robot that reports its odometry as (x, y, yaw) based on ticks from its wheel encoders. It also has an IMU that reports (roll, pitch, yaw).
  • I am attempting to fuse the (x, y, yaw) odometry information with the (yaw) information from the IMU.
  • I haven't really tweaked covariances or anything yet; I am mostly using the default values from the template (a sketch of my config is below).
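
For reference, here is roughly what my parameter file for ekf_localization_node looks like (topic names are placeholders; covariances and everything else are the template defaults):

    frequency: 30
    two_d_mode: true
    odom_frame: odom
    base_link_frame: base_link
    world_frame: odom

    odom0: /wheel/odometry
    odom0_config: [true,  true,  false,    # x, y, z
                   false, false, true,     # roll, pitch, yaw
                   false, false, false,    # vx, vy, vz
                   false, false, false,    # vroll, vpitch, vyaw
                   false, false, false]    # ax, ay, az

    imu0: /imu/data
    imu0_config: [false, false, false,
                  false, false, true,      # yaw only
                  false, false, false,
                  false, false, false,
                  false, false, false]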

What I am not understanding:

  • The wheel odometry is inherently going to be incorrect due to wheel slippage, so while the distance the robot traveled is still pretty accurate, the odometry's yaw value can't be trusted over time. But the odometry's x, y are calculated using yaw, so I'd guess those would be wrong as well. Am I supposed to take the result of the robot_localization output and "reset" the state of the wheel odometry (specifically yaw)? I'm not sure if this is something the node handles internally or not.
  • My main concern is that odometry reports being at (5, 5, 45 degrees) after a bit of movement, and the IMU reports yaw is actually 180. Then the X, Y that the odometry reported would also be wrong. How does this node handle a situation like that? Would it somehow understand to adjust the X, Y values based on the IMU yaw reading being the correct one?
  • Should I be passing more information to the node? I am only passing x, y, yaw (odometry), and yaw (IMU). I can't really understand how it can take that information and correct the x, y position relative to what the IMU is saying. If the IMU and odometry disagree entirely on the yaw value, what happens? I.e. what happens to the X and Y values when my odometry says I'm turned 45 degrees and the IMU says I'm at 180 degrees?
  • I saw somewhere that I shouldn't be fusing the x, y of the odometry, and should instead fuse velocity_x, velocity_y. But in regards to my first bullet point in this list, wouldn't the velocities be subject to the same issues as position? The yaw known by odometry would be used to calculate the velocities as well, giving the same error as the position if the yaw is wrong.
  • When I look at the transform in rviz, I notice that if I lift the robot up off the floor and turn it 180 degrees, the transform does not change. It's almost as if it's completely ignoring the IMU yaw position and setting itself purely with the odometry information. Why would I not see a change in the transform if I pick up the robot and rotate it to change the IMU value?

I don ...


Comments

Have you figured out how to configure robot_localization for odometry + IMU? I am currently stuck on this issue (see my latest question).

aarontan ( 2018-09-10 18:55:59 -0600 )

1 Answer


answered 2018-08-14 12:58:05 -0600

autonomy
  1. Why do you say (x, y) is calculated using yaw? It should be the other way around: the encoder counts from each wheel feed into a formula that takes the wheel base and wheel radius into account https://www.cs.princeton.edu/courses/... (I'm assuming a skid-steer platform here; see the sketch below). You shouldn't do anything to the wheel odometry like resetting it - you'd just be building up error that way. It would be like running a second EKF to process the results of the first one - no benefit.
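
To make the dead reckoning concrete, here is a minimal sketch of the usual differential-drive update from per-wheel distances (variable names are mine, not from any particular package):

    import math

    def dead_reckon(x, y, yaw, d_left, d_right, wheel_base):
        """One odometry step from per-wheel travel distances.

        d_left / d_right: distance each wheel covered since the last
        update (encoder ticks * meters per tick); wheel_base: distance
        between the wheels.
        """
        d_center = (d_left + d_right) / 2.0      # forward travel
        d_yaw = (d_right - d_left) / wheel_base  # heading change, from counts alone
        # The position update projects the traveled distance through the
        # current heading, which is how a yaw error leaks into (x, y).
        x += d_center * math.cos(yaw + d_yaw / 2.0)
        y += d_center * math.sin(yaw + d_yaw / 2.0)
        return x, y, yaw + d_yaw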

  2. What is the yaw reading from the IMU when you start your robot; is it 0? Where does the robot actually point, relative to its starting orientation, when odometry reads 45 (i.e. how close is that reading to the truth)? Read http://docs.ros.org/melodic/api/robot... : "This setting is especially useful if your robot has two sources of absolute pose information, e.g., yaw measurements from odometry and an IMU". IMU orientation is usually absolute.

  3. It depends on what you're trying to achieve. In a nutshell, the filter will not correct (x, y) using yaw. You can try fusing accelerations or velocities from the IMU, if it provides them, but that data is likely to be extremely noisy. Readings from multiple sensors are weighted by their covariances, so if there's a disagreement the noise will be incorporated into the estimate, but if you've set your covariances right it won't affect the estimate too much; a toy illustration is below.
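
As a toy illustration of that weighting (a scalar Kalman measurement update, not the actual filter internals; the numbers are made up and angle wrap-around is ignored):

    def fuse(est, est_var, meas, meas_var):
        """The source with the lower variance pulls the estimate harder."""
        k = est_var / (est_var + meas_var)  # Kalman gain
        return est + k * (meas - est), (1.0 - k) * est_var

    # Odometry-based yaw says 45 deg (high variance), IMU says 180 deg (low):
    yaw, var = fuse(45.0, 25.0, 180.0, 1.0)
    print(yaw)  # ~174.8 deg -- pulled almost entirely toward the IMU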

  4. I'd say there's no right or wrong here; it depends on the data you're dealing with. Your encoders are your only source of that data, so whether you calculate position or velocity, the values will only be as good as the encoders. It's just that you might want to integrate velocities instead of fusing absolute measurements, for convenience. That might be applicable in your case, and you might also want to use the differential parameter (see 2): the two estimates of yaw may diverge greatly over the long run, but for an instantaneous update the difference might not be much. The one-line config change is shown below.
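
If you do go that route, the switch is a single parameter in the ekf config (shown here for the wheel-odometry input, matching the sketch in the question):

    # Fuse only the change in wheel-odometry pose between consecutive
    # updates, so the IMU stays the sole source of absolute heading:
    odom0_differential: true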

  5. This goes back to the question in your second bullet point: "Would it somehow understand to adjust the X, Y values based on the IMU yaw reading being the correct one?" You need to tell the filter that you trust the IMU's heading more, and you do that with covariance values: the yaw covariance reported by the IMU should be lower than the yaw covariance from odometry (see the snippet below). So you're likely correct in saying that the filter is not actually using the IMU.
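
For example, on the publisher side (a sketch; the actual values depend on your sensors):

    from nav_msgs.msg import Odometry
    from sensor_msgs.msg import Imu

    odom_msg = Odometry()
    imu_msg = Imu()

    # Covariance arrays are row-major: yaw variance is element [35] of the
    # 6x6 pose covariance and element [8] of the 3x3 orientation covariance.
    odom_msg.pose.covariance[35] = 0.5         # rad^2, large: distrust wheel yaw
    imu_msg.orientation_covariance[8] = 0.01   # rad^2, small: trust IMU yaw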

P.S. Can't make this a wiki post like on StackOverflow. Paging @tom-moore for a less hand-wavy answer.

