
IMU + Odometry Robot Localization Orientation Issue

asked 2019-08-28 13:46:09 -0600

jawsqb

updated 2019-08-29 14:56:42 -0600

Hello, I am trying to use the robot_localization package to fuse IMU and wheel-encoder odometry, such that the x and y velocities are taken from the odometry data and the heading is taken from the IMU.

However, I am seeing an issue where the fused localization does not really pay attention to the heading from the IMU: the EKF output travels along whatever heading the odometry gives, while still facing the correct IMU direction.

For example, in the picture below, I have my rover drive forward, turn right 90 degrees, and then drive forward again. Blue is the wheel odometry (its yaw estimate is very far off) and red is the EKF output.

I have also attached a bag file of this test.

[image: wheel odometry path (blue) vs. EKF output (red)]

Here are my EKF params:

  map_frame: map
  odom_frame: odom
  base_link_frame: base_footprint
  world_frame: odom

  odom0: odometry/wheel
  odom0_config: [false, false, false,   # x, y, z
                 false, false, false,   # roll, pitch, yaw
                 true,  true,  false,   # x vel, y vel, z vel
                 false, false, false,   # roll vel, pitch vel, yaw vel
                 false, false, false]   # x accel, y accel, z accel
  odom0_queue_size: 10
  odom0_nodelay: true
  odom0_differential: false
  odom0_relative: false


  imu0: /imu/data
  imu0_config: [false, false, false,   # x, y, z
                false, false, true,    # roll, pitch, yaw
                false, false, true,    # x vel, y vel, z vel
                false, false, true,    # roll vel, pitch vel, yaw vel
                false, false, true]    # x accel, y accel, z accel
  imu0_nodelay: false
  imu0_differential: false
  imu0_relative: false
  imu0_queue_size: 10
  imu0_remove_gravitational_acceleration: true
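
To make the disagreement easier to see numerically, here is a minimal sketch that prints the yaw reported by each source. It assumes the EKF publishes on its default odometry/filtered topic; adjust the topic names to match your setup.

#!/usr/bin/env python
# Minimal sketch: print yaw from the wheel odometry, the IMU, and the EKF output.
# Assumes the EKF publishes on the default odometry/filtered topic.
import rospy
from nav_msgs.msg import Odometry
from sensor_msgs.msg import Imu
from tf.transformations import euler_from_quaternion

def yaw_of(q):
    # Extract yaw (rotation about Z) from a geometry_msgs/Quaternion
    return euler_from_quaternion([q.x, q.y, q.z, q.w])[2]

def make_cb(label):
    def cb(msg):
        q = msg.orientation if isinstance(msg, Imu) else msg.pose.pose.orientation
        rospy.loginfo("%s yaw: %.3f rad", label, yaw_of(q))
    return cb

if __name__ == "__main__":
    rospy.init_node("yaw_compare")
    rospy.Subscriber("odometry/wheel", Odometry, make_cb("wheel odom"))
    rospy.Subscriber("/imu/data", Imu, make_cb("imu"))
    rospy.Subscriber("odometry/filtered", Odometry, make_cb("ekf"))
    rospy.spin()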

Thoughts?


Comments

Why do you filter out yaw in the odom0_config? Maybe you can also show an image of the fused odom output?

Here is a link with a walkthrough on configuring the EKF:

http://docs.ros.org/melodic/api/robot...

Eisenhorn ( 2019-08-29 03:32:00 -0600 )

The fused odom output is the red arrows; that is the fusion of both the odom and the IMU.

The reason I took out the yaw is that the wheel odometry is very far off when it comes to turning, so I wanted the yaw to be fused only from the IMU.

I have already tried using the params from there, but I get the same issue.

jawsqb ( 2019-08-29 07:11:17 -0600 )

3 Answers


answered 2019-09-19 13:24:14 -0600

jawsqb

Just as an update, and I'm extremely sorry for the late reply.

I was misinterpreting how the EKF was fusing the wheel odometry and the IMU. Even though the orientation from the IMU was correct, the X/Y position odometry from the wheels was very far off because the differential-drive odometry node had some wrong parameters.

Specifically, the wheel separation between the two wheels was set too small, and I had to increase it by almost 2x to get accurate turning (I found this by guess and check). Now the EKF output matches the actual robot movement.
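
For context, here is a minimal sketch of how a typical differential-drive odometry node turns wheel velocities into yaw (names and numbers are illustrative, not taken from my actual node); it shows why an undersized wheel separation inflates the integrated yaw:

# Illustrative sketch of differential-drive odometry integration.
# Values are made up; the point is the role of wheel_separation.
import math

def integrate_odometry(x, y, yaw, v_left, v_right, wheel_separation, dt):
    """One Euler step of planar differential-drive odometry."""
    v = 0.5 * (v_right + v_left)                   # forward velocity (m/s)
    omega = (v_right - v_left) / wheel_separation  # yaw rate (rad/s): too-small separation -> inflated yaw
    x += v * math.cos(yaw) * dt
    y += v * math.sin(yaw) * dt
    yaw += omega * dt
    return x, y, yaw

# Example: same wheel speeds, two different separation values.
for sep in (0.30, 0.60):  # metres (hypothetical)
    x = y = yaw = 0.0
    for _ in range(100):  # 1 s of a gentle right-hand arc at 100 Hz
        x, y, yaw = integrate_odometry(x, y, yaw, 0.55, 0.45, sep, 0.01)
    print("separation %.2f m -> yaw %.3f rad" % (sep, yaw))

With half the true separation, the same wheel speeds produce twice the reported yaw, which matches the kind of error I was seeing.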

I also have the EKF fusing a position estimate from an indoor GPS, which is now giving a more accurate position estimate.

However, I am now getting a different orientation issue: the y-direction movement is flipped in the global frame. Should I create a new post for that?


Comments

Is this an answer or an update?

jayess ( 2019-09-19 15:38:30 -0600 )

answered 2019-08-29 12:10:00 -0600

Try these; I just pulled this out of one of my config files:

imu0: /imu
imu0_config: [
    false,  false,  false,  # x, y, z
    true,   true,   true,   # roll, pitch, yaw (robot_localization recommends fusing orientation from the IMU)
    false,  false,  false,  # x vel, y vel, z vel
    false,  false,  false,  # roll vel, pitch vel, yaw vel
    false,  false,  false   # x accel, y accel, z accel (ideally you would want true, true, false)
]

odom0: /odom
odom0_config: [
    false,  false,  false,  # x, y, z
    false,  false,  false,  # roll, pitch, yaw
    true,   true,   true,   # x vel, y vel, z vel
    false,  false,  false,  # roll vel, pitch vel, yaw vel
    false,  false,  false   # x accel, y accel, z accel
]
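
(For comparison with the config in the question: this fuses the IMU's absolute roll/pitch/yaw rather than yaw plus yaw velocity, and fuses all three linear velocities from the wheel odometry rather than only x/y velocity.)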

answered 2019-09-05 07:39:49 -0600

SamsAutonomy

updated 2019-09-05 07:44:26 -0600

Four questions:

1) Do you have a second EKF estimating the map frame? If you do, start with only one EKF for now. From the documentation:
Note If your system does not have a map_frame, just remove it, and make sure world_frame is set to the value of odom_frame.

2) Why are you fusing Z velocity from your IMU into the EKF?

imu0: /imu/data
imu0_config: [false, false, false,
              false, false, true,
              false, false, true,   # This one here?
              false, false, true,
              false, false, true]   # Are you using this value? ie: two_d_mode: true? Doesn't look like you are.

I don't know if there is any IMU that reports Z velocity, but if yours does I would be interested to know.

Also, are you estimating Z position/velocity anywhere? If no sensor is correcting Z and two_d_mode is not actually set to true, the fused Z acceleration can produce interesting results in Z.

3) You are also fusing Z angular velocity; what IMU driver are you using? I have found that some drivers report angular velocity and orientation with two different conventions, and it is incredibly hard to mount the IMU so that they physically match. (This comes from flipping from NED to ENU by doing something like X->Y, Z->-Z; the proper way is to do a quaternion multiply: https://github.com/ros-drivers/micros... )
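
For what it's worth, here is a minimal sketch of the world-frame part of that conversion done with a single quaternion multiply. It assumes the body axes already follow the ROS x-forward/y-left/z-up convention; some drivers also need a body-frame (FRD to FLU) correction, so treat this as a starting point rather than the exact fix for any particular driver.

# Sketch: convert an orientation reported relative to a NED world frame into ENU
# with a quaternion multiply, instead of swapping/negating axes by hand.
# Assumes the body axes already follow REP-103 (x forward, y left, z up).
import math
from tf.transformations import quaternion_multiply

# A 180-degree rotation about the (1, 1, 0) axis maps NED axes onto ENU axes.
s = math.sqrt(0.5)
Q_NED_TO_ENU = [s, s, 0.0, 0.0]   # quaternion as (x, y, z, w)

def ned_to_enu(q_ned):
    """q_ned is (x, y, z, w) relative to NED; returns (x, y, z, w) relative to ENU."""
    return quaternion_multiply(Q_NED_TO_ENU, q_ned)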

4) I don't see the attached bagfile, am I missing it somewhere?

