
Absolute localization using GPS + IMU + visual odometry

asked 2020-09-04 07:40:31 -0600

felixN

Hi, I'm rather new to ROS, and completely new to localization using ROS.

I'm working on a robot with 3 sensors useful for localization:

1) An RTK-GNSS receiver from Ardusimple (simpleRTK kit), providing very accurate position (precision = 2 cm) in very good conditions, but not-so-good ones if too many obstacles block the view of the satellites. It outputs the position on the topic /gps/fix at 1 Hz as a sensor_msgs/NavSatFix (plus some other messages specific to the ublox_gps node, including more detailed information about the type of corrections we are getting).

2) A stereo camera (ZED2 from Stereolabs), which provides visual odometry (topic: /zed2/zed_node/odom; type: nav_msgs/Odometry). Alternatively, it also provides /zed2/zed_node/pose (type: geometry_msgs/PoseStamped) and /zed2/zed_node/pose_with_covariance (type: geometry_msgs/PoseWithCovarianceStamped).

3) An IMU integrated into the camera (its magnetometer is unusable right now, because it sits right next to a speaker with a big magnet; we ordered a separate magnetometer to put in a better location). The output topic is /zed2/zed_node/imu/data (of type sensor_msgs/Imu).

My goal is to fuse these data in order to get an accurate absolute position at all times (ideally as UTM coordinates or longitude/latitude).

One significant constraint is that I will not have a permanent GPS signal (I will filter the GPS data to keep it only when in "fixed" mode, i.e. when accuracy is very good). This step will be implemented later on (for a start, I only run tests where I know the signal will be good).
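For what it's worth, that gating step can be a very small node. Here is a minimal, hypothetical sketch of the accept/reject logic, written against the field names of sensor_msgs/NavSatFix (the function name and the 2 cm threshold are my own illustrative choices, not an existing API):

```python
# Sketch of a gate for RTK fixes. Assumes the driver reports fix quality via
# NavSatStatus.status and the position_covariance field of sensor_msgs/NavSatFix.
# The 2 cm threshold matches the precision figure quoted in the question.

STATUS_GBAS_FIX = 2      # sensor_msgs/NavSatStatus: fix with ground-based augmentation (RTK)
MAX_STD_DEV_M = 0.02     # accept only ~2 cm horizontal standard deviation

def gps_fix_acceptable(status, position_covariance):
    """Return True if this NavSatFix-like sample is good enough to fuse.

    status              -- integer from NavSatStatus.status
    position_covariance -- row-major 3x3 covariance (9 floats), ENU, in m^2
    """
    if status < STATUS_GBAS_FIX:
        return False
    # Diagonal entries 0 and 4 are the East and North variances.
    east_var, north_var = position_covariance[0], position_covariance[4]
    return max(east_var, north_var) <= MAX_STD_DEV_M ** 2

# Example: an RTK-fixed sample with 1 cm sigma passes; a 50 cm float solution does not.
good = [0.01**2, 0, 0, 0, 0.01**2, 0, 0, 0, 0.04**2]
bad  = [0.50**2, 0, 0, 0, 0.50**2, 0, 0, 0, 1.0]
print(gps_fix_acceptable(STATUS_GBAS_FIX, good))  # True
print(gps_fix_acceptable(STATUS_GBAS_FIX, bad))   # False
```

A node would simply subscribe to /gps/fix, run each message through this check, and republish the accepted ones on a filtered topic that the EKF consumes.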

So my main question is: how do I fuse these data and retrieve the absolute position in real time?

So far, I tried the solution my predecessor started looking into:

  • using 2 instances of ekf_localization_node from the robot_localization package (one fusing the IMU with the visual odometry to get the base_link→odom transform; a second fusing the IMU, the visual odometry and the GPS data to get the base_link→map transform, which is output as an odom→map transform because base_link can't have 2 parents)

  • using one instance of navsat_transform_node (from the robot_localization package) to convert the GPS data into an odometry message usable by ekf_localization_node, and to output a (static!) map→utm transform
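For concreteness, here is a minimal sketch of the parameter layout this dual-EKF setup usually takes (robot_localization YAML style; the topic names are the ones from the question, the frame names follow REP-105, and the true/false fusion masks are illustrative guesses, not a tested configuration):

```yaml
# Local EKF: fuses visual odometry + IMU, publishes odom -> base_link
ekf_se_odom:
  frequency: 30
  map_frame: map
  odom_frame: odom
  base_link_frame: base_link
  world_frame: odom                      # local estimate lives in odom
  odom0: /zed2/zed_node/odom
  odom0_config: [false, false, false,    # x, y, z
                 false, false, false,    # roll, pitch, yaw
                 true,  true,  true,     # vx, vy, vz
                 false, false, true,     # vroll, vpitch, vyaw
                 false, false, false]    # ax, ay, az
  imu0: /zed2/zed_node/imu/data
  imu0_config: [false, false, false,
                true,  true,  true,
                false, false, false,
                true,  true,  true,
                true,  false, false]

# Global EKF: same odom0/imu0 entries as above, plus the GPS odometry,
# publishes map -> odom
ekf_se_map:
  frequency: 30
  map_frame: map
  odom_frame: odom
  base_link_frame: base_link
  world_frame: map                       # global estimate lives in map
  odom1: /odometry/gps                   # output of navsat_transform_node
  odom1_config: [true,  true,  false,    # fuse absolute x, y only
                 false, false, false,
                 false, false, false,
                 false, false, false,
                 false, false, false]
```

navsat_transform_node additionally needs the IMU (for heading) and an odometry input, and the covariances published by each sensor matter a great deal for stability, so treat the masks above as a starting point only.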

If I understand correctly, the base_link→utm transform should correspond to the absolute position of the robot. However, when I tested with the robot immobile and decent GPS precision, I got a completely unstable output (drift of >10 m per second).

As I can't attach files yet, I put those I think might be relevant on a Google Drive: https://drive.google.com/drive/folder...

So do you have an idea either:

  • how to get my current setup working

  • or an easier way to get the absolute location of the robot by fusing all the data?

Thank you very much in advance

Felix

PS: I'm using ROS Melodic on a ... (more)


3 Answers


answered 2021-02-12 06:45:21 -0600

Hi Felix. Did you solve it? Did the localization accuracy improve after combining the ZED and GNSS? I am interested in building the same setup that you did. I am new to ROS and your experience would be very helpful for me. Have you made the code publicly available, or are you planning to do so? Thank you in advance :)


Comments

Hi,

the more time I spent using GNSS, the more I realised that in an urban environment it doesn't work well (when enough satellites are visible I got very good results, but far too often I got very bad ones, with >2 m error). So even when fusing other sensors like the ZED2, it wouldn't have been accurate enough to stay on the paths in many situations (for the first testing site we might have managed it, but for the second one, with more trees, it would have been nearly impossible).

In addition, the map we got for the first site was not as precise as expected, so we would have had to redo all the mapping.

So in the end, we replaced the GNSS with AprilTags, which also removes the need for an absolute map.

Good luck

Felix

felixN (2021-02-23 04:25:46 -0600)

answered 2020-09-04 13:40:44 -0600

mirella melo

Hey! Does this board really provide 2 cm precision from GPS? It probably needs a very ideal environment for that. A word of advice (and I hope I'm wrong): don't be too impressed by such figures, and expect much lower precision (as you have already experienced).

About your main question... I can't imagine a trivial answer to such a complex inquiry. If I understood correctly, the ZED2 already runs a SLAM system and outputs the odometry information, right? You said you're working with the ZED2, and according to the Stereolabs website, its SLAM system considers both visual and IMU information, which means the only fusion still missing is with the GPS information. Do you intend to do this fusion in order to achieve a higher precision?

I'm not sure if this is the case, but this paper fused ORB-SLAM2 with GPS, not to get higher precision, but for global mapping and reuse of maps. They explain there how they fuse such information. Maybe it can be useful for you.


Comments

Hi, thanks for your answer. About the GPS: when in "fixed" mode, about 95% of the data is within a 2 cm radius of the average (accuracy might be a little bit worse, but still within 20 cm, which is good enough). I will make an additional node to keep GPS data only when in this mode.

About the ZED2 (not LED2) camera: it can do SLAM, but for now we only use the visual odometry mode (i.e. computing displacement based on the stereo camera). The ZED node can fuse the IMU data directly (not done yet, as I was thinking of fusing it in the Kalman filter, but it can be done).

The goal is NOT to achieve higher accuracy than the GPS in fixed mode (an error of 0.5 m is always OK, and most of the time 1 m is fine). The problem is that in some places, I don't get ...(more)

felixN (2020-09-07 02:46:08 -0600)
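As an aside on "fusing it in the Kalman filter" mentioned above: at its core, a Kalman-style fusion of two estimates of the same quantity is an inverse-variance weighted average. This is a minimal one-dimensional sketch of that idea, not the actual robot_localization internals:

```python
def fuse(z1, var1, z2, var2):
    """Inverse-variance weighted fusion of two scalar estimates.

    Returns the fused estimate and its (smaller) variance; this is the
    1-D special case of the Kalman measurement update.
    """
    w1, w2 = 1.0 / var1, 1.0 / var2
    fused = (w1 * z1 + w2 * z2) / (w1 + w2)
    return fused, 1.0 / (w1 + w2)

# Example: visual odometry says x = 10.0 m (var 0.04 m^2),
# GPS says x = 10.2 m (var 0.01 m^2). The more certain GPS dominates.
x, v = fuse(10.0, 0.04, 10.2, 0.01)
print(x, v)  # 10.16 0.008
```

This also shows why the covariances each sensor publishes matter so much: a sensor that reports an overconfident (too small) variance will dominate the fused estimate even when it is wrong.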

answered 2021-12-01 11:55:27 -0600

martinlucan

Hello @felixN, does your ZED + IMU fusion work properly? I am working on basically the same setup, but the output from robot_localization seems to just follow /zed/zed_node/odom with no correction from the IMU. Thank you for the response.

The ZED2, with its integrated IMU, is supposed to fuse the visual odometry with the IMU internally, isn't it?



Stats

Asked: 2020-09-04 07:39:24 -0600

Seen: 1,786 times

Last updated: Dec 01 '21