GPS/IMU and sensor data synchronization and transformations?

asked 2017-10-30 15:57:33 -0500


Hello,

I have GPS/IMU information read by the ROS driver for a DJI drone (A3 - API), and I am using this information (sensor_msgs/Imu and sensor_msgs/NavSatFix) to transform other sensor data on the robot into the drone's parent frame as it moves.
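For reference, below is a rough sketch of what I am currently trying, so you can see where my head is at. The topic names, the utm/base_link frame names, and the use of the geodesy package to convert latitude/longitude are all just my assumptions, not something taken from the DJI docs, and I am ignoring ENU/NED convention questions for now:

    #!/usr/bin/env python
    # Rough sketch: convert NavSatFix to UTM metres and broadcast a
    # utm -> base_link transform, using the latest IMU orientation.
    # Topic and frame names are guesses; ENU/NED issues are ignored here.
    import rospy
    import tf
    from sensor_msgs.msg import Imu, NavSatFix
    from geodesy import utm  # Python module from the geographic_info stack

    latest_imu = None
    broadcaster = None

    def imu_cb(msg):
        global latest_imu
        latest_imu = msg

    def fix_cb(fix):
        if latest_imu is None:
            return
        # UTMPoint gives easting/northing/altitude in metres
        p = utm.fromLatLong(fix.latitude, fix.longitude, fix.altitude)
        q = latest_imu.orientation
        broadcaster.sendTransform((p.easting, p.northing, p.altitude),
                                  (q.x, q.y, q.z, q.w),
                                  fix.header.stamp,
                                  "base_link",  # child: drone body frame
                                  "utm")        # parent: fixed frame

    if __name__ == "__main__":
        rospy.init_node("gps_imu_tf_broadcaster")
        broadcaster = tf.TransformBroadcaster()
        rospy.Subscriber("/dji_sdk/imu", Imu, imu_cb)
        rospy.Subscriber("/dji_sdk/gps_position", NavSatFix, fix_cb)
        rospy.spin()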

  1. To make sure I am moving in the right direction, my first question is this: should the GPS frame (UTM-converted values) be the parent frame for all sensor information (e.g. a camera), or should it be the drone's body frame?
  2. If I write two publisher nodes that send out the GPS and IMU transforms separately to the same frame, is the data time-synchronized in ROS time by the DJI SDK ROS driver? (If not, I assume I would have to synchronize the two topics myself; see the first sketch after this list.)
  3. How do I use a UTM object as an x,y,z point in this particular context? (The sketch above shows what I am currently assuming.)
  4. Once the transforms are working (say I can visualize the live point clouds moving with the drone in RViz), if I store the XYZI point clouds while the transforms are being broadcast and view them together later in pcl_viewer, will those displacements be written into the clouds so that I can visualize a "map" of the area covered by the drone? (See the second sketch after this list for what I have in mind.)
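For question 2, in case the driver does not already give me matching stamps, this is what I assume I would have to do myself with message_filters (topic names are placeholders again):

    #!/usr/bin/env python
    # Sketch: approximate time synchronization of the GPS and IMU topics,
    # in case the driver does not publish them with matching stamps.
    # Topic names are placeholders.
    import rospy
    import message_filters
    from sensor_msgs.msg import Imu, NavSatFix

    def synced_cb(fix, imu):
        # Both messages arrive here with header stamps within `slop` seconds
        rospy.loginfo("fix %.3f / imu %.3f",
                      fix.header.stamp.to_sec(), imu.header.stamp.to_sec())

    if __name__ == "__main__":
        rospy.init_node("gps_imu_sync_check")
        fix_sub = message_filters.Subscriber("/dji_sdk/gps_position", NavSatFix)
        imu_sub = message_filters.Subscriber("/dji_sdk/imu", Imu)
        sync = message_filters.ApproximateTimeSynchronizer(
            [fix_sub, imu_sub], queue_size=20, slop=0.05)
        sync.registerCallback(synced_cb)
        rospy.spin()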
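And for question 4, my current plan is roughly the sketch below: look up the transform at each cloud's timestamp, transform the cloud into the fixed frame, and republish it. The cloud topic names are invented, and I am not sure the intensity field survives do_transform_cloud, so please treat it only as the idea I have in mind:

    #!/usr/bin/env python
    # Sketch: transform every incoming cloud into the fixed "utm" frame using
    # the transforms broadcast above, then republish it.  Saving the
    # republished clouds should give them the drone's displacement baked in.
    # Topic names are made up; as far as I can tell the Python
    # do_transform_cloud keeps only x/y/z, so intensity may need extra work.
    import rospy
    import tf2_ros
    from sensor_msgs.msg import PointCloud2
    from tf2_sensor_msgs.tf2_sensor_msgs import do_transform_cloud

    tf_buffer = None
    pub = None

    def cloud_cb(cloud):
        try:
            # sensor frame -> utm at the cloud's own timestamp
            transform = tf_buffer.lookup_transform(
                "utm", cloud.header.frame_id, cloud.header.stamp,
                rospy.Duration(0.2))
        except (tf2_ros.LookupException, tf2_ros.ConnectivityException,
                tf2_ros.ExtrapolationException):
            return
        pub.publish(do_transform_cloud(cloud, transform))

    if __name__ == "__main__":
        rospy.init_node("cloud_to_utm")
        tf_buffer = tf2_ros.Buffer()
        listener = tf2_ros.TransformListener(tf_buffer)
        pub = rospy.Publisher("/cloud_in_utm", PointCloud2, queue_size=5)
        rospy.Subscriber("/camera/points", PointCloud2, cloud_cb)
        rospy.spin()

Afterwards I would dump the republished clouds to PCD files with rosrun pcl_ros pointcloud_to_pcd input:=/cloud_in_utm and open them together in pcl_viewer.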

I have read a lot of questions here and the ROS wiki documentation quite extensively, but I wanted to clarify these points directly because I am a little confused, being fairly new to the more advanced parts of ROS. If my thinking is wrong, please do correct me; I really appreciate any help or guidance, thanks :)

  • Sneha