
Kalman Filter implementation for a drone. Help confirming my sensor inputs are correct.

asked 2018-08-13 16:50:00 -0600 by marinePhD

updated 2018-08-27 04:46:21 -0600 by Tom Moore

Hi, I'm learning more about the Kalman filter and I thought I'd write a simple implementation for position tracking based on the details at this link and this link.

However, the filter estimates and the real sensor readings are far off. I'm using a Matrice 100 drone and subscribing to the following topics to populate my state estimate:

/dji_sdk/imu (publishes IMU data)
/dji_sdk/velocity (publishes velocity)
/dji_sdk/local_position (publishes local position in Cartesian coordinates)

My state estimate is a 9 x 1 vector tracking position, velocity, and acceleration:

x << pos_x, pos_y, pos_z, vel_x, vel_y, vel_z, acc_x, acc_y, acc_z;

I populate these variables using the following values, which I get from the callbacks above:

pos_x = local_position.point.x;
pos_y = local_position.point.y;
pos_z = local_position.point.z;
vel_x = current_velocity.vector.x;
vel_y = current_velocity.vector.y;
vel_z = current_velocity.vector.z;
acc_x = current_acceleration.linear_acceleration.x;
acc_y = current_acceleration.linear_acceleration.y;
acc_z = current_acceleration.linear_acceleration.z;
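
For reference, the callbacks themselves just store the latest message of each type; a minimal sketch is below (the message types are inferred from the field names above, i.e. geometry_msgs/PointStamped, geometry_msgs/Vector3Stamped and sensor_msgs/Imu, and are worth confirming with rostopic info):

#include <ros/ros.h>
#include <geometry_msgs/PointStamped.h>
#include <geometry_msgs/Vector3Stamped.h>
#include <sensor_msgs/Imu.h>

// Latest measurements, filled in by the callbacks below
geometry_msgs::PointStamped   local_position;
geometry_msgs::Vector3Stamped current_velocity;
sensor_msgs::Imu              current_acceleration;

void positionCb(const geometry_msgs::PointStamped::ConstPtr& msg) { local_position = *msg; }
void velocityCb(const geometry_msgs::Vector3Stamped::ConstPtr& msg) { current_velocity = *msg; }
void imuCb(const sensor_msgs::Imu::ConstPtr& msg) { current_acceleration = *msg; }

// In main():
//   ros::Subscriber pos_sub = nh.subscribe("/dji_sdk/local_position", 10, positionCb);
//   ros::Subscriber vel_sub = nh.subscribe("/dji_sdk/velocity", 10, velocityCb);
//   ros::Subscriber imu_sub = nh.subscribe("/dji_sdk/imu", 10, imuCb);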

I defined the rest of the filter as follows:

// State transition matrix for the constant-acceleration model
MatrixXd A(9,9);
A << 1, 0, 0, dt, 0, 0, dt_squared, 0, 0,
     0, 1, 0, 0, dt, 0, 0, dt_squared, 0,
     0, 0, 1, 0, 0, dt, 0, 0, dt_squared,
     0, 0, 0, 1, 0, 0, dt, 0, 0,
     0, 0, 0, 0, 1, 0, 0, dt, 0,
     0, 0, 0, 0, 0, 1, 0, 0, dt,
     0, 0, 0, 0, 0, 0, 1, 0, 0,
     0, 0, 0, 0, 0, 0, 0, 1, 0,
     0, 0, 0, 0, 0, 0, 0, 0, 1;
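
(How dt and dt_squared are computed isn't shown here. For the constant-acceleration model the position rows should carry a 0.5*dt^2 term, so assuming that's what dt_squared stands for, it would be something like:)

double dt = 0.01;                   // 100 Hz filter rate, or computed from successive message timestamps
double dt_squared = 0.5 * dt * dt;  // 0.5*dt^2 term in the position rows of A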

// State covariance matrix
MatrixXd P(9,9);
P << 100, 0, 0, 0, 0, 0, 0, 0, 0,
     0, 100, 0, 0, 0, 0, 0, 0, 0,
     0, 0, 100, 0, 0, 0, 0, 0, 0,
     0, 0, 0, 100, 0, 0, 0, 0, 0,
     0, 0, 0, 0, 100, 0, 0, 0, 0,
     0, 0, 0, 0, 0, 100, 0, 0, 0,
     0, 0, 0, 0, 0, 0, 100, 0, 0,
     0, 0, 0, 0, 0, 0, 0, 100, 0,
     0, 0, 0, 0, 0, 0, 0, 0, 100;


// Measurement matrix: selects the position elements of the state
MatrixXd H = MatrixXd::Identity(3,9);

// Measurement noise covariance matrix
MatrixXd R(3,3);
R << 0.1, 0, 0,
     0, 0.1, 0,
     0, 0, 0.1;

and I also defined an appropriate Q matrix based on the constant-acceleration (CA) model as defined in the second link.
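
For reference, one common form of Q for a piecewise-constant-acceleration model looks roughly like the sketch below; sigma_a is just a placeholder tuning value, and the exact formulation in the second link may differ slightly:

// Piecewise-constant-acceleration process noise, one 3x3 block per axis
double sigma_a = 0.5;                        // acceleration noise std dev (placeholder, needs tuning)
double q   = sigma_a * sigma_a;
double dt2 = dt * dt, dt3 = dt2 * dt, dt4 = dt3 * dt;

MatrixXd Q = MatrixXd::Zero(9, 9);
for (int i = 0; i < 3; ++i)                  // i = x, y, z axis
{
    Q(i,   i)   = q * dt4 / 4.0;                 // pos-pos
    Q(i,   i+3) = Q(i+3, i)   = q * dt3 / 2.0;   // pos-vel
    Q(i,   i+6) = Q(i+6, i)   = q * dt2 / 2.0;   // pos-acc
    Q(i+3, i+3) = q * dt2;                       // vel-vel
    Q(i+3, i+6) = Q(i+6, i+3) = q * dt;          // vel-acc
    Q(i+6, i+6) = q;                             // acc-acc
}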

I then initialise the filter with the matrices above.

One of the things I'm not sure of is the value of my measurement vector z. I've defined it as

z << pos_x , pos_y , pos_z;

which is also technically the first three elements of my state vector. Is this correct?

I then run the filter at 100 Hz, which is the rate at which the drone publishes the messages, but my results are shown at this link.
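
For context, here is a minimal sketch of a standard linear KF predict/update cycle at 100 Hz (not my exact code, but the structure I'm aiming for: only z is rebuilt from the latest position message each cycle, while x and P are carried over from the previous iteration):

ros::Rate rate(100);
while (ros::ok())
{
    ros::spinOnce();                               // let the callbacks update the latest messages

    // Predict
    x = A * x;
    P = A * P * A.transpose() + Q;

    // Build the measurement from the position sensor only
    VectorXd z(3);
    z << local_position.point.x, local_position.point.y, local_position.point.z;

    // Update
    VectorXd y = z - H * x;                        // innovation
    MatrixXd S = H * P * H.transpose() + R;        // innovation covariance
    MatrixXd K = P * H.transpose() * S.inverse();  // Kalman gain
    x = x + K * y;
    P = (MatrixXd::Identity(9, 9) - K * H) * P;

    rate.sleep();
}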

Is anyone able to share some pointers on this? Also, here's a link to my current source code if anyone is willing to take a look and see what I'm doing wrong.


Comments

Are you not tracking the orientation of the drone? Without that information your 3D values are meaningless.

PeteBlackerThe3rd (2018-08-14 06:33:58 -0600)

@PeteBlackerThe3rd, currently not. Part of the current work is to determine whether a discrete Kalman filter is sufficient to track position in 3D if your sensor gives you 3D positions, and to compare it against a package like robot_localization, for example. Take this paper

marinePhD (2018-08-14 06:41:40 -0600)

@PeteBlackerThe3rd as an example. Their work is similar to the stuff I'm trying to validate, and they don't seem to be tracking the orientation of the drone.

marinePhD (2018-08-14 06:42:34 -0600)

@PeteBlackerThe3rd Thanks for your comment. I used an identity matrix scaled by 0.05, played with the standard deviation of my Q for position, and the filter now performs close to what I expected.

marinePhD (2018-08-14 08:15:42 -0600)

1 Answer


answered 2018-08-14 07:55:25 -0600

In order for their experiment to work, the three sets of XYZ values need to be in the same frame. In your case the IMU may be publishing linear acceleration data in a different coordinate frame from the local_position data, which will make life impossible for the Kalman filter.
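
If that turns out to be the issue, one way to handle it is to rotate the IMU's body-frame acceleration into the local frame using the orientation quaternion carried by sensor_msgs/Imu before feeding it to the filter. A rough sketch follows; it assumes the DJI IMU reports its orientation relative to the same local frame that local_position uses (worth double-checking), and gravity may still need to be removed:

#include <tf2_geometry_msgs/tf2_geometry_msgs.h>
#include <tf2/LinearMath/Quaternion.h>
#include <tf2/LinearMath/Vector3.h>

// Rotate the body-frame linear acceleration into the local frame
tf2::Quaternion q;
tf2::fromMsg(current_acceleration.orientation, q);

tf2::Vector3 acc_body(current_acceleration.linear_acceleration.x,
                      current_acceleration.linear_acceleration.y,
                      current_acceleration.linear_acceleration.z);

tf2::Vector3 acc_local = tf2::quatRotate(q, acc_body);

acc_x = acc_local.x();
acc_y = acc_local.y();
acc_z = acc_local.z();   // gravity compensation may still be required here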

