
Normal Estimation for Lidar

asked 2022-10-11 11:57:41 -0600 by labude

updated 2022-10-12 04:18:40 -0600

Hello,

I am trying to estimate normals for a Lidar scan consisting of ~2000 points structured in 8 rings. My input is a sensor_msgs::PointCloud2 which I convert to a pcl::PointCloud<pcl::PointNormal>. The estimation is performed using PCL's NormalEstimation:

pcl::NormalEstimation<pcl::PointNormal, pcl::Normal> ne;
pcl::search::KdTree<pcl::PointNormal>::Ptr tree = boost::make_shared<pcl::search::KdTree<pcl::PointNormal>>();
ne.setSearchMethod(tree);    // neighbor lookup via a k-d tree
ne.setRadiusSearch(0.3);     // 30 cm search radius
ne.setInputCloud(cloud);
ne.compute(*normals);
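
For completeness, the conversion from the ROS message beforehand looks roughly like this (a minimal sketch assuming ROS 1 with pcl_conversions; the callback name is just illustrative):

#include <pcl_conversions/pcl_conversions.h>  // pcl::fromROSMsg
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <sensor_msgs/PointCloud2.h>

// Copy the incoming ROS message into a PCL cloud; the normal fields of
// pcl::PointNormal are left unset here and filled in by the estimation step.
void cloudCallback(const sensor_msgs::PointCloud2::ConstPtr& msg)
{
  pcl::PointCloud<pcl::PointNormal>::Ptr cloud(new pcl::PointCloud<pcl::PointNormal>);
  pcl::fromROSMsg(*msg, *cloud);
  // ... normal estimation on 'cloud' as above
}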

The estimation leads to very poor results, with the normals pointing in the wrong directions: they should be orthogonal to the walls, not parallel to them (see picture).

I have played around with ne.setRadiusSearch() and ne.setKSearch() without getting satisfying results.

Does anyone have a suggestion why this fails? It works quite well for other point clouds, e.g. obtained from depth cameras.

Two more ideas:

  • pcl::IntegralImageNormalEstimation seems to be a good alternative, but it needs an organized point cloud as input. My point cloud is unorganized.

  • I can also get the scans as a sensor_msgs/LaserScan message which contains the distance and angle for each point. Maybe I can use this information for the normal estimation? I could not find any solutions that work with this message...
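
What I have in mind for the second idea is something like this rough, untested sketch: per-point 2D normals computed directly from a LaserScan by taking the in-plane perpendicular of the direction between a point's two neighbours (the function name and the missing range-validity checks are mine):

#include <cmath>
#include <vector>
#include <sensor_msgs/LaserScan.h>

// For every scan point, approximate the local tangent from its two neighbours
// and take the in-plane perpendicular as a 2D normal (nx, ny).
void estimateScanNormals(const sensor_msgs::LaserScan& scan,
                         std::vector<float>& nx, std::vector<float>& ny)
{
  const std::size_t n = scan.ranges.size();
  std::vector<float> x(n), y(n);
  for (std::size_t i = 0; i < n; ++i)
  {
    const float angle = scan.angle_min + i * scan.angle_increment;
    x[i] = scan.ranges[i] * std::cos(angle);
    y[i] = scan.ranges[i] * std::sin(angle);
  }

  nx.assign(n, 0.0f);
  ny.assign(n, 0.0f);
  for (std::size_t i = 1; i + 1 < n; ++i)
  {
    const float tx = x[i + 1] - x[i - 1];  // tangent ~ next minus previous point
    const float ty = y[i + 1] - y[i - 1];
    const float len = std::sqrt(tx * tx + ty * ty);
    if (len > 1e-6f)
    {
      nx[i] = -ty / len;  // rotate the tangent by 90 degrees
      ny[i] =  tx / len;
    }
  }
}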

Thanks for your ideas!

EDIT: Using setKSearch(10) improves the result a lot (see picture). However, the closer points that lie on the walls still have wrong orientations (marked red in the picture). In this area the distance between the scan rings is smaller, but I don't see how this can affect the results - k-nearest-neighbor search should not depend on the distance to the neighbors, should it?
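
Concretely, the only change compared to the code above is the following (as far as I know, PCL wants only one of the two search criteria set):

ne.setKSearch(10);           // use the 10 nearest neighbours ...
// ne.setRadiusSearch(0.3);  // ... instead of a fixed search radius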


1 Answer


answered 2022-10-12 03:37:08 -0600 by goksankobe

Hi, could you try to lower the search radius from 30 cm to 3 cm and see how that affects the orientation of the normals? I'm guessing that a large search radius causes the ground normals to propagate towards the walls.
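
In the code from your question that would be something like:

ne.setRadiusSearch(0.03);  // 3 cm instead of 0.3 m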


Comments

Hi, 3 cm was too small a radius - the distance between the points is almost always >5 cm, so with a smaller radius there won't be any normals estimated at all.

I've had more success using KSearch(), see the updated question.

labude ( 2022-10-12 04:20:28 -0600 )

I think the area where wall points come closer together happens to be the area where they're also closer to the ground points, which disrupts the neighbor search.

As an alternative, you could run a RANSAC plane detection first, and then limit the normal computation to those points within the planes.
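
Roughly what I have in mind is this untested sketch using pcl::SACSegmentation (the distance threshold is a placeholder, and 'cloud', 'ne' and 'normals' are the objects from your question):

#include <pcl/ModelCoefficients.h>
#include <pcl/PointIndices.h>
#include <pcl/sample_consensus/method_types.h>
#include <pcl/sample_consensus/model_types.h>
#include <pcl/segmentation/sac_segmentation.h>

// Fit a single dominant plane with RANSAC ...
pcl::SACSegmentation<pcl::PointNormal> seg;
seg.setOptimizeCoefficients(true);
seg.setModelType(pcl::SACMODEL_PLANE);
seg.setMethodType(pcl::SAC_RANSAC);
seg.setDistanceThreshold(0.05);  // placeholder, tune for your scan
seg.setInputCloud(cloud);

pcl::PointIndices::Ptr inliers(new pcl::PointIndices);
pcl::ModelCoefficients coefficients;
seg.segment(*inliers, coefficients);

// ... and restrict the normal estimation to the plane inliers.
ne.setIndices(inliers);
ne.compute(*normals);

For several planes (ground plus walls) you would repeat the segmentation on the remaining points, e.g. with pcl::ExtractIndices.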

goksankobe ( 2022-10-12 04:30:39 -0600 )

That might be worth a try, thanks. If I find the time I'll also debug the KSearch step by step to see which points are chosen (at least the upper wall points have their closest neighbors all in the wall area as well, so I don't think that ground points play a role for them).

labude ( 2022-10-12 04:35:11 -0600 )
