Which PCL filter to use to downsample a point cloud

I am getting a point cloud from a lidar on an autonomous driving robot, but it's too much data to process.

I already implemented a passthrough filter.

It gave a very good result, and I was wondering whether there are other filters or methods I could dig into.

I'm not looking for anything specific, just a direction or some advice, because I'm new to the PCL library and it seems huge.

Here is my filter now:

// At the top of the file:
#include <pcl/filters/voxel_grid.h>
#include <pcl/filters/passthrough.h>

// PointXYZIR is a custom point type (not part of stock PCL)
pcl::PointCloud<PointXYZIR>::Ptr cloudInput;
cloudInput.reset(new pcl::PointCloud<PointXYZIR>(cloud_in));

pcl::PointCloud<PointXYZIR>::Ptr cloudFiltered;
cloudFiltered.reset(new pcl::PointCloud<PointXYZIR>);

// Create the filtering object: downsample the dataset using a leaf size
pcl::VoxelGrid<PointXYZIR> avg;
avg.setInputCloud(cloudInput);
avg.setLeafSize(0.25f, 0.25f, 0.25f);
avg.filter(*cloudFiltered);

// Pass-through filter: crop the cloud to +/-100 m in x and y
// (PCL's filter() handles the case where output is the same cloud as the input)
pcl::PassThrough<PointXYZIR> filter;
filter.setInputCloud(cloudFiltered);

filter.setFilterFieldName("x");
filter.setFilterLimits(-100, 100);
filter.filter(*cloudFiltered);

filter.setFilterFieldName("y");
filter.setFilterLimits(-100, 100);
filter.filter(*cloudFiltered);

cloud_out = *cloudFiltered;
Noyade answered 13/4/2018 at 14:25

I did find a solution, but there is no general one: the right choice is very specific to the point cloud you are getting and what you want to do with it.

The passthrough filter is a very efficient way to downsample without taking much risk of losing interesting data.

http://pointclouds.org/documentation/tutorials/passthrough.php

Then I tested StatisticalOutlierRemoval; it is efficient, but it was not relevant in my case.

http://pointclouds.org/documentation/tutorials/statistical_outlier.php
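A minimal sketch of that kind of setup, reusing the cloudInput/cloudFiltered pointers from the question (the meanK and stddev values here are illustrative, not tuned for any particular lidar):

#include <pcl/filters/statistical_outlier_removal.h>

// Remove points whose mean distance to their 50 nearest neighbours is more
// than 1.0 standard deviations above the cloud-wide mean of those distances.
pcl::StatisticalOutlierRemoval<PointXYZIR> sor;
sor.setInputCloud(cloudInput);
sor.setMeanK(50);                // illustrative neighbourhood size
sor.setStddevMulThresh(1.0);     // illustrative threshold multiplier
sor.filter(*cloudFiltered);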

Now I downsample the point cloud with a voxel grid (leaf size), then I create a k-d tree to filter the points by radius. It takes about the same amount of computation as the passthrough filter did, but it makes more sense in my project to do it this way.

// At the top of the file:
#include <pcl/filters/voxel_grid.h>
#include <pcl/kdtree/kdtree_flann.h>

// Create the filtering object: downsample the dataset using a leaf size
pcl::VoxelGrid<PointXYZIR> avg;
avg.setInputCloud(cloudInput);
avg.setLeafSize(0.25f, 0.25f, 0.25f);
avg.filter(*cloudFiltered);

// Search point (here simply the first point of the downsampled cloud)
PointXYZIR searchPoint = cloudFiltered->at(0);

// Results from radiusSearch()
std::vector<int> pointIdxRadiusSearch;
std::vector<float> pointRadiusSquaredDistance;

// k-d tree over the downsampled cloud
pcl::KdTreeFLANN<PointXYZIR> kdtree;
kdtree.setInputCloud(cloudFiltered);
kdtree.setSortedResults(true);

if (kdtree.radiusSearch(searchPoint, 100, pointIdxRadiusSearch, pointRadiusSquaredDistance) > 0)
{
    // Keep every point within a 100 m radius of the search point.
    // Note: nothing is actually erased here; points outside the radius are
    // dropped simply because they are never copied into cloud_out.
    for (size_t j = 0; j < pointIdxRadiusSearch.size(); ++j)
    {
        cloud_out.push_back(cloudFiltered->points[pointIdxRadiusSearch[j]]);
    }
}
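To actually erase the matched points from the cloud (instead of copying the kept ones into cloud_out), a sketch using pcl::ExtractIndices with setNegative(true) would look like this; it assumes pointIdxRadiusSearch was filled by the radiusSearch() call above:

#include <pcl/filters/extract_indices.h>

// Drop the radius-search hits from cloudFiltered rather than keeping them.
pcl::PointIndices::Ptr toRemove(new pcl::PointIndices);
toRemove->indices = pointIdxRadiusSearch;

pcl::ExtractIndices<PointXYZIR> extract;
extract.setInputCloud(cloudFiltered);
extract.setIndices(toRemove);
extract.setNegative(true);   // true = remove the indexed points, keep the rest
extract.filter(cloud_out);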
Noyade answered 17/4/2018 at 14:56

A voxel grid for downsampling maintains a pretty good point distribution while reducing the number of points. You can set how small the voxels are along each axis to keep as much or as little resolution as you want. Each occupied voxel replaces all the points that fall inside it with a single point: their centroid. http://pointclouds.org/documentation/tutorials/voxel_grid.php#voxelgrid
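As a minimal, self-contained sketch (the 0.1 m leaf size is illustrative, and cloud is assumed to be an already-loaded pcl::PointCloud<pcl::PointXYZ>::Ptr):

#include <iostream>
#include <pcl/point_types.h>
#include <pcl/filters/voxel_grid.h>

pcl::PointCloud<pcl::PointXYZ>::Ptr downsampled(new pcl::PointCloud<pcl::PointXYZ>);

pcl::VoxelGrid<pcl::PointXYZ> grid;
grid.setInputCloud(cloud);
grid.setLeafSize(0.1f, 0.1f, 0.1f);   // voxel edge length per axis, in metres
grid.filter(*downsampled);            // each occupied voxel -> one centroid point

std::cout << "before: " << cloud->size() << "  after: " << downsampled->size() << std::endl;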

Delft answered 13/4/2018 at 16:57
