Any good C or C++ libraries out there for dealing with large point clouds? [closed]
Basically, I'm looking for a library or SDK for handling large point clouds coming from LIDAR or scanners, typically running into many millions of points of X,Y,Z,Colour. What I'm after is as follows:

  • Fast display, zooming, panning
  • Point cloud registration
  • Fast low-level access to the data
  • Regression of surfaces and solids (not as important as the others)

While I don't mind paying for a reasonable commercial library, I'm not interested in a very expensive one (e.g. in excess of about $5k) or one with a per-user run-time license cost. Open source would also be good. I found a few possibilities via Google, but they all tend to be too expensive for my budget.

Sabbath answered 18/12, 2009 at 14:38 Comment(0)
I second the call for R, which I interface with C++ all the time (using e.g. the Rcpp and RInside packages).

R prefers all data in memory, so you probably want a 64-bit OS and a decent amount of RAM for lots of data. The CRAN Task View on High-Performance Computing with R has some pointers on dealing with large data.

Lastly, for quick visualization, the hexbin package is excellent for visually summarizing large data sets. For the zooming etc. aspect, try the rgl package.
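As a sketch of what the Rcpp bridge looks like in practice (the function name and file name are illustrative; compiling this requires the Rcpp package, e.g. via `Rcpp::sourceCpp("zrange.cpp")` from an R session):

```cpp
#include <Rcpp.h>

// Exported to R by Rcpp's attributes mechanism; from R, after sourceCpp,
// call z_range(pts$z) on the z column of your point data.
// [[Rcpp::export]]
Rcpp::NumericVector z_range(Rcpp::NumericVector z) {
    // Rcpp "sugar" min/max scan the vector in C++ without copying it out of R.
    return Rcpp::NumericVector::create(Rcpp::min(z), Rcpp::max(z));
}
```

The point is that the heavy per-point loops run in compiled C++ while R handles the plotting and statistics.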

Algesia answered 18/12, 2009 at 14:54 Comment(0)
Check out the Point Cloud Library (PCL). It is quite a complete toolkit for processing and manipulating point clouds. It also provides tools for point cloud visualisation: pcl::visualization::CloudViewer, which makes use of the VTK library and wxWidgets.

Since 2011, a point cloud translation (read/write) and manipulation toolkit has also been developed: PDAL, the Point Data Abstraction Library.
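A minimal sketch of loading and displaying a cloud with CloudViewer looks like this (the file name `scan.pcd` is a placeholder; building requires PCL and a display, so treat this as an outline rather than a drop-in program):

```cpp
#include <pcl/io/pcd_io.h>
#include <pcl/point_types.h>
#include <pcl/visualization/cloud_viewer.h>

int main() {
    // PointXYZRGB matches the X,Y,Z,Colour data the question describes.
    pcl::PointCloud<pcl::PointXYZRGB>::Ptr cloud(
        new pcl::PointCloud<pcl::PointXYZRGB>);
    if (pcl::io::loadPCDFile("scan.pcd", *cloud) < 0)
        return 1;  // file missing or unreadable

    // CloudViewer gives basic display/zoom/pan with almost no code.
    pcl::visualization::CloudViewer viewer("cloud");
    viewer.showCloud(cloud);
    while (!viewer.wasStopped()) {}
    return 0;
}
```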

Sphygmomanometer answered 24/10, 2011 at 10:8 Comment(1)
I would add that the CloudViewer is somewhat 'simple'. If you need to do anything fancy, you will most likely have to go directly to the PCLVisualizer.Monometallic
Why don't you have a look at the R programming language, which can link directly to C code, thereby forming a bridge. R was developed with statistical computing in mind but can very easily help you not only handle large datasets but also visualize them. Quite a number of atmospheric scientists use R in their work; I know because I work with them on exactly the kind of thing you're trying to do. Think of R as a poor man's Matlab or IDL (though it soon won't be).

Pyorrhea answered 18/12, 2009 at 14:44 Comment(0)
In the spirit of the R answers, ROOT also provides a good underlying framework for this kind of thing.

Possibly useful features:

  • C++ code base, with the CINT C++ interpreter as the working shell. Python bindings.
  • Can display three-dimensional point clouds
  • A set of geometry classes (though I don't believe that they support all the operations that you need)
  • Developed by nuclear and particle physicists instead of by statisticians :p
Fascine answered 19/12, 2009 at 2:11 Comment(0)
Vortex by Pointools scales to far more points than the millions you ask for:

http://www.pointools.com/vortex_intro.php

It can handle files of many gigabytes containing billions of points on modest hardware.

Pragmatic answered 11/4, 2012 at 15:20 Comment(2)
Vortex is certainly a good engine, but the licensing seems pretty expensive and involved from what I can see.Sabbath
I don't know how things are now that they have been acquired by Bentley, but it could be worth emailing them to see what they can do for you.Pragmatic
