Eigen and SVD to find Best Fitting Plane given a Set of Points

Given a set of N points in a 3D space, I am trying to find the best fitting plane using SVD and Eigen.

My algorithm is:

  1. Center the data points around (0, 0, 0) by subtracting the centroid.
  2. Form a 3×N matrix from the centered coordinates.
  3. Compute the SVD of that matrix.
  4. Take the left singular vector corresponding to the smallest singular value as the normal of the plane.
  5. Set the distance from the origin to the plane as normal ∙ centroid.

I can't figure out how to use Eigen's SVD module to extract the left singular vector corresponding to the smallest singular value of the point coordinate matrix.

So far I have this code (steps 1, 2 and 5 of the algorithm):

Eigen::Vector3f mean = points.rowwise().mean();
const Eigen::Matrix3Xf points_centered = points.colwise() - mean;

int setting = Eigen::ComputeThinU | Eigen::ComputeThinV;
Eigen::JacobiSVD<Eigen::Matrix3Xf> svd = points_centered.jacobiSvd(setting);

Eigen::Vector3f normal = ???;  // <-- this is what I don't know how to get from svd

float d = normal.dot(mean);
Maxillary answered 7/9, 2016 at 12:47

Comments:
- What code have you tried so far? — Palatalized
- Added code to the question. — Maxillary
- See slides 97–99 of this presentation. — Affine
- @Affine I would like to use SVD; I read that it should be more precise in some cases. — Maxillary

Denoting U = svd.matrixU(), the vectors U.col(0) and U.col(1) define a basis of your plane, and U.col(2) is the normal of your plane. (Eigen sorts the singular values in decreasing order, so the last column of U corresponds to the smallest singular value.)

U.col(0) also defines the direction with the greatest standard deviation.

Sarcophagus answered 7/9, 2016 at 15:55

Comments:
- Could you please explain a bit more about why the flag ComputeFullU is used instead of ComputeThinU? According to eigen.tuxfamily.org/dox/classEigen_1_1JacobiSVD.html, ComputeThinU will calculate the first three singular vectors since we have a 3xN matrix, and the third one is what we need. Thus I'm a little confused here. — Kultur
- This is a rather old answer. Now that I re-read this, I indeed do not see why ComputeFullU is necessary. — Sarcophagus

Your problem is basically how to do a least-squares fit using Eigen's JacobiSVD module. Here's a link with a more helpful example. The basic idea of least-squares fitting is that you first take the differences between each of the other N−1 points and one chosen point, and then approximate all N−1 difference vectors as linear combinations of two basis vectors, which span the 2D plane.

Model answered 7/9, 2016 at 13:04

Comments:
- This link is about ordinary least-squares fitting, whereas the question is about total least squares. — Affine
- I realize my algorithm is not optimal. It makes more sense to put the centroid on the best-fit plane, rather than one of the N points. I just derived mathematically, using Lagrange multipliers, that the algorithm given in the question already minimizes the sum of squares of the N point-to-plane distances. This is a coordinate-independent conclusion, so after the derivation there is no more least-squares problem to solve algorithmically. The correct code to implement the algorithm is given in billx's answer. — Model
- My example does ordinary least-squares fitting, in which an over-determined linear equation Ax=b is solved using the least-squares formula x=(A'*A)^(-1)*A'*b. The implementation is more clever: first an SVD is done, A=USV', and then the solution is x=V*S^(-1)*U'*b. This avoids inverting A'*A, which has a squared condition number compared with A itself. — Model
- This question is more interesting than I previously thought. A least-squares-looking problem eventually doesn't need any least squares at all. Let me give it a one-up! — Model
