OK, as promised, here is a simple solution:
Notation: Let d_{i,j} = d_{j,i} denote the squared distance between points i and j. Let N be the number of points. Let p_i denote point i and p_{i,k} the k-th coordinate of that point.
Let's hope I derive the algorithm correctly this time. There will be some explanation afterwards so you can check the derivation (I hate it when many indices appear).
The algorithm uses dynamic programming to arrive at the correct solution.
Initialization:
p_{i,k} := 0 for all i and k
Calculation:
for i = 2 to N do
    sum = 0
    for j = 2 to i - 1 do
        # triangular linear equation for coordinate p_{i,j-1}
        accu = d_{i,1} - d_{i,j}
        for k = 1 to j-1 do
            # p_{i,j-1} is still 0 here, so the k = j-1 cross term vanishes
            accu = accu + p_{j,k}^2 - 2 p_{i,k}*p_{j,k}
        done
        p_{i,j-1} = accu / ( 2 p_{j,j-1} )
        sum = sum + p_{i,j-1}^2
    done
    # last free coordinate, from d_{i,1} = sum of squared coordinates of p_i
    p_{i,i-1} = sqrt( d_{i,1} - sum )
done
If I have not made any grave index mistakes (I usually do), this should do the trick.
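To make this concrete, here is a minimal Python sketch of the pseudocode above (0-based indices; the function name reconstruct and the list-of-lists layout are my own choices for illustration, and it assumes d is a valid matrix of squared Euclidean distances with the points in general position, so that no pivot p[j][j-1] is zero):

import math

def reconstruct(d):
    """Recover point coordinates from an N x N matrix of squared distances.

    Point 0 sits at the origin and point i gets at most i nonzero
    coordinates, so the points live in at most N-1 dimensions.
    Assumes general position: every pivot p[j][j-1] must be nonzero.
    """
    n = len(d)
    p = [[0.0] * (n - 1) for _ in range(n)]  # p[i][k] = k-th coordinate of point i
    for i in range(1, n):
        s = 0.0
        for j in range(1, i):
            # Triangular linear equation for coordinate p[i][j-1];
            # p[i][j-1] is still 0.0 here, so that cross term vanishes.
            accu = d[i][0] - d[i][j]
            for k in range(j):
                accu += p[j][k] ** 2 - 2.0 * p[i][k] * p[j][k]
            p[i][j - 1] = accu / (2.0 * p[j][j - 1])
            s += p[i][j - 1] ** 2
        # Last free coordinate, from d_{i,1} = sum of squared coordinates.
        p[i][i - 1] = math.sqrt(d[i][0] - s)
    return p

# Quick check: squared distances of a 3-4-5 right triangle.
pts = [(0.0, 0.0), (3.0, 0.0), (3.0, 4.0)]
d = [[(a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2 for b in pts] for a in pts]
print(reconstruct(d))  # [[0.0, 0.0], [3.0, 0.0], [3.0, 4.0]]

The quick check at the end rebuilds a 3-4-5 right triangle; in general the result agrees with the input points only up to rotation and reflection, which distances alone can never pin down.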
The idea behind this:
We arbitrarily set the first point at the origin to make our life easier. Note that for a point p_i we never set a coordinate k when k > i-1, i.e. for the second point we only set the first coordinate, for the third point we only set the first and second coordinates, etc.
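In other words, the coordinate vectors form a triangular pattern:

p_1 = (0,       0,       0,       ...)
p_2 = (p_{2,1}, 0,       0,       ...)
p_3 = (p_{3,1}, p_{3,2}, 0,       ...)
p_4 = (p_{4,1}, p_{4,2}, p_{4,3}, ...)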
Now let's assume we have coordinates for all points p_j with j < i and we want to calculate the coordinates for p_i so that all d_{i,j} with j < i are satisfied (we do not care yet about constraints involving points that have not been placed). If we write down the set of equations we have
d_{i,j} = \sum_{k=1}^{N-1} (p_{i,k} - p_{j,k})^2
Because both p_{i,k} and p_{j,k} are equal to zero for k > i-1, we can reduce this to:
d_{i,j} = \sum_{k=1}^{i-1} (p_{i,k} - p_{j,k})^2
Also note that, by the loop invariant, all p_{j,k} will be zero when k > j-1. So we can split this equation:
d_{i,j} = \sum_{k=1}^{j-1} (p_{i,k} - p_{j,k})^2 + \sum_{k=j}^{i-1} p_{i,k}^2
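For example, with i = 3 and j = 2 this is just
d_{3,2} = (p_{3,1} - p_{2,1})^2 + p_{3,2}^2
which shows how the tail of the sum collapses to squares of p_i's own coordinates.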
For the first point (j = 1), which sits at the origin, all p_{1,k} vanish and we just get:
d_{i,1} = \sum_{k=1}^{i-1} p_{i,k}^2
This one will need some special treatment later.
Now if we expand all the binomials in the general equation we get:
d_{i,j} = \sum_{k=1}^{j-1} (p_{i,k}^2 - 2 p_{i,k} p_{j,k} + p_{j,k}^2) + \sum_{k=j}^{i-1} p_{i,k}^2
Subtract the first equation from this and you are left with:
d_{i,j} - d_{i,1} = \sum_{k=1}^{j-1} (p_{j,k}^2 - 2 p_{i,k} p_{j,k})
for all j > 1.
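To see the structure, take j = 2, where only p_{2,1} is nonzero:
d_{i,2} - d_{i,1} = p_{2,1}^2 - 2 p_{i,1} p_{2,1}
This contains the single unknown p_{i,1}; the equation for j = 3 then adds only the new unknown p_{i,2}, and so on.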
If you look at these equations you'll note that all squares of the coordinates of p_i are gone and the only squares that remain are already known. So this is a set of linear equations in the unknowns p_{i,k} that can easily be solved with standard linear algebra. Even better, the system is already in triangular form: the equation for j = 2 involves only p_{i,1}, the one for j = 3 adds p_{i,2}, and so on, so a single pass of forward substitution solves it. After that, the special equation for d_{i,1} leaves a single quadratic equation for the last coordinate p_{i,i-1}, which we solve by taking one square root.
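For concreteness, the full system for i = 4, rearranged to put the unknowns p_{4,1} and p_{4,2} on the left, reads:

2 p_{2,1} p_{4,1} = d_{4,1} - d_{4,2} + p_{2,1}^2
2 p_{3,1} p_{4,1} + 2 p_{3,2} p_{4,2} = d_{4,1} - d_{4,3} + p_{3,1}^2 + p_{3,2}^2

Solving top to bottom gives p_{4,1} and then p_{4,2}; the final coordinate is p_{4,3} = sqrt( d_{4,1} - p_{4,1}^2 - p_{4,2}^2 ).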
I hope you can follow my reasoning. It's a bit late and my head is spinning from all those indices.
EDIT: Yes, there were indexing mistakes. Fixed. I'll test the Python version above more thoroughly when I have time.