I tried to write MATLAB code that decomposes a matrix into its SVD form.
"Theory":
To get U, I found the eigenvectors of AA', and to get V, I found the eigenvectors of A'A. Finally, Sigma is a matrix with the same dimensions as A, with the square roots of the (shared, nonzero) eigenvalues on its diagonal, sorted in descending order.
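As a sanity check on this "theory" (using the same example matrix as below; the variable names `U0`, `S0`, `V0` are just for this check), MATLAB's built-in `svd` does confirm that the squared singular values equal the eigenvalues of AA':

```matlab
A = [2 4 1 3; 0 0 2 1];
[U0, S0, V0] = svd(A);        % reference decomposition
norm(A - U0*S0*V0')           % ~0: the built-in factors reconstruct A
diag(S0).^2                   % squared singular values...
sort(eig(A*A'), 'descend')    % ...match the eigenvalues of A*A'
```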
However, it doesn't seem to work properly.
A = [2 4 1 3; 0 0 2 1];

% Get U from the eigenvectors of A*A', sorted by descending eigenvalue
[aatVecs, aatVals] = eig(A*A');
[~, aatPermutation] = sort(sum(aatVals), 'descend');  % sum() collapses the diagonal eigenvalue matrix to a row vector
U = aatVecs(:, aatPermutation);

% Get V from the eigenvectors of A'*A, sorted the same way
[ataVecs, ataVals] = eig(A'*A);
[~, ataPermutation] = sort(sum(ataVals), 'descend');
V = ataVecs(:, ataPermutation);

% Get Sigma: square roots of the sorted eigenvalues on the diagonal
singularValues = sum(aatVals(:, aatPermutation)).^0.5;
sigma = zeros(size(A));
for i = 1:nnz(singularValues)
    sigma(i, i) = singularValues(i);
end

A
U*sigma*V'
However, U*sigma*V' is returned with a factor of -1:
ans =

   -2.0000   -4.0000   -1.0000   -3.0000
    0.0000    0.0000   -2.0000   -1.0000
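One more data point (the helper variables `U2`, `S2`, `V2` are just for this diagnostic): each of my factors appears to match the corresponding built-in one, but only up to the sign of individual columns. I restrict V to its first two columns, since the remaining null-space columns need not match at all:

```matlab
[U2, S2, V2] = svd(A);
abs(U) - abs(U2)                   % ~0: U matches up to column signs
abs(V(:, 1:2)) - abs(V2(:, 1:2))   % same for the first two columns of V
```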
What's the mistake in the code or "theory" that led to it?