Sometimes the eigenvectors calculated in Python and Mathematica have opposite signs. Does anyone know why?

I am trying to calculate the eigenvectors of many 3x3 matrices using Python. My Python code is based on Mathematica code that uses the Eigenvectors[] function. I have tried the eig() function from both NumPy and SciPy, and most of the time the eigenvectors calculated in Mathematica and Python are identical. However, there are a few instances in which the eigenvectors calculated in Python are opposite in sign to those calculated in Mathematica.

I have already tried using both NumPy and SciPy to calculate the eigenvectors; both show the same problem.

Python Code

    from numpy.linalg import eig  # scipy.linalg.eig returns the same layout
    _, evec = eig(COVMATRIX)

Mathematica Code

    evec = Eigenvectors[COVMATRIX]

Note that Mathematica returns the eigenvectors as the rows of the result, whereas Python (NumPy/SciPy) returns them as the columns of the matrix.
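
For reference, a minimal sketch of how the two layouts relate (assuming NumPy's np.linalg.eig; A here is just an arbitrary symmetric example, not one of my covariance matrices):

    import numpy as np

    # Any symmetric matrix will do for illustration.
    A = np.array([[2.0, 1.0, 0.0],
                  [1.0, 3.0, 1.0],
                  [0.0, 1.0, 2.0]])

    evals, evec = np.linalg.eig(A)

    # NumPy stores the eigenvector for evals[i] in column evec[:, i].
    # Transposing gives one eigenvector per row, which is the layout
    # Mathematica's Eigenvectors[] returns (the ordering of the
    # eigenvalues may also differ between the two systems).
    evec_rows = evec.T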

Given

    COVMATRIX = [[2.9296875e-07, 0.0, 2.09676562e-10], 
                 [0.0, 2.9296875e-07, 1.5842226562e-09], 
                 [2.09676562e-10, 1.58422265e-09, 5.85710e-11]]

The eigenvectors from python are

    [[-7.15807155e-04,  9.91354763e-01, -1.31206788e-01],
     [-5.40831983e-03, -1.31208740e-01, -9.91340011e-01],
     [9.99985119e-01,  2.21572611e-13, -5.45548378e-03]]

The eigenvectors from mathematica are

    {{-0.131207, -0.99134, -0.00545548}, 
     {0.991355, -0.131209, 2.6987*10^-13}, 
     {-0.000715807, -0.00540832, 0.999985}}

In this case the eigenvectors from Mathematica and Python agree, but...

Given

    COVMATRIX = [[2.9296875e-07, 0.0, 6.3368875e-10],
                 [0.0, 2.9296875e-07, 1.113615625e-09],
                 [6.3368875e-10, 1.113615625e-09, 5.0957159954e-11]]

The eigenvectors from python are

    [[ 2.16330513e-03,  8.69137041e-01,  4.94566602e-01],  
     [ 3.80169349e-03, -4.94571334e-01,  8.69128726e-01], 
     [-9.99990434e-01,  1.11146084e-12,  4.37410133e-03]]

Here, however, the eigenvectors from Python are opposite in sign to the eigenvectors from Mathematica, which are below:

    {{-0.494567, -0.869129, -0.0043741},  
     {0.869137, -0.494571, 1.08198*10^-12}, 
     {-0.00216331, -0.00380169, 0.99999}}
Nadler asked 20/6, 2019 at 15:34
Maybe this is relevant. – Intort

If v is an eigenvector, then by definition A*v = lambda*v.

So if v is an eigenvector, then -v is also an eigenvector, since A*(-v) = -(A*v) = -(lambda*v) = lambda*(-v).

So both approaches are correct. The point of computing eigenvectors is to obtain a set of non-collinear (linearly independent) vectors, which is what you need if you want to diagonalise your matrix, so it doesn't matter whether a routine gives you a vector or its opposite.
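
A quick numerical check of this, as a sketch using NumPy and the first COVMATRIX from the question:

    import numpy as np

    # First COVMATRIX from the question, values as given there.
    A = np.array([[2.9296875e-07,  0.0,            2.09676562e-10],
                  [0.0,            2.9296875e-07,  1.5842226562e-09],
                  [2.09676562e-10, 1.58422265e-09, 5.85710e-11]])

    evals, evecs = np.linalg.eig(A)

    for lam, v in zip(evals, evecs.T):
        # v and -v both satisfy the eigenvalue equation A v = lambda v.
        assert np.allclose(A @ v, lam * v)
        assert np.allclose(A @ (-v), lam * (-v))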

Mcgaw answered 20/6, 2019 at 15:41
I understand the calculation of the eigenvectors. The problem is that it causes incorrect results later in my calculations. I am wondering why, in some cases, Python or Mathematica returns opposite vectors? – Nadler
As explained in the answer, there are infinitely many collinear eigenvectors for each eigenvalue, and none of them is more 'correct' than any other. If your further calculations depend on having chosen a specific one among them, there must be something wrong in the way you handle them. What are you trying to do exactly? – Letendre
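
If downstream code needs a reproducible result regardless of which sign a library returns, one option is to impose a sign convention yourself. The helper below (fix_signs is a hypothetical name, and making the largest-magnitude component positive is just one possible convention) is a sketch using NumPy:

    import numpy as np

    def fix_signs(evecs):
        """Flip each eigenvector (stored as a column) so that its
        largest-magnitude component is positive."""
        evecs = np.array(evecs, dtype=float)
        for i in range(evecs.shape[1]):
            v = evecs[:, i]
            if v[np.argmax(np.abs(v))] < 0:
                evecs[:, i] = -v
        return evecs

Applying the same convention to both the NumPy columns and the (transposed) Mathematica rows should then give matching vectors up to numerical precision, once they are sorted into the same order.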

Actually, they are just different bases of the same vector space consisting of all the eigenvectors. The reason different bases turn up is probably that Python's algorithm differs from Mathematica's.
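
One way to check this numerically (a sketch, assuming NumPy, with hypothetical arrays py_vecs holding the Python eigenvectors in columns and mma_vecs holding the Mathematica eigenvectors in rows, already sorted into the same order):

    import numpy as np

    def same_up_to_sign(py_vecs, mma_vecs, tol=1e-6):
        # Unit eigenvectors spanning the same 1-D eigenspace have
        # |dot product| == 1, whichever sign each system returned.
        for v_py, v_mma in zip(np.transpose(py_vecs), mma_vecs):
            if not np.isclose(abs(np.dot(v_py, v_mma)), 1.0, atol=tol):
                return False
        return True

Note that the two systems may also order the eigenvalues differently, so the vectors may need to be matched up by eigenvalue before comparing.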

Oblast answered 20/6, 2019 at 16:08
