I was using scipy to do a sparse-matrix SVD on some large data. The matrix is around 200,000 x 8,000,000 in size, with 1.19% non-zero entries. The machine I was using has 160G of memory, so I suppose memory shouldn't be an issue.
Here is the code I used:
from scipy.sparse import coo_matrix
import scipy.sparse.linalg as slin

K = 1500
# row, col, value, M, N are loaded elsewhere
coom = coo_matrix((value, (row, col)), shape=(M, N))
coom = coom.astype('float32')
u, s, v = slin.svds(coom, K, ncv=8 * K)
The error message is:
Traceback (most recent call last):
File "sparse_svd.py", line 35, in <module>
u,s,v=slin.svds(coom,K,ncv=2*K+1)
File "/usr/lib/python2.7/dist-packages/scipy/sparse/linalg/eigen/arpack/arpack.py", line 731, in svds
eigvals, eigvec = eigensolver(XH_X, k=k, tol=tol**2)
File "/usr/lib/python2.7/dist-packages/scipy/sparse/linalg/eigen/arpack/arpack.py", line 680, in eigsh
params.iterate()
File "/usr/lib/python2.7/dist-packages/scipy/sparse/linalg/eigen/arpack/arpack.py", line 278, in iterate
raise ArpackError(self.info)
scipy.sparse.linalg.eigen.arpack.arpack.ArpackError: ARPACK error 3: No shifts could be applied during a cycle of the Implicitly restarted Arnoldi iteration. One possibility is to increase the size of NCV relative to NEV.
When K=1000 (i.e. 1000 singular values) everything is fine. When I try K>=1250 the error begins to appear. I have also tried various ncv values and still get the same error message.
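For reference, here is a small runnable sketch of the same call at toy sizes (the dimensions, density, and k below are stand-ins, not my real data). ARPACK requires k < ncv <= min(M, N), which is easy to satisfy at this scale:

```python
import numpy as np
from scipy.sparse import random as sparse_random
from scipy.sparse.linalg import svds

# Stand-in sizes for the real 200,000 x 8,000,000 matrix, same density
M, N = 200, 300
A = sparse_random(M, N, density=0.0119, format='coo',
                  dtype=np.float32, random_state=0)

k = 10                        # number of singular values requested
# ncv must satisfy k < ncv <= min(M, N); a larger ncv gives
# the implicitly restarted Arnoldi iteration more working room
u, s, vt = svds(A, k=k, ncv=4 * k)

print(u.shape, s.shape, vt.shape)   # (200, 10) (10,) (10, 300)
```

At these sizes the call converges without complaint; the failure only shows up for me at the full scale once K reaches about 1250.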
Any suggestions and help appreciated. Thanks a lot :)