I am running 64-bit R 3.1 on 64-bit Ubuntu with 400GB of RAM, and I am encountering a strange limitation when dealing with large matrices.
I have a numeric matrix called A that is 4,000 rows by 950,000 columns. When I try to access any element in it, I receive the following error:
Error: long vectors not supported yet: subset.c:733
Although my matrix was read in via scan(), you can replicate the problem with the following code:
test <- matrix(1, 4000, 900000)  # no error
test[1, 1]                       # error
My searching reveals this was a common error message prior to R 3.0, when 2^31 - 1 elements was the hard limit for any vector. However, that limit should no longer apply, given my environment.
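For what it's worth, the example matrix really is past that old limit, so it is stored internally as a long vector. A quick sanity check on the test object created above:

.Machine$integer.max                 # 2147483647, the old vector length limit
length(test)                         # 3.6e9 elements, returned as a double
length(test) > .Machine$integer.max  # TRUE, so test is a long vector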
Should I not be using the native matrix type for this kind of matrix?
Comments:

test[1] works, as well as test[,1][1]. Even test[1:2,1:2] works, but not the original test[1,1]. – Lithium

The ff and bigmemory packages. – Arabian

That line of subset.c is using LENGTH(x), whereas the block just above it is using XLENGTH(x). As mentioned, it's a work in progress. – Goulash

[…] LENGTH and XLENGTH. – Goulash

They return R_len_t (standard vectors) and R_xlen_t (long vector support), respectively. – Conti
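As a follow-up to the package suggestion above, here is a minimal sketch of the bigmemory route, assuming the bigmemory package is installed (the dimensions are just the ones from my example, and an in-memory versus file-backed matrix would depend on the workflow):

library(bigmemory)  # not part of base R; assumed to be installed

# a big.matrix keeps its data outside R's ordinary vector heap, so element
# access goes through bigmemory's own methods rather than the base-R
# subsetting code path named in the error above
big <- big.matrix(nrow = 4000, ncol = 900000, type = "double", init = 1)
big[1, 1]      # single-element access
big[1:2, 1:2]  # small sub-blocks come back as ordinary R matrices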