The correlation matrix is so large (50,000 × 50,000) that computing what I want from it is not efficient. What I want to do is break it into groups and treat each group as a separate, smaller correlation matrix. However, how do I deal with the dependence between those smaller correlation matrices? I have been researching online all day but have found nothing. There should be some algorithm out there for approximating large correlation matrices like this, right?
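For concreteness, here is a minimal R sketch (toy group sizes and correlations, all hypothetical) of what I mean by breaking it into groups: each group gets its own small correlation matrix and its own Cholesky factor, but any correlation between variables in different groups is simply dropped, and that cross-group dependence is exactly what I don't know how to handle.

    # Toy version of the block idea (real case: ~50,000 variables, e.g. 50 groups
    # of 1,000; here two tiny groups so it runs instantly).
    set.seed(1)
    n_obs <- 200                              # number of simulated draws per variable (hypothetical)
    sizes <- c(3, 2)                          # hypothetical group sizes
    rhos  <- c(0.6, 0.4)                      # hypothetical within-group correlations

    make_block <- function(p, rho) { R <- matrix(rho, p, p); diag(R) <- 1; R }
    blocks <- Map(make_block, sizes, rhos)    # one small correlation matrix per group

    # Generate each group separately: i.i.d. normals times the group's Cholesky factor.
    X <- do.call(cbind, lapply(blocks, function(R) {
      Z <- matrix(rnorm(n_obs * ncol(R)), n_obs, ncol(R))
      Z %*% chol(R)                           # columns are now correlated within the group
    }))

    round(cor(X), 2)   # within-group correlations near 0.6 / 0.4, between-group near 0:
                       # the cross-group dependence is what this approach loses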
Can I break down a large-scale correlation matrix?
Could you be a little more specific about what exactly you're trying to achieve with the matrix? – Frissell
I want to generate 50,000 × 200 dependent variables by multiplying independent random variables by the Cholesky factor of the correlation matrix. – Lava
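For reference, the generation step that comment describes looks like this, as a minimal sketch in which the 4 × 4 correlation matrix is a hypothetical stand-in for the real 50,000 × 50,000 one:

    set.seed(1)
    R <- matrix(0.3, 4, 4); diag(R) <- 1   # tiny stand-in correlation matrix
    Z <- matrix(rnorm(200 * 4), 200, 4)    # 200 draws of 4 independent N(0, 1) variables
    X <- Z %*% chol(R)                     # multiply by the Cholesky factor, so cor(X) ~ R
    # For the full problem this needs chol() of a 50,000 x 50,000 matrix, which is
    # why the question asks about breaking it into groups.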
Even a 4 × 4 correlation matrix is sensitive to estimation error, let alone a 50,000 × 50,000 one. In any case, here are some links that might help:
http://www.oxford-man.ox.ac.uk/documents/papers/2011OMI08_Sheppard.pdf
http://www.kevinsheppard.com/images/4/47/Chapter8.pdf
http://arxiv.org/PS_cache/arxiv/pdf/1009/1009.5331v1.pdf
http://cran.r-project.org/web/packages/tawny/index.html
http://www.rinfinance.com/RinFinance2009/presentations/yollin_slides.pdf
http://nurometic.com/quantitative-finance/tawny/portfolio-optimization-with-tawny
http://quantivity.wordpress.com/2011/04/17/minimum-variance-portfolios/