I'm currently building an analysis application that handles large amounts of data. A typical case looks like this: the user selects a folder with about 600 measurement files, each containing roughly 40,000 to 100,000 values. The application reads these values into an object that internally works as a data cache, so the files don't have to be read on every access.
This works very well, but I noticed that the memory consumption is very high and may eventually get too big. During my tests the application crashed when its memory consumption exceeded 2 GB of RAM (which is the typical address-space limit for a 32-bit process).
The data structure that holds the data is as simple as possible: it basically consists of a few dictionaries that contain the data nested two levels deep, nothing complex. I was wondering whether there is a convenient way of storing this object in a compressed form in RAM. I know this would hurt performance, but that is totally acceptable in my case.
Is there a way to do something like that while still letting me use my objects as usual? Or do I have to implement compression myself inside my object?
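In case it helps to make the question concrete, here is a minimal sketch of the "implement it myself" route I have in mind (the class and method names are made up, and I'm assuming the values are doubles): each file's values are kept in RAM as a GZip-compressed byte[] and inflated on demand, trading CPU time for memory.

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.IO.Compression;

// Hypothetical sketch: one compressed byte[] per file key,
// decompressed transparently whenever the values are requested.
class CompressedValueCache
{
    private readonly Dictionary<string, byte[]> _compressed =
        new Dictionary<string, byte[]>();

    public void Store(string fileKey, double[] values)
    {
        // Reinterpret the doubles as raw bytes, then GZip them.
        var raw = new byte[values.Length * sizeof(double)];
        Buffer.BlockCopy(values, 0, raw, 0, raw.Length);

        using (var output = new MemoryStream())
        {
            using (var gzip = new GZipStream(output, CompressionMode.Compress))
                gzip.Write(raw, 0, raw.Length);
            _compressed[fileKey] = output.ToArray();
        }
    }

    public double[] Load(string fileKey)
    {
        // Inflate the stored bytes and convert them back to doubles.
        using (var input = new MemoryStream(_compressed[fileKey]))
        using (var gzip = new GZipStream(input, CompressionMode.Decompress))
        using (var output = new MemoryStream())
        {
            gzip.CopyTo(output);
            var raw = output.ToArray();
            var values = new double[raw.Length / sizeof(double)];
            Buffer.BlockCopy(raw, 0, values, 0, raw.Length);
            return values;
        }
    }
}
```

This decompresses the whole array on every access; a small cache of the most recently used arrays could soften the CPU cost, but the basic trade-off stays the same.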
Thanks for your thoughts and recommendations!
List<T> – Unconcerned

.NET 4.5 and an x64 process, setting gcAllowVeryLargeObjects in your app.config – Moises
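For reference, the setting Moises mentions is configured like this (it requires .NET 4.5 and a 64-bit process, and it lifts the 2 GB per-object size limit rather than reducing overall memory use):

```xml
<configuration>
  <runtime>
    <!-- .NET 4.5+, 64-bit only: permits single objects larger than 2 GB -->
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>
```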