Our current caching implementation stores large amounts of data in report objects (50 MB in some cases).
We’ve moved from an in-memory cache to a file cache and use protobuf to serialize and deserialize. This works well; however, we are now experimenting with a Redis cache. Below is an example of how much longer Redis takes than the file system. (Note: using protobuf instead of JsonConvert improves the set time to 15 seconds and the get time to 4 seconds in the example below, when storing a byte array; see the sketch after the Redis example.)
// Extremely SLOW – caching via Redis (JsonConvert to serialize/deserialize)
// Requires: using StackExchange.Redis; using Newtonsoft.Json;
IDatabase cache = Connection.GetDatabase();
// Set: 23 seconds!
cache.StringSet("myKey", JsonConvert.SerializeObject(bigObject));
// Get: 5 seconds!
BigObject redisResult = JsonConvert.DeserializeObject<BigObject>(cache.StringGet("myKey"));
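For reference, the protobuf variant mentioned in the note above looks roughly like this (a minimal sketch using protobuf-net; it assumes BigObject is decorated with [ProtoContract]/[ProtoMember] attributes, and it is not the exact production code):

// Faster, but still slow – Redis with protobuf-net serializing to a byte array
// (sketch of the variant mentioned in the note above)
byte[] payload;
using (var ms = new MemoryStream())
{
    ProtoBuf.Serializer.Serialize(ms, bigObject);
    payload = ms.ToArray();
}
// Set: ~15 seconds
cache.StringSet("myKey", payload); // RedisValue converts implicitly from byte[]
// Get: ~4 seconds
byte[] raw = cache.StringGet("myKey");
using (var ms = new MemoryStream(raw))
{
    BigObject protoResult = ProtoBuf.Serializer.Deserialize<BigObject>(ms);
}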
// FAST – caching via the file system (protobuf to serialize/deserialize)
IDataAccessCache fileCache = new DataAccessFileCache();
// Set: 0.5 seconds
fileCache.SetCache("myKey", bigObject);
// Get: 0.5 seconds
BigObject fileResult = fileCache.GetCache<BigObject>("myKey");
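For context, the file cache is essentially a thin protobuf-net wrapper over the file system, roughly along these lines (a hypothetical sketch; the actual DataAccessFileCache implementation may differ):

// Minimal sketch of the file cache – hypothetical, for illustration only
using System.IO;
using ProtoBuf;

public interface IDataAccessCache
{
    void SetCache<T>(string key, T value);
    T GetCache<T>(string key);
}

public class DataAccessFileCache : IDataAccessCache
{
    private readonly string _cacheDir =
        Path.Combine(Path.GetTempPath(), "DataAccessCache");

    public DataAccessFileCache()
    {
        // Ensure the cache directory exists before any reads/writes
        Directory.CreateDirectory(_cacheDir);
    }

    public void SetCache<T>(string key, T value)
    {
        // Serialize straight to the file stream – no intermediate string/byte[]
        using (var fs = File.Create(PathFor(key)))
        {
            Serializer.Serialize(fs, value);
        }
    }

    public T GetCache<T>(string key)
    {
        using (var fs = File.OpenRead(PathFor(key)))
        {
            return Serializer.Deserialize<T>(fs);
        }
    }

    private string PathFor(string key) => Path.Combine(_cacheDir, key + ".bin");
}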
Thanks in advance for any help.
P.S. I didn’t find an answer in similar questions that have already been asked, e.g. Caching large objects - LocalCache performance