Does the Scala compiler optimize for memory usage by removing references to vals used only once within a block?
Imagine an object holding some huge aggregate of data - reaching a size where cloning the data, or derivatives of it, may well exhaust the maximum amount of memory available to the JVM or the machine.
A minimal code example, but imagine a longer chain of data transforms:
val huge: HugeObjectType
val derivative1 = huge.map(_.x)
val derivative2 = derivative1.groupBy(....)
Will the compiler, for instance, leave huge marked as eligible for garbage collection after derivative1 has been computed? Or will it keep huge alive until the enclosing block is exited?
Immutability is nice in theory, and I personally find it addictive. But for big data objects that can't be stream-processed item by item on current-day operating systems, I would claim that immutability is inherently impedance-mismatched with reasonable memory utilization for a big data application on the JVM - unless compilers optimize for cases such as this one.
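For context, one way I can imagine sidestepping the question entirely is to confine each intermediate value to its own scope, so that no local reference to it survives once that scope returns - regardless of what the compiler or JIT does with liveness analysis. A minimal sketch of that pattern (the names Item and process are hypothetical, standing in for HugeObjectType and the transform chain above):

```scala
// Hypothetical stand-in for HugeObjectType's elements.
case class Item(x: Int, label: String)

// Each intermediate lives only inside this method's frame; once process
// returns, neither `data` nor `derivative1` is referenced by the caller's
// frame, so only the final result keeps memory reachable.
def process(data: Vector[Item]): Map[Int, Vector[Int]] = {
  val derivative1 = data.map(_.x)   // last use of `data`
  derivative1.groupBy(identity)     // last use of `derivative1`
}

val huge = Vector(Item(1, "a"), Item(2, "b"), Item(1, "c"))
val result = process(huge)
// After this point, `huge` is still referenced here (the caller holds it),
// but the intermediates built inside process are unreachable.
```

Whether the intermediates inside a single block are collectible before the block exits is exactly what the question asks; the method-extraction pattern merely makes the answer independent of any such optimization.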