My test code:
public class G1Test {
    public static void main(String[] args) throws InterruptedException {
        final int SIZE = 1900;
        int[][] array = new int[SIZE][];
        for (int i = 0; i < SIZE; i++) {
            array[i] = new int[1024 * 1024 / 4]; // 262144 ints * 4 bytes = 1 MB
            Thread.sleep(10);
            if (i % 100 == 0 && i != 0) {
                System.out.println(i + " MB added");
            }
        }
    }
}
I launch it on Java 8 with -Xmx2048m -XX:+UseG1GC -XX:+PrintGCDetails, and it fails with an OutOfMemoryError when only about 1 GB has been consumed:
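A quick back-of-the-envelope check (my own arithmetic, counting only the array data and ignoring object headers) says the total should fit comfortably:

long perArray = (1024 * 1024 / 4) * 4L;          // 262144 ints * 4 bytes = 1 MB of data
long totalMb = 1900L * perArray / (1024 * 1024); // ~1900 MB in total
System.out.println(totalMb + " MB expected; -Xmx is 2048 MB");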
Heap
garbage-first heap total 2097152K, used 1048100K [0x0000000080000000, 0x0000000080104000, 0x0000000100000000)
region size 1024K, 1 young (1024K), 0 survivors (0K)
Metaspace used 3273K, capacity 4496K, committed 4864K, reserved 1056768K
class space used 358K, capacity 388K, committed 512K, reserved 1048576K
I see that the G1 heap capacity is the full 2 GB, and I suppose the JVM is trying to allocate more and failing with the OOM. But why would it try to allocate more when half of the memory is free?
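To double-check what the log reports, the region size and heap usage can also be queried at runtime. This is a sketch of my own using the HotSpot-specific HotSpotDiagnosticMXBean (JDK 8, HotSpot only); it is not part of the original test:

import java.lang.management.ManagementFactory;
import java.lang.management.MemoryUsage;
import com.sun.management.HotSpotDiagnosticMXBean;

public class HeapInfo {
    public static void main(String[] args) {
        HotSpotDiagnosticMXBean hs =
            ManagementFactory.getPlatformMXBean(HotSpotDiagnosticMXBean.class);
        // region size G1 selected (the log above shows 1024K regions)
        System.out.println("G1HeapRegionSize = "
            + hs.getVMOption("G1HeapRegionSize").getValue());
        MemoryUsage heap = ManagementFactory.getMemoryMXBean().getHeapMemoryUsage();
        System.out.println("used = " + heap.getUsed() / (1024 * 1024) + " MB, "
            + "committed = " + heap.getCommitted() / (1024 * 1024) + " MB, "
            + "max = " + heap.getMax() / (1024 * 1024) + " MB");
    }
}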
With -XX:+UseConcMarkSweepGC it works fine and the array is fully filled. I checked that on both Windows and Linux; the behaviour is the same.
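For the comparison run I kept the same heap size and only swapped the collector (the command line is my reconstruction from the flags above; G1Test is the test class):

java -Xmx2048m -XX:+UseConcMarkSweepGC -XX:+PrintGCDetails G1Test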
Here is the place that definitely causes the OOM: array[i] = new int[1024 * 1024 / 4];. But usually an OOM can be thrown from any place that doesn't have anything to do with its root cause. – Demurral
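To confirm where the loop fails, the allocation can be wrapped like this (a variation of the test above, my own addition; i reports how many 1 MB arrays were actually placed before the failure):

try {
    array[i] = new int[1024 * 1024 / 4];
} catch (OutOfMemoryError e) {
    // report how far the loop got before the heap gave out
    System.out.println("OOM at iteration " + i);
    throw e;
}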