We've noticed that on some machines, setting the JVM option `-Xmx` (maximum heap size) sometimes causes the JVM to fail to initialize, even though the system has more than adequate RAM.
For example, on a 4 GB machine, `-Xmx1024m` fails but `-Xmx800m` works. I could understand it on a 1 GB machine, maybe even a 2 GB machine, but on a 4 GB machine, especially considering that Windows, Linux, etc. can swap RAM out, why does this fail?
I've seen a lot of threads and questions saying to reduce your max heap size, but no one can explain *why* it fails, which is what I'm really looking for.
Also, how do you tell the JVM to consume as much memory as it needs, up to a certain cap?
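For reference, this is a minimal program (class name is mine) I use to check the heap limits the running JVM actually settled on, via the standard `Runtime` reporting methods:

```java
// Prints the heap figures the running JVM settled on.
// Run with e.g.:  java -Xms256m -Xmx1024m HeapLimits
public class HeapLimits {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long mib = 1024 * 1024;
        System.out.println("max   (roughly -Xmx):   " + rt.maxMemory()   / mib + " MiB");
        System.out.println("total (committed now):  " + rt.totalMemory() / mib + " MiB");
        System.out.println("free  (of committed):   " + rt.freeMemory()  / mib + " MiB");
    }
}
```

When `-Xmx1024m` fails, the JVM never gets far enough to print anything; with `-Xmx800m` the `max` line shows a value close to 800 MiB.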
Some systems let you set `-Xmx` to terabytes even though they haven't got nearly as much swap space. Could you please share the exact commands and JVM versions you tested? – Candicandia

`-Xmx` sets the maximum heap size, which is simply the size of the reserved range of virtual address space; only the amount specified in `-Xms` is actually backed by committed storage. For example, see `VirtualAlloc` and compare the `MEM_RESERVE` and `MEM_COMMIT` flags. – Dimeter

The JVM reserves the entire range of virtual address space up front (sized by the `-Xmx` value), but only maps actual pages of RAM and/or swap into that address space as it needs them (starting with the `-Xms` value). This is an implementation limitation resulting from optimization -- it allows the entire Java heap to be treated like a single gigantic array by the garbage collector. – Dimeter