I'm working on a single machine with 512GB RAM (addressed by several AMD Opteron 6212 CPUs), of which about 300GB is currently free. I'm launching a large Java computation with
java path/to/myApp -Xms280g -Xmx280g > output.txt
This should make the JVM reserve 280GB immediately and report an error if that fails. Strangely, no error occurs, yet top shows a memory usage of only 30.4GB and the program doesn't crash. How can this happen? Isn't Java supposed to crash if the initial heap size cannot be allocated?
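To see what heap the JVM actually granted, a minimal sketch (the class name HeapReport is mine, not from the original post) can print the Runtime figures when started with the same flags:

// HeapReport.java - hypothetical helper, not part of the original post.
// Run with the flags under test, e.g.: java -Xms280g -Xmx280g HeapReport
public class HeapReport {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long mb = 1024L * 1024L;
        System.out.println("max heap   (MB): " + rt.maxMemory() / mb);   // effective -Xmx
        System.out.println("total heap (MB): " + rt.totalMemory() / mb); // committed; ~ -Xms at startup
        System.out.println("free heap  (MB): " + rt.freeMemory() / mb);
    }
}

If max heap comes back near 30GB no matter what the flags say, the options are not reaching the VM at all; note that the java launcher only treats options placed before the class or jar name as VM options, and anything after path/to/myApp is passed to the application instead.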
Sure enough, I get OutOfMemoryError (Java heap space / GC overhead limit exceeded) once the 30.4GB is full, well before the 280GB is ever reached. Running with 250GB or 300GB yields a similar limit of 30.3 to 30.4GB. I'm running the OpenJDK 64-Bit Server VM with the OpenJDK Runtime Environment (IcedTea6) on Gentoo Linux, and there is plenty of free RAM (over 300GB).
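Independently of top, a rough probe (my sketch, not from the original post) can measure where the effective ceiling sits by retaining fixed-size blocks until the heap gives out:

import java.util.ArrayList;
import java.util.List;

// HeapProbe.java - hypothetical probe, not part of the original post.
// Retains 64MB blocks until OutOfMemoryError, then reports the total,
// which should land close to the effective heap limit (~30.4GB here).
public class HeapProbe {
    public static void main(String[] args) {
        List<byte[]> blocks = new ArrayList<byte[]>();
        long allocated = 0;
        try {
            while (true) {
                blocks.add(new byte[64 * 1024 * 1024]);
                allocated += 64L * 1024 * 1024;
            }
        } catch (OutOfMemoryError e) {
            blocks = null; // drop the blocks so the println below can allocate
            System.out.println("OOM after ~" + (allocated / (1024 * 1024)) + " MB");
        }
    }
}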
What is the output of java -version? Also, you may find this useful. – Rokach

When I try this, free -m only drops by 60MB. That doesn't really help, but it could at least partially explain why it's not allocating all of it (using HotSpot 1.7.0_25 on Debian). Quick thought here: are you sure you can allocate 30+GB of memory? (Check ulimit -a.) – Lyonnesse
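For what it's worth, Linux commits heap pages lazily: -Xms reserves address space, but free -m and the RES column in top only grow once the JVM actually writes to the pages. A small demo (my sketch, not from the thread; run it with something like java -Xms10g -Xmx10g LazyCommit) makes this visible:

import java.util.ArrayList;
import java.util.List;

// LazyCommit.java - hypothetical demo, not part of the thread.
// RES in top starts far below the -Xms value and then grows by roughly
// 1GB per step, because the kernel only backs heap pages with real
// memory once they are written to, not when the JVM reserves them.
public class LazyCommit {
    public static void main(String[] args) throws InterruptedException {
        List<byte[]> retained = new ArrayList<byte[]>();
        for (int step = 1; step <= 8; step++) {
            retained.add(new byte[1 << 30]); // retain one more 1GB array
            System.out.println("retained " + step + " GB; compare top / free -m now");
            Thread.sleep(5000);
        }
    }
}

If your HotSpot build supports it, -XX:+AlwaysPreTouch touches every heap page at startup, so the full -Xms reservation shows up in RES immediately.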
ulimit -a lists max memory size and virtual memory both as unlimited. The command free -m shows the same results as the ones displayed in top. – Ischia

"top only shows the physical memory, but the kernel could pass some pages to the swap partition; use jconsole to check the real JVM memory usage" (he posted this as an answer and not as a comment, so it was deleted; that's why you can't see it). – Talky
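Since jconsole just reads the platform MXBeans, the same figures can also be printed from inside the process; a minimal sketch (the class name HeapStats is mine):

import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.lang.management.MemoryUsage;

// HeapStats.java - hypothetical snippet, not from the thread.
// Prints the heap numbers jconsole displays: init roughly corresponds
// to -Xms, max to -Xmx, and committed is what the OS actually backs.
public class HeapStats {
    public static void main(String[] args) {
        MemoryMXBean bean = ManagementFactory.getMemoryMXBean();
        MemoryUsage heap = bean.getHeapMemoryUsage();
        long mb = 1024L * 1024L;
        System.out.println("init      (MB): " + heap.getInit() / mb);
        System.out.println("used      (MB): " + heap.getUsed() / mb);
        System.out.println("committed (MB): " + heap.getCommitted() / mb);
        System.out.println("max       (MB): " + heap.getMax() / mb);
    }
}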