Why does setting the -Xmx too high sometimes cause the JVM to fail, even if there's available RAM?

Basically, we've noticed that on some machines, setting the JVM option -Xmx (maximum heap size) sometimes causes the JVM to fail to initialize, even though the system has more than adequate RAM.

So for example, on a 4 GB machine, -Xmx1024m fails but -Xmx800m works. I could understand this on a 1 GB machine, maybe even a 2 GB machine, but on a 4 GB machine, especially considering that Windows, Linux, etc. can swap RAM out to disk, why does this fail?

I've seen a lot of threads and questions saying to reduce your max heap size, but no one can explain why it fails, which is what I'm really looking for.

Also, how do you tell the JVM to consume only as much memory as it needs, up to a certain cap?

Barbaraanne answered 5/1/2012 at 23:02
A quick search brings up users who are surprised that the JVM allows them to set -Xmx to terabytes even though they haven't got nearly that much swap space. Could you please share the exact commands and JVM versions you tested? – Candicandia
@alf, keep in mind that -Xmx sets the maximum heap size, which is simply the size of the reserved range of virtual address space; only the amount specified in -Xms is actually backed by committed storage. For example, see VirtualAlloc and compare the MEM_RESERVE and MEM_COMMIT flags. – Dimeter
@JeffreyHantin I do, Jeffrey, I do. That's why I was asking what exactly the OP did. – Candicandia
Oracle's JDK 6. But I guess the better question is: how do you tell the JVM to take as much memory as it needs? – Barbaraanne
@StephaneGrenier The standard JRE cannot automatically grow the heap by allocating new address space -- it has to preallocate the entire heap address range (the -Xmx value), but only maps actual pages of RAM and/or swap into that address space as it needs them (starting with the -Xms value). This is an implementation limitation resulting from optimization -- it allows the entire Java heap to be treated like a single gigantic array by the garbage collector. – Dimeter
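
A minimal sketch of the behavior described above, assuming a HotSpot-style JVM (the class name and sizes are illustrative, not from this thread): run it with a small -Xms and a larger -Xmx and watch totalMemory(), the committed heap, start near -Xms and grow on demand toward maxMemory(), the reserved -Xmx cap.

    import java.util.ArrayList;
    import java.util.List;

    // Run with, e.g.: java -Xms16m -Xmx256m HeapGrowth
    public class HeapGrowth {
        public static void main(String[] args) {
            Runtime rt = Runtime.getRuntime();
            long mb = 1024 * 1024;
            // maxMemory() reflects -Xmx: the reserved, not-yet-committed cap.
            System.out.println("max (reserved, -Xmx): " + rt.maxMemory() / mb + " MB");
            List<byte[]> chunks = new ArrayList<byte[]>();
            for (int i = 0; i < 10; i++) {
                chunks.add(new byte[(int) (16 * mb)]); // force the heap to grow
                // totalMemory() is the committed heap; it starts near -Xms
                // and expands step by step as allocations demand it.
                System.out.println("committed: " + rt.totalMemory() / mb
                        + " MB, used: " + (rt.totalMemory() - rt.freeMemory()) / mb + " MB");
            }
        }
    }

With flags like those, the committed figure typically climbs in steps as the collector expands the heap, while the maximum stays fixed at the -Xmx value.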

It's possible that this is due to virtual address space fragmentation. Especially in a 32-bit process, it may not be possible to reserve a contiguous 1024 MB address range for the maximum potential size of the heap, depending on the load addresses of DLLs, the locations of thread stacks, immovable native memory allocations, kernel-reserved addresses, and so forth.
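
When the reservation fails, HotSpot typically reports "Error occurred during initialization of VM" followed by "Could not reserve enough space for object heap". One way to see where the ceiling sits on a particular machine is to probe for it. Here is a rough sketch (the class name, search bounds, and approach are illustrative, not from this thread) that binary-searches for the largest -Xmx a freshly launched JVM will accept, assuming java is on the PATH:

    import java.io.InputStream;

    public class MaxHeapProbe {
        // Returns true if "java -Xmx<mb>m -version" starts successfully.
        static boolean jvmStarts(int mb) throws Exception {
            ProcessBuilder pb = new ProcessBuilder("java", "-Xmx" + mb + "m", "-version");
            pb.redirectErrorStream(true);
            Process p = pb.start();
            InputStream in = p.getInputStream();
            while (in.read() != -1) { } // drain output so the child can't block
            return p.waitFor() == 0;
        }

        public static void main(String[] args) throws Exception {
            int lo = 128, hi = 4096; // search window in MB; assumes 128 MB works
            while (lo < hi) {
                int mid = (lo + hi + 1) / 2;
                if (jvmStarts(mid)) lo = mid; else hi = mid - 1;
            }
            System.out.println("Largest working -Xmx on this machine: " + lo + "m");
        }
    }

If the result varies from boot to boot, or between machines with the same RAM, that is consistent with the fragmentation explanation: DLL load addresses and other reservations shift, changing the largest contiguous free range.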

Dimeter answered 5/1/2012 at 23:11

I came across this issue a while ago on Windows XP. On most XP machines I could allocate 1400 MB, while on others only 1200 MB. The consensus was address space fragmentation, as Jeffrey Hantin says in the other answer.

Prelate answered 5/1/2012 at 23:34
