Loading a large hprof into jhat

I have a 6.5 GB hprof file that was dumped by a 64-bit JVM using the -XX:+HeapDumpOnOutOfMemoryError option. It is sitting on a 16 GB, 64-bit machine, and I am trying to load it into jhat, but jhat keeps running out of memory. I have tried passing JVM arguments for the minimum heap settings, but it rejects any minimum, and it seems to run out of memory before reaching the maximum.

It seems kind of silly that a JVM that runs out of memory dumps a heap so large it can't be loaded on a box with twice as much RAM. Are there any ways of getting this running, or of amortizing the analysis?
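
For reference, the enabling form of that flag uses a plus sign; a typical launch, with a placeholder application class, heap size, and dump path, looks something like this:

    java -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp/dumps -Xmx4g MyApp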

Miche answered 2/12, 2009 at 20:55 Comment(1)
See also #7254517 – Astute

I would take a look at the Eclipse Memory Analyzer (MAT). The tool is great, and I have looked at several multi-gigabyte heaps with it. The nice thing about it is that it builds indexes on the dump, so the whole dump doesn't have to be in memory at once.
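
If MAT itself runs short of memory while building its indexes, you can raise its own heap in MemoryAnalyzer.ini; the -Xmx value below is only an example:

    -vmargs
    -Xmx10g

The standalone MAT download also ships a headless ParseHeapDump script (the exact name varies by platform) that builds the indexes and a leak-suspects report without opening the GUI, with an invocation along the lines of:

    ./ParseHeapDump.sh myheap.hprof org.eclipse.mat.api:suspects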

Imposing answered 2/12, 2009 at 21:58 Comment(2)
This worked. I actually tried it before with some smaller heap dumps and it really didn't give me any helpful info, but once I loaded the HeapDumpOnOutOfMemoryError hprof, it finally pointed out the exact problem. – Miche
Unfortunately MAT still used quite a lot of RAM for me (it failed with "Java heap space"); see #7254517 – Astute

Launch jhat with the equivalent of jhat -J-d64 -J-mx16g myheap.hprof; this starts jhat in 64-bit mode with a maximum heap size of 16 gigabytes.

If the JVM on your platform defaults to 64-bit-mode operation, then the -J-d64 option should be unnecessary.
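
Once jhat finishes parsing, it serves the results over HTTP on port 7000 by default, so a full invocation might look like the following; the 12 GB heap here is only an example, sized to leave some headroom for the OS on a 16 GB box:

    jhat -J-d64 -J-mx12g -port 7000 myheap.hprof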

Worley answered 3/12, 2010 at 0:51 Comment(0)

I had to load an 11 GB hprof file and couldn't do it with Eclipse Memory Analyzer. What I ended up doing was writing a program that reduced the size of the hprof file by randomly removing instance information. Once I got the hprof file down to 1 GB, I could open it with Eclipse Memory Analyzer and get a clue about what was causing the memory leak.
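
The tool itself isn't shared here, but a rough sketch of one possible approach follows (in Java, with an illustrative class name and keep fraction). Instead of removing individual instances, it streams the top-level hprof records and randomly drops whole HEAP DUMP SEGMENT records, so object references will dangle and the counts will be skewed; the only goal is to shrink the file enough for a memory analyzer to open it.

    import java.io.*;
    import java.util.Random;

    // Rough sketch: shrink an .hprof file by randomly dropping whole
    // HEAP DUMP SEGMENT records (tag 0x1C). The result is lossy and only
    // intended to get a huge dump small enough to open.
    public class HprofShrinker {
        private static final int HEAP_DUMP_SEGMENT = 0x1C;

        public static void main(String[] args) throws IOException {
            double keepFraction = 0.15;  // keep roughly 15% of the heap segments
            Random rnd = new Random();

            try (DataInputStream in = new DataInputStream(
                     new BufferedInputStream(new FileInputStream(args[0])));
                 DataOutputStream out = new DataOutputStream(
                     new BufferedOutputStream(new FileOutputStream(args[1])))) {

                // Header: null-terminated version string, then identifier size (u4)
                // and a timestamp (2 x u4) -- copy it through unchanged.
                int b;
                while ((b = in.read()) > 0) out.write(b);
                out.write(0);
                copy(in, out, 12);

                // Each record: u1 tag, u4 time, u4 body length, then the body.
                int tag;
                byte[] head = new byte[8];
                while ((tag = in.read()) != -1) {
                    in.readFully(head);
                    long len = ((head[4] & 0xFFL) << 24) | ((head[5] & 0xFFL) << 16)
                             | ((head[6] & 0xFFL) << 8)  |  (head[7] & 0xFFL);

                    if (tag == HEAP_DUMP_SEGMENT && rnd.nextDouble() > keepFraction) {
                        skip(in, len);               // drop this heap segment entirely
                    } else {
                        out.write(tag);
                        out.write(head);
                        copy(in, out, len);          // pass every other record through
                    }
                }
            }
        }

        private static void copy(InputStream in, OutputStream out, long n) throws IOException {
            byte[] buf = new byte[64 * 1024];
            while (n > 0) {
                int r = in.read(buf, 0, (int) Math.min(buf.length, n));
                if (r < 0) throw new EOFException("truncated hprof record");
                out.write(buf, 0, r);
                n -= r;
            }
        }

        private static void skip(InputStream in, long n) throws IOException {
            while (n > 0) {
                long s = in.skip(n);
                if (s <= 0) {
                    if (in.read() < 0) throw new EOFException("truncated hprof record");
                    s = 1;
                }
                n -= s;
            }
        }
    }

Run it as java HprofShrinker big.hprof small.hprof and adjust the keep fraction until the output is small enough to open.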

Scarito answered 15/9, 2011 at 15:13 Comment(1)
Care to share the tool, or details on how to write a similar one? – Latin

What flags are you passing to jhat? Make sure that you're in 64-bit mode and you're setting the heap size large enough.

Shop answered 2/12, 2009 at 21:1 Comment(0)
