Error java.lang.OutOfMemoryError: GC overhead limit exceeded
Asked Answered

24

961

I get this error message as I execute my JUnit tests:

java.lang.OutOfMemoryError: GC overhead limit exceeded

I know what an OutOfMemoryError is, but what does GC overhead limit mean? How can I solve this?

Hightail answered 8/9, 2009 at 11:34 Comment(11)
This sounds very interesting. I'd love if someone could post some code that generates this.Sayre
I found the problem that led to excessive memory usage, close to the heap limit. A simple solution could be to give the JVM some more heap memory (-Xmx), but this only helps if the application's actual need is only slightly above the previous heap limit.Hightail
also check out xmlandmore.blogspot.com/2011/05/…Anabasis
@Hightail I gave an answer here, check whether it helps #11092016Hamby
@SimonKuang Note that there are multiple OutOfMemoryError scenarios for which increasing the heap isn't a valid solution: running out of native threads and running out of perm gen (which is separate from heap) are two examples. Be careful about making overly broad statements about OutOfMemoryErrors; there's an unexpectedly diverse set of things that can cause them.Cotta
How did you solve the issue??Cohberg
This error happened and is still happening for me with JDK 1.8.0_91.Meraree
@Thorsten Niehues: Well, I solved it basically by using less memory. As the top answer shows, this is a situation with very little remaining memory while creating a lot of temporary objects. Basically there is no way around it; you have to reduce memory usage.Hightail
@Sayre I just had this error on Hadoop 3.2.0 running a local (non-cluster) job of sorting 28GB of data. It was working for several hours and exploded with this error just before the end of job. Normally this dataset with this map-reduce task completes okay.Chekhov
For future visitors - Any answer which simply tells you to increase the heap size (search for javaMaxHeapSize or Xmx) might solve your problem, if you are working with that amount of data. But you really need to look at your code to limit data usage. Try sampling a smaller amount of data or limiting the records you process. If you have no other option, run your code on a machine in the cloud which can provide you with as much memory as you want.Directrix
@Directrix Machines in the cloud are still physical computers somewhere. Each type you can rent has a certain amount of RAM. Another consideration is that larger machines will cost way more money than smaller ones.Homology
869

This message means that for some reason the garbage collector is taking an excessive amount of time (by default 98% of all CPU time of the process) and recovering very little memory with each run (by default less than 2% of the heap).

This effectively means that your program stops making any progress and is busy doing nothing but garbage collection the whole time.

To prevent your application from soaking up CPU time without getting anything done, the JVM throws this Error so that you have a chance of diagnosing the problem.

The rare cases where I've seen this happen are cases where some code was creating tons of temporary objects and tons of weakly-referenced objects in an already very memory-constrained environment.
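A minimal sketch of that pattern (the class name GcOverheadDemo is made up for this example). Run it with a deliberately small heap, e.g. java -Xmx32m GcOverheadDemo; depending on the JVM version and the collector in use you may get this error or a plain "Java heap space" one instead.

import java.util.HashMap;
import java.util.Map;

public class GcOverheadDemo {
    public static void main(String[] args) {
        // Almost everything stays reachable through the map, so each GC pass
        // frees very little, while the string concatenation keeps producing
        // short-lived garbage that keeps the collector busy.
        Map<Integer, String> retained = new HashMap<>();
        int i = 0;
        while (true) {
            retained.put(i++, "value-" + System.nanoTime());
        }
    }
}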

Check out the Java GC tuning guide, which is available for various Java versions and contains sections about this specific problem.

Haemostatic answered 8/9, 2009 at 11:39 Comment(14)
Would it be correct to summarise your answer as follows: "It's just like an 'Out of Java Heap space' error. Give it more memory with -Xmx." ?Initiate
@Tim: No, that wouldn't be correct. While giving it more memory could reduce the problem, you should also look at your code and see why it produces that amount of garbage and why your code skims just below the "out of memory" mark. It's often a sign of broken code.Haemostatic
Thanks, it seems Oracle isn't actually that good in data migration, they broke the link.Haemostatic
I am investigating the same problem in an application running on WebLogic. It runs on a server shared with other applications running on WebLogic. Does the error mean it always has to do with my application, so I can rule out problems in the other applications on the same server? Or is there a possibility that other applications can interfere with your environment? Just asking because it's hard to find the cause of memory leaks.Marybethmaryellen
@Guus: if multiple applications run in the same JVM, then yes, they can easily influence each other. It'll be hard to tell which one is misbehaving. Separating the applications into distinct JVMs might be the easiest solution.Haemostatic
@Joachim: server is located at a client. I checked with them and the applications do run in separate JVMs. I stress tested the app on our server and cannot get it to go out of memory. Could another process (java or non java) on the server somehow be the cause of my app to go out of memory?Marybethmaryellen
@Guus: no, especially not with the error message discussed here. It's more likely to be an artifact of configuration and/or specific loads that trigger the problem. But you really ought to ask this in a separate question (with as much detail as possible), it's getting too much for the comments here.Haemostatic
@TimCooper - that's honestly a poor answer even for the Out of Java Heap space error, though it's certainly sometimes necessary. To trigger this Error, however, you really have to be beating up the JVM, it's quite good at efficiently collecting garbage. If you're seeing this error, it is far more likely you're doing something violently cruel to the JVM than it is that you're simply overloading the heap.Euphoria
Is this specific to Java 6? Does the same issue happen in Java 7?Hydrophobic
@TimCooper: Just giving more memory is often quite a blunt tool for resolving issues like this. It's often more useful to look first at whether you create a lot of new objects, but also at whether your memory is properly partitioned. Often the problem is that one of the three areas is at its upper limit, but the others have plenty of free space. Then re-partitioning the JVM memory pools would help.Ullman
I'd just had this happen to me with Java 7 and a web application containing 2001670 lines of Java code, of which I wrote about 5. "You should also look at your code" is not so easy in such cases.Tully
Looking for help with my issue, I found this #110583. Is the GC affected by whether an object is created inside or outside a loop?Caboodle
Today I study different GCs for the same code. SerialGC does not have this problem, but ParallelGC does. I don't know the reason yet.Shoemaker
@TimCooper, I had an Out of Java Heap space error at first, then I increased its memory, and now I'm getting the GC overhead error! :DJohny
251

Quoting from Oracle's article "Java SE 6 HotSpot[tm] Virtual Machine Garbage Collection Tuning":

Excessive GC Time and OutOfMemoryError

The parallel collector will throw an OutOfMemoryError if too much time is being spent in garbage collection: if more than 98% of the total time is spent in garbage collection and less than 2% of the heap is recovered, an OutOfMemoryError will be thrown. This feature is designed to prevent applications from running for an extended period of time while making little or no progress because the heap is too small. If necessary, this feature can be disabled by adding the option -XX:-UseGCOverheadLimit to the command line.
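For completeness, disabling the check looks something like this on the command line (MyApp is a placeholder class name). This does not fix the underlying memory problem; the JVM simply keeps collecting and will typically fail later with a plain heap-space error:

java -Xmx512m -XX:-UseGCOverheadLimit MyApp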

EDIT: looks like someone can type faster than me :)

Idellaidelle answered 8/9, 2009 at 11:43 Comment(8)
"You can turn this off..." but the OP most likely should not do this.Weatherley
Can you tell me the difference between "-XX" and "-Xmx"? I was able to turn it off using the "-Xmx" option too.Danyelldanyelle
Replying to a very old comment here, but... @Bart The -XX: at the start of several command line options is a flag of sorts indicating that this option is highly VM-specific and unstable (subject to change without notice in future versions). In any case, the -XX:-UseGCOverheadLimit flag tells the VM to disable GC overhead limit checking (actually "turns it off"), whereas your -Xmx command merely increased the heap. In the latter case the GC overhead checking was still running, it just sounds like a bigger heap solved the GC thrashing issues in your case (this will not always help).Deficit
In my application (reading a large Excel file in Talend) this did not work, and from other users' explanations I understand why. This just disables the error, but the problem persists and your application will still spend most of its time handling GC. Our server had plenty of RAM, so I used the suggestions by Vitalii to increase the heap size.Lindsay
You will eventually get this error if your application is data intensive; clearing memory and avoiding leaks is the best way out, but it requires some time.Snook
Before trying out any of the above things I would suggest closing Android Studio and killing all Java/JVM-related processes (or restarting your system). One of the reasons for this error is that way too many Java processes are running and the GC is not able to run properly. Now open Android Studio and try building again; if it still doesn't work you can increase the heap size as mentioned in earlier answers.Gainer
I ended up having to use this option for a maven build that ate up around 4G of memory. I tried increasing the heap size with -Xmx8192M, but this flag is the only thing that worked.Tactical
My jenkins slave is using : java -Xmx50G -jar slave.jar still facing the issue. Any help here?Khichabia
109

If you are sure there are no memory leaks in your program, try to:

  1. Increase the heap size, for example -Xmx1g.
  2. Enable the concurrent low pause collector -XX:+UseConcMarkSweepGC.
  3. Reuse existing objects when possible to save some memory.

If necessary, the limit check can be disabled by adding the option -XX:-UseGCOverheadLimit to the command line.
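Combining the first two suggestions on a command line would look roughly like this (the jar name is a placeholder). Note that CMS was deprecated in JDK 9 and removed in JDK 14, so on recent JDKs you would pick another collector such as G1 instead:

java -Xmx1g -XX:+UseConcMarkSweepGC -jar your-app.jar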

Nightclub answered 14/12, 2011 at 1:44 Comment(3)
I disagree with the third piece of advice. Reusing existing objects does not save memory (not leaking old objects saves memory :-). Moreover, "reuse existing objects" was a practice to relieve GC pressure. But it's NOT ALWAYS a good idea: with modern GCs, we should avoid situations where old objects hold new ones because it can break some locality assumptions...Elwell
@mcoolive: For a somewhat contrived example, see the comments to answer https://mcmap.net/q/53308/-error-java-lang-outofmemoryerror-gc-overhead-limit-exceeded below; creating the List object inside the loop caused GC to be called 39 times instead of 22 times.Adequate
It's funny to call a 1 GB heap limit an increase. Requirements were quite different in the previous decade.Aparicio
58

It's usually the code. Here's a simple example:

import java.util.*;

public class GarbageCollector {

    public static void main(String... args) {

        System.out.printf("Testing...%n");
        List<Double> list = new ArrayList<Double>();
        for (int outer = 0; outer < 10000; outer++) {

            // list = new ArrayList<Double>(10000); // BAD
            // list = new ArrayList<Double>(); // WORSE
            list.clear(); // BETTER

            for (int inner = 0; inner < 10000; inner++) {
                list.add(Math.random());
            }

            if (outer % 1000 == 0) {
                System.out.printf("Outer loop at %d%n", outer);
            }

        }
        System.out.printf("Done.%n");
    }
}

Using Java 1.6.0_24-b07 on Windows 7, 32-bit.

java -Xloggc:gc.log GarbageCollector

Then look at gc.log

  • Triggered 444 times using BAD method
  • Triggered 666 times using WORSE method
  • Triggered 354 times using BETTER method

Now granted, this is not the best test or the best design, but when you are faced with a situation where you have no choice but to implement such a loop, or when dealing with existing code that behaves badly, choosing to reuse objects instead of creating new ones can reduce the number of times the garbage collector gets in the way...
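As a side note, on JDK 9 and later the -Xloggc option used above was superseded by unified logging, so an equivalent run would look roughly like this:

java -Xlog:gc:gc.log GarbageCollector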

Overlook answered 12/4, 2011 at 19:27 Comment(3)
Please clarify: When you say "Triggered n times", does that mean that a regular GC happened n times, or that the "GC overhead limit exceeded" error reported by the OP happened n times?Condemnation
I tested just now using java 1.8.0_91 and never got an error/exception, and the "Triggered n times" was from counting up the number of lines in the gc.log file. My tests show much fewer times overall, but fewest "Triggers" times for BETTER, and now, BAD is "badder" than WORST now. My counts: BAD: 26, WORSE: 22, BETTER 21.Adequate
I just added a "WORST_YET" modification where I define the List<Double> list in the outer loop instead of before the outer loop, and Triggered 39 garbage collections.Adequate
39

Cause for the error according to the Java [8] Platform, Standard Edition Troubleshooting Guide: (emphasis and line breaks added)

[...] "GC overhead limit exceeded" indicates that the garbage collector is running all the time and Java program is making very slow progress.

After a garbage collection, if the Java process is spending more than approximately 98% of its time doing garbage collection and if it is recovering less than 2% of the heap and has been doing so for the last 5 (compile time constant) consecutive garbage collections, then a java.lang.OutOfMemoryError is thrown. [...]

  1. Increase the heap size if the current heap is not enough.
  2. If you still get this error after increasing heap memory, use memory profiling tools like MAT (Memory Analyzer Tool), VisualVM, etc. and fix the memory leaks.
  3. Upgrade the JDK to the latest version (1.8.x), or at least 1.7.x, and use the G1GC algorithm. The throughput goal for the G1 GC is 90 percent application time and 10 percent garbage collection time.
  4. Apart from setting heap memory with -Xms1g -Xmx2g, try the options below (a concrete example command appears at the end of this answer):

    -XX:+UseG1GC -XX:G1HeapRegionSize=n -XX:MaxGCPauseMillis=m  
    -XX:ParallelGCThreads=n -XX:ConcGCThreads=n
    

Have a look at some more related questions regarding G1GC
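One possible instantiation of the placeholder flags above, purely as an illustration (the values and jar name are examples, not recommendations; G1HeapRegionSize, for instance, must be a power of two between 1 MB and 32 MB):

java -Xms1g -Xmx2g -XX:+UseG1GC -XX:G1HeapRegionSize=16m -XX:MaxGCPauseMillis=200 -jar your-app.jar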

Kura answered 6/2, 2016 at 18:6 Comment(0)
33

Just increase the heap size a little by setting this option in

Run → Run Configurations → Arguments → VM arguments

-Xms1024M -Xmx2048M

Xms - for minimum limit

Xmx - for maximum limit
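Outside of an IDE, the same settings on a plain java command line would look roughly like this (the jar name is a placeholder):

java -Xms1024M -Xmx2048M -jar yourapp.jar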

Lights answered 13/5, 2014 at 10:40 Comment(5)
The Android apps don't have an Arguments tab... what should we do to achieve this?Spark
What tool is that answer for? That was not an Eclipse question.Fennie
There is no "minimum limit". -Xms is the initial size.Mace
What is the maximum value that the upper limit can be set to?Merth
@Merth the max is as much as the physical memory of your machine. However, other applications will compete over memory use if you try that.Dysgenic
16

Try this: open the build.gradle file and add the following:

android {
    dexOptions {
        javaMaxHeapSize = "4g"
    }
}
Melodramatic answered 23/11, 2016 at 18:9 Comment(1)
Works great for the simulator. Any idea how this affects real devices? i.e. is this a good idea or is it just masking the issue? Thanks.Christ
14

For me, the following steps worked:

  1. Open the eclipse.ini file
  2. Change

    -Xms40m
    -Xmx512m
    

    to

    -Xms512m
    -Xmx1024m
    
  3. Restart Eclipse

See here

Clamp answered 31/12, 2014 at 5:9 Comment(6)
the most simple way to fix this problem. Thanks :)Comedy
eclipse.ini file in jdev?Phonetics
The problem is still unsolved even when the configuration has been changed to this.Osculum
Problem not yet solved. Do you have or know any other way?Kliman
The OP did not ask an Eclipse question.Fennie
This "answer" does not answer the question above.Foreman
13

The following worked for me. Just add the following snippet:

android {
    compileSdkVersion 25
    buildToolsVersion '25.0.1'

    defaultConfig {
        applicationId "yourpackage"
        minSdkVersion 10
        targetSdkVersion 25
        versionCode 1
        versionName "1.0"
        multiDexEnabled true
    }

    dexOptions {
        javaMaxHeapSize "4g"
    }
}
Tinware answered 27/12, 2016 at 11:47 Comment(2)
Yes, when using Gradle :)Hibernia
How could you even think this is a solution to his question in general? You set your heap size to 4g which is totally arbitrary in a gradle configuration for Android facepalm.Scuff
9

Increase javaMaxHeapSize in your build.gradle (Module: app) file from

dexOptions {
    javaMaxHeapSize "1g"
}

to

dexOptions {
    javaMaxHeapSize "4g"
}
Forwards answered 7/4, 2017 at 12:27 Comment(0)
8

Solved: just add

org.gradle.jvmargs=-Xmx1024m

to gradle.properties, and if the file does not exist, create it.

Grotesque answered 22/9, 2019 at 9:53 Comment(0)
6

You can also increase memory allocation and heap size by adding this to your gradle.properties file:

org.gradle.jvmargs=-Xmx2048M -XX\:MaxHeapSize\=32g

It doesn't have to be 2048M and 32g, make it as big as you want.

Sweatbox answered 23/7, 2019 at 21:3 Comment(0)
5

Java heap size descriptions (xms, xmx, xmn)

-Xms size in bytes

Example : java -Xms32m

Sets the initial size of the Java heap. The default size is 2097152 (2MB). The values must be a multiple of, and greater than, 1024 bytes (1KB). (The -server flag increases the default size to 32M.)

-Xmn size in bytes

Example : java -Xmn2m

Sets the initial Java heap size for the Eden generation. The default value is 640K. (The -server flag increases the default size to 2M.)

-Xmx size in bytes

Example : java -Xmx2048m

Sets the maximum size to which the Java heap can grow. The default size is 64M. (The -server flag increases the default size to 128M.) The maximum heap limit is about 2 GB (2048 MB) on a 32-bit JVM.

Java memory arguments (xms, xmx, xmn) formatting

When setting the Java heap size, you should specify your memory argument using one of the letters “m” or “M” for MB, or “g” or “G” for GB. Your setting won’t work if you specify “MB” or “GB.” Valid arguments look like this:

-Xms64m or -Xms64M, -Xmx1g or -Xmx1G. You can also use 2048m to specify 2 GB. Also, make sure you use whole numbers when specifying your arguments: -Xmx512m is a valid option, but -Xmx0.5g will cause an error.

This reference can be helpful for someone.

Kliman answered 10/10, 2019 at 9:9 Comment(0)
2

To increase the heap size in IntelliJ IDEA, follow these instructions. It worked for me.

For Windows users:

Go to the location where the IDE is installed and search for the following file.

idea64.exe.vmoptions

Edit the file and add the following.

-Xms512m
-Xmx2024m
-XX:MaxPermSize=700m
-XX:ReservedCodeCacheSize=480m

That is it !!

Sikang answered 14/6, 2018 at 13:16 Comment(0)
1

I'm working in Android Studio and encountered this error when trying to generate a signed APK for release. I was able to build and test a debug APK with no problem, but as soon as I wanted to build a release APK, the build process would run for minutes on end and then finally terminate with "Error java.lang.OutOfMemoryError: GC overhead limit exceeded". I increased the heap sizes for both the VM and the Android DEX compiler, but the problem persisted.

After many hours and mugs of coffee it turned out that the problem was in my app-level 'build.gradle' file: I had the 'minifyEnabled' parameter for the release build type set to 'false', so the ProGuard configuration was being applied to code that had not been through the code-shrinking process (see https://developer.android.com/studio/build/shrink-code.html). I changed the 'minifyEnabled' parameter to 'true' and the release build executed like a dream :)

In short, I had to change my app-level 'build.gradle' file from:

//...

buildTypes {
    release {
        minifyEnabled false
        proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.pro'
        signingConfig signingConfigs.sign_config_release
    }
    debug {
        debuggable true
        signingConfig signingConfigs.sign_config_debug
    }
}

//...

to

//...

buildTypes {
    release {
        minifyEnabled true
        proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.pro'
        signingConfig signingConfigs.sign_config_release
    }
    debug {
        debuggable true
        signingConfig signingConfigs.sign_config_debug
    }
}

//...
Leaguer answered 7/1, 2018 at 22:33 Comment(0)
1

You can try changing the server settings to increase the memory available to the process (the original answer referred to a screenshot with the relevant setting highlighted in yellow, which is not reproduced here).

You can also change the Java heap from the command line, e.g. open cmd and run:

set _JAVA_OPTIONS=-Xmx2g

with 2g (2 gigabytes) or more, depending upon the complexity of your program.

Also try to use fewer constants and temporary variables.

Malines answered 27/2, 2020 at 2:40 Comment(0)
1

I got this error while working with the Oracle WebLogic server. I am sharing my answer for reference in case someone ends up here looking for the solution.

So, if you are trying to bring up the Oracle WebLogic server and got this error, then you just have to increase the initial and maximum heap size set for running the server.

Go to - > C:\Oracle\Middleware\Oracle_Home\user_projects\domains\wl_server\bin

open setDomainEnv.cmd

check the USER_MEM_ARGS value; if it is lower, change it to

set USER_MEM_ARGS="-Xms128m -Xmx8192m ${MEM_DEV_ARGS} ${MEM_MAX_PERM_SIZE}"

This means that your initial heap size is set to 128 MB and the max heap size is 8 GB. Now just save the file and restart the server. If that didn't resolve the issue, try increasing the size or look for ways of optimizing the service.

For reference, check this link: https://docs.oracle.com/cd/E49933_01/server.770/es_install/src/tins_postinstall_jvm_heap.html

Edit: check whether you are able to see the updated Java args while running the server (the original answer showed a screenshot here). If they come up the same as before, replace the old value in setDomainEnv.cmd with a simple search and replace.

Swatter answered 6/11, 2021 at 18:35 Comment(0)
0

You need to increase the memory size in JDeveloper: go to setDomainEnv.cmd.

set WLS_HOME=%WL_HOME%\server
set XMS_SUN_64BIT=256
set XMS_SUN_32BIT=256
set XMX_SUN_64BIT=3072
set XMX_SUN_32BIT=3072
set XMS_JROCKIT_64BIT=256
set XMS_JROCKIT_32BIT=256
set XMX_JROCKIT_64BIT=1024
set XMX_JROCKIT_32BIT=1024

if "%JAVA_VENDOR%"=="Sun" (
    set WLS_MEM_ARGS_64BIT=-Xms256m -Xmx512m
    set WLS_MEM_ARGS_32BIT=-Xms256m -Xmx512m
) else (
    set WLS_MEM_ARGS_64BIT=-Xms512m -Xmx512m
    set WLS_MEM_ARGS_32BIT=-Xms512m -Xmx512m
)

and

set MEM_PERM_SIZE_64BIT=-XX:PermSize=256m
set MEM_PERM_SIZE_32BIT=-XX:PermSize=256m

if "%JAVA_USE_64BIT%"=="true" (
    set MEM_PERM_SIZE=%MEM_PERM_SIZE_64BIT%
) else (
    set MEM_PERM_SIZE=%MEM_PERM_SIZE_32BIT%
)

set MEM_MAX_PERM_SIZE_64BIT=-XX:MaxPermSize=1024m
set MEM_MAX_PERM_SIZE_32BIT=-XX:MaxPermSize=1024m
Galvanic answered 13/4, 2016 at 9:52 Comment(1)
These settings are only specific to your local IDE. This will not work for a Prod environment.Insured
0

In NetBeans, it may be helpful to set a max heap size. Go to Run => Set Project Configuration => Customise. In the Run category of the window that pops up, go to VM Options and fill in -Xms2048m -Xmx2048m. That could solve the heap size problem.

Liatrice answered 12/6, 2017 at 22:45 Comment(0)
0

I don't know if this is still relevant or not, but just want to share what worked for me.

Update the Kotlin version to the latest available: https://blog.jetbrains.com/kotlin/category/releases/

and it's done.

Fivestar answered 25/11, 2019 at 3:41 Comment(0)
0

@Buhb I reproduced this in a normal Spring Boot web application, within its main method. Here is the code:

public static void main(String[] args) {
    SpringApplication.run(DemoServiceBApplication.class, args);
    LOGGER.info("hello.");
    int len = 0, oldlen = 0;
    Object[] a = new Object[0];
    // Each iteration allocates a slightly larger array that references the
    // previous one, so the live set keeps growing until the heap is exhausted.
    try {
        for (; ; ) {
            ++len;
            Object[] temp = new Object[oldlen = len];
            temp[0] = a;
            a = temp;
        }
    } catch (Throwable e) {
        LOGGER.info("error: {}", e.toString());
    }
}

The sample code that causes the OutOfMemoryError is also from the Oracle Java 8 language specification.

Sabin answered 11/9, 2021 at 5:17 Comment(0)
0

If you are here for the exception raised in Tomcat, then this is for you:

It may happen when you have been working with that Tomcat instance for a long time.

Steps:

  1. Navigate to the Tomcat folder (C:/tomcat)
  2. There will be a folder named "work", containing "Catalina", which in turn contains "localhost"
  3. Delete that localhost folder
  4. Inside C:/tomcat/webapps there will be deployed files; delete them all and rebuild.

Now that heap memory issue should be sorted.

Gallipot answered 20/9, 2023 at 14:24 Comment(0)
0

For SAP BOBJ BOE SIA CMS 4.1, the xmx parameter is in the registry at location:

HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Apache Software Foundation\Procrun 2.0\SIAserverName\Parameters\Java

In the Options value it was 256m; I increased it to 2g. No more out-of-memory errors in the log file, the SIA is now stable, and CMS.exe starts up okay.

Screening answered 8/12, 2023 at 17:50 Comment(0)
-6

Rebooting my MacBook fixed this issue for me.

Chlorenchyma answered 26/2, 2019 at 17:23 Comment(0)
