Is it possible to catch out of memory exception in java? [duplicate]

I'm developing a program that requires a huge amount of memory, and I want to catch the out-of-memory exception when it happens. I had heard this is not possible to do, but I'm curious whether there has been any development on this front.

Sendal answered 7/11, 2009 at 6:37 Comment(1)
Asking how to handle running out of memory is not quite the same as asking whether it is possible to catch an OutOfMemoryError. So this is not quite a duplicate of How to handle OutOfMemoryError in Java.Kancler

It's not an exception; it's an error: java.lang.OutOfMemoryError

You can catch it as it descends from Throwable:

try {
    // create lots of objects here and stash them somewhere
} catch (OutOfMemoryError e) {
    // release some (all) of the above objects
}

However, unless you're doing something rather specific (allocating tons of objects within one well-defined code section, for example), you likely won't be able to catch it, as you won't know where it's going to be thrown from.
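
For illustration only (this is not from the answer above): a hedged sketch of that pattern, where the try block is the one section that allocates aggressively and the catch block releases some of the very objects that were accumulated. The class and method names are invented for the example.

import java.util.ArrayList;
import java.util.List;

// Illustrative sketch only: the try block is the one section of the program
// that allocates aggressively, so an OutOfMemoryError caught here can be
// answered by releasing some of the objects we were accumulating.
final class BulkLoader {
    static List<byte[]> loadAsMuchAsPossible(int chunkSize) {
        List<byte[]> chunks = new ArrayList<>();
        try {
            while (true) {
                chunks.add(new byte[chunkSize]);   // eventually exhausts the heap
            }
        } catch (OutOfMemoryError e) {
            // Drop half of what we accumulated so the rest of the program
            // has some headroom again.
            int keep = chunks.size() / 2;
            chunks.subList(keep, chunks.size()).clear();
        }
        return chunks;
    }
}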

Corry answered 7/11, 2009 at 6:42 Comment(11)
Plus, there's likely no easy way for you to recover from it if you do catch it.Furious
@matt b - in the specific case where you're able to catch it you're presumably trying to control your memory consumption and thus would be able to release some / all of it. But generally speaking you're right, of course.Corry
@Corry (and others) - please read my answer to understand why trying to manage memory consumption by catching OOME's would be a BAD IDEA.Jonasjonathan
Thank you @Corry. But Stephen, when there is no other option, we use this try/catch.Pulsar
@BaldbcsofIT - Your assertion that there is no alternative is incorrect. There is always an alternative: restart the JVM.Jonasjonathan
Stephen is correct. Start your JVM with: -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=${DUMP_PATH} -XX:OnOutOfMemoryError="kill -9 %p" and restart it when it dies.Ptah
Contrary to what some other commenters say, there is a realistic use case where catching the OOM error is the desired solution. Imagine a web service that fails to handle the large data a client has just sent. In this case, responding with an HTTP error code is fine. The client's request won't be served, but the server stays alive. Much better than a dead web service.Ics
@CsongorHalmai - If you have to catch and recover from an OOME, you will first have triggered a full GC, which will (typically) take a long time. Sending requests that cause OOMEs could be an effective DoS attack ... whether it is done with deliberate intent to harm or by accident. Limiting request sizes is a better solution.Jonasjonathan
@StephenC You are not necessarily able to filter the memory-intensive requests simply by the size of the request. I can easily imagine a small request that needs a lot of memory, while another request of similar size needs only a little RAM to create the response. Say the request contains the URL of a jpg image that is processed by the server. Two jpg images can each have a 50-character URL, while ...1.jpg is a tiny 10x10 pixels and ...2.jpg has far more data in it. The second one may cause an OOM error, but not necessarily.Ics
So, in some cases the defensive code needs to be inside the application's request processing logic. But the same strategy applies. If certain requests / request types are able to trigger OOMEs at will, change things so that the request is stopped before the OOM condition happens ... or else your server is vulnerable to DoSing.Jonasjonathan
@Corry What if I want to catch it just to send a metric to the monitoring system and then rethrow it?Hectic
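
To make the "stop the request before the OOM condition happens" point from the discussion above concrete, here is a hedged sketch (the class, method, and limit are invented for the example) that inspects an image's header to estimate the decoded size before committing to decode it, using the standard javax.imageio API.

import javax.imageio.ImageIO;
import javax.imageio.ImageReader;
import javax.imageio.stream.ImageInputStream;
import java.io.File;
import java.io.IOException;
import java.util.Iterator;

// Sketch only: reject images whose decoded size would blow the memory budget,
// without decoding the pixel data first.
final class ImageSizeGuard {
    static void checkDecodedSize(File file, long maxPixels) throws IOException {
        try (ImageInputStream in = ImageIO.createImageInputStream(file)) {
            if (in == null) {
                throw new IOException("Cannot open image stream for " + file);
            }
            Iterator<ImageReader> readers = ImageIO.getImageReaders(in);
            if (!readers.hasNext()) {
                throw new IOException("Unrecognised image format: " + file);
            }
            ImageReader reader = readers.next();
            try {
                reader.setInput(in);
                // Width and height come from the header, not from decoding the pixels.
                long pixels = (long) reader.getWidth(0) * reader.getHeight(0);
                if (pixels > maxPixels) {
                    throw new IOException("Image too large to decode: " + pixels + " pixels");
                }
            } finally {
                reader.dispose();
            }
        }
    }
}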

It's possible:

try {
   // tragic logic created OOME, but we can blame it on lack of memory
} catch(OutOfMemoryError e) {
   // but what will you do here :)
} finally {
   // get ready to be fired by your boss
}
Diligence answered 7/11, 2009 at 6:49 Comment(3)
There is at least one reasonable thing which you could be doing which causes an OOME and is recoverable: Loading a very large image. The only thing in the try block is a call to ImageIO.read(), and you're showing the user a dialog telling them that the image is too large in the catch block. Just saying...Cicely
@Cicely That's the wrong approach, unless your app is single threaded. The thing is that even though your image-loading thread catches the OOME, the other threads don't know about it and can get an OOME too.Diligence
@SurajChandran: The OOME caused by attempting to load a very large image will be due to attempting to allocate a very large byte[] or int[]. You get an OOME from that because the allocation fails, not because you are actually out of memory---hence it won't cause a problem on other threads.Cicely
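
A hedged sketch of that image-loading case (the helper name and the dialog wording are invented for the example): the only allocation-heavy call inside the try block is ImageIO.read(), so a failure there is very likely the oversized image rather than some unrelated memory problem.

import javax.imageio.ImageIO;
import javax.swing.JOptionPane;
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;

final class ImageLoader {
    // Sketch only: return the image, or null after warning the user that it is too large.
    static BufferedImage loadOrWarn(File file) throws IOException {
        try {
            return ImageIO.read(file);
        } catch (OutOfMemoryError e) {
            JOptionPane.showMessageDialog(null,
                "The image " + file.getName() + " is too large to open.");
            return null;
        }
    }
}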

You can catch and attempt to recover from an OutOfMemoryError (OOME), BUT IT IS PROBABLY A BAD IDEA ... especially if your aim is for the application to "keep going".

There are a number of reasons for this:

  1. As others have pointed out, there are better ways to manage memory resources than explicitly freeing things; e.g. using SoftReference and WeakReference for objects that could be freed if memory is short.

  2. If you wait until you actually run out of memory before freeing things, your application is likely to spend more time running the garbage collector. Depending on your JVM version and your GC tuning parameters, the JVM can end up running the GC more and more frequently as it approaches the point at which it will throw an OOME. The slowdown (in terms of the application doing useful work) can be significant. You probably want to avoid this.

  3. If the root cause of your problem is a memory leak, then the chances are that catching and recovering from the OOME will not reclaim the leaked memory. Your application will keep going for a bit, then OOME again, and again, and again at ever-decreasing intervals.

So my advice is NOT to attempt to keep going after an OOME ... unless you know:

  • where and why the OOM happened,
  • that there won't have been any "collateral damage", and
  • that your recovery will release enough memory to continue.
Jonasjonathan answered 7/11, 2009 at 8:26 Comment(2)
What if I want to catch it just to send a metric to the monitoring system and then rethrow it?Hectic
That may be OK. It depends. The concern is that the infrastructure (in the JVM) for talking to the monitoring system might itself have been damaged by the out-of-memory state. So even sending the metric could lock things up.Jonasjonathan

Just throwing this out there for all those who wonder why someone might be running out of memory: I'm working on a project that runs out of memory frequently, and I have had to implement a solution for this.

The project is a component of a forensics and investigation app. After collecting data in the field (using a very low memory footprint, btw), the data is opened in our investigation app. One of the features is to perform a CFG traversal of any arbitrary binary image that was captured in the field (applications from physical memory). These traversals can take a long time, but they produce very helpful visual representations of the binary that was traversed.

To speed up the traversal process, we try to keep as much data in physical memory as possible, but the data structures grow as the binary grows and we cannot keep it ALL in memory (the goal is to use a Java heap smaller than 256m). So what do I do?

I created disk-backed versions of LinkedLists, Hashtables, etc. These are drop-in replacements for their counterparts and implement all the same interfaces, so they look identical to the outside world.

The difference? These replacement structures cooperate with each other, catching out-of-memory errors and requesting that the least recently used elements from the least recently used collection be freed from memory. Freeing an element dumps it to disk in a temporary file (in the system-provided temp directory) and marks a placeholder object as "paged out" in the proper collection.
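
The author's actual code isn't shown, so here is only a very rough, hedged sketch of the paging idea described above (class and method names invented): on OutOfMemoryError the least recently used in-memory value is serialised to a temp file and replaced by a File placeholder, and a later get() pages it back in.

import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.LinkedHashMap;
import java.util.Map;

// Rough sketch only: a map that pages its least recently used values out to
// temp files when an insertion fails with OutOfMemoryError.
class PagingMap<K, V extends Serializable> {
    // Access-order LinkedHashMap: iteration starts at the least recently used entry.
    private final Map<K, Object> entries = new LinkedHashMap<>(16, 0.75f, true);

    public void put(K key, V value) {
        while (true) {
            try {
                entries.put(key, value);
                return;
            } catch (OutOfMemoryError e) {
                if (!evictOldest()) {
                    throw e;                       // nothing left to page out
                }
            }
        }
    }

    @SuppressWarnings("unchecked")
    public V get(K key) throws IOException, ClassNotFoundException {
        Object stored = entries.get(key);
        if (stored instanceof File) {              // placeholder: value was paged out
            try (ObjectInputStream in =
                     new ObjectInputStream(new FileInputStream((File) stored))) {
                V value = (V) in.readObject();
                entries.put(key, value);           // page it back in
                return value;
            }
        }
        return (V) stored;
    }

    // Serialise the least recently used in-memory value to a temp file and
    // replace it with a File placeholder. Returns false if nothing could be evicted.
    private boolean evictOldest() {
        for (Map.Entry<K, Object> entry : entries.entrySet()) {
            if (entry.getValue() instanceof File) {
                continue;                          // already on disk
            }
            try {
                File page = File.createTempFile("page", ".bin");
                page.deleteOnExit();
                try (ObjectOutputStream out =
                         new ObjectOutputStream(new FileOutputStream(page))) {
                    out.writeObject(entry.getValue());
                }
                entry.setValue(page);              // non-structural change: safe during iteration
                return true;
            } catch (IOException ex) {
                return false;
            }
        }
        return false;
    }
}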

There are PLENTY of reasons you might run out of memory in a Java app; the root of most of them is one or both of the following: 1. the app runs on a resource-constrained machine (or attempts to limit resource usage by limiting heap size); 2. the app simply requires large amounts of memory (image editing was suggested, but how about audio and video? What about compilers, as in my case? How about long-term data collectors without non-volatile storage?)

-bit

Anaglyph answered 7/11, 2009 at 6:37 Comment(1)
This sort of logic sounds like a perfect use case for the 'Reference' classes like WeakReference https://mcmap.net/q/130526/-is-it-possible-to-catch-out-of-memory-exception-in-java-duplicateCupule

It is possible to catch an OutOfMemoryError (it's an Error, not an Exception), but you should be aware that there is no guarantee of defined behaviour.
You may even get another OutOfMemoryError while trying to handle the first one.

So the better way is to create/use memory-aware caches. There are some frameworks out there (for example, JCS), but you can easily build your own using SoftReference. There is a small article about how to use it here. Follow the links in the article for more information.
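
A minimal sketch of such a SoftReference-based cache (the class and parameter names are invented for the example): the GC is free to clear the softly reachable values when memory runs short, and the cache simply recomputes them on demand.

import java.lang.ref.SoftReference;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// Sketch only: values are held via SoftReference, so the GC may discard them
// under memory pressure; missing or cleared entries are recomputed.
class SoftCache<K, V> {
    private final Map<K, SoftReference<V>> map = new ConcurrentHashMap<>();
    private final Function<K, V> loader;

    SoftCache(Function<K, V> loader) {
        this.loader = loader;
    }

    V get(K key) {
        SoftReference<V> ref = map.get(key);
        V value = (ref != null) ? ref.get() : null;
        if (value == null) {                      // never cached, or cleared by the GC
            value = loader.apply(key);
            map.put(key, new SoftReference<>(value));
        }
        return value;
    }
}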

Heterophyllous answered 17/6, 2010 at 15:22 Comment(1)
"there is no way to get a defined behaviour": because OutOfMemoryError can be thrown anywhere, including in places that could leave your program in an inconsistent state. See #8729366Kancler

It is possible, but if you have run out of heap it's not very useful. If there are resources that can be freed, you are better off using SoftReference or WeakReference to such resources, and their clean-up will be automatic.

I have found it useful when running out of direct memory, because for some reason that doesn't trigger a GC automatically. So I have had cause to force a GC when I fail to allocate a direct buffer.
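
A hedged sketch of that direct-buffer case (the class and method are invented for the example): if allocateDirect() throws OutOfMemoryError, suggest a GC so unreachable direct buffers can be reclaimed by their cleaners, then retry once.

import java.nio.ByteBuffer;

final class DirectBuffers {
    // Sketch only: retry a failed direct allocation once after suggesting a GC.
    static ByteBuffer allocateDirectWithRetry(int capacity) {
        try {
            return ByteBuffer.allocateDirect(capacity);
        } catch (OutOfMemoryError first) {
            System.gc();                  // a hint; may let cleaners free old direct buffers
            try {
                Thread.sleep(100);        // give reference processing a moment to run
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
            return ByteBuffer.allocateDirect(capacity);  // a second failure propagates
        }
    }
}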

Scarf answered 7/11, 2009 at 7:57 Comment(1)
This answer addresses the primary use case that would drive wanting to do this. The answer should be much higher up.Dhow

There is probably at least one good time to catch an OutOfMemoryError: when you are specifically allocating something that might be way too big:

public static int[] decode(InputStream in, int len) throws IOException {
  int[] result;
  try {
    result = new int[len];
  } catch (OutOfMemoryError e) {
    throw new IOException("Result too long to read into memory: " + len);
  } catch (NegativeArraySizeException e) {
    throw new IOException("Cannot read negative length: " + len);
  }
  ...
}
Eyespot answered 17/6, 2010 at 15:28 Comment(2)
Does this code work reliably? I have exactly this case, but I really don't want the app to crash. I guess the alternative is if(len>someConservativeSize) throw new IOException("too long");Auberta
This approach works well in C++, but I suspect not so well in Java, because the connection between new and OutOfMemoryError is indirect. If the allocation only just fits into memory, there could be a mysterious OutOfMemoryError some time later.Kancler

It is possible to catch any Throwable. Just write:

try {
    // code which you think might throw an exception
} catch (java.lang.Throwable t) {
    // you got the exception. Now what??
}

Ideally you are not supposed to catch java.lang.Error and its subclasses. Not catching such errors and letting the application terminate might be the best approach when they occur. If you think you can handle such Errors well, then go ahead.

Bathymetry answered 7/11, 2009 at 6:57 Comment(0)

Sure, catching OutOfMemoryError is allowed. Make sure you have a plan for what to do when it happens. You will need to free up some memory (by dropping references to objects) before allocating any more objects, or you will just run out of memory again. Sometimes the mere act of unwinding the stack a few frames will do that for you; sometimes you need to do something more explicit.

Sino answered 7/11, 2009 at 6:45 Comment(1)
Your plan will have to include a strategy for dealing with the possibility that your program is in an inconsistent state; see #8729366Kancler
