Process Builder waitFor() issue and Open file limitations
I have inherited some code:

Process p = new ProcessBuilder("/bin/chmod", "777", path).start();
p.waitFor();

Basically, for some ancient and highly voodoo-based reason, key/value pairs are being stored on disk as files. I don't really want to go into it.

However, I am left with a bunch of IO exceptions:

Exception :Cannot run program "/bin/chmod": java.io.IOException: error=24, Too many open files
Message: Cannot run program "/bin/chmod": java.io.IOException: error=24, Too many open files

And by a bunch I mean somewhere in the realm of 10k to millions.

I get the feeling the waitFor call was meant to stop these from occurring by waiting for the process to complete and exit, but I think chmod is returning a result before the file is actually closed. Does anyone know if that could be the cause of these exceptions?

My other inclination is that the opening and closing of thousands of files is not happening quickly enough on the Java end and that something else is going on: maybe some form of file buffer isn't getting cleared out when fw.close() is being called.

I am pretty new to Java, and this is a hellishly weird one that has me stumped. (Gladly the app still runs somehow... after spitting out a very large log file, that is.)

Can anyone think of a way to get around this, either by clearing buffers or by increasing the open-files limit to something where the JVM can keep up with itself (assuming that is the problem)?

Faultless answered 15/7, 2009 at 4:37 Comment(2)
What is your target OS (and version)? See this: unix.derkeiler.com/Newsgroups/comp.unix.solaris/2007-02/… – Idiom
debian, it seems to be cleared out of uname. will be latest stable. – Faultless

I presume you are running these chmod commands in a loop - otherwise I don't see why you'd get so many exceptions. It's possible that you're hitting a deadlock because you're not reading the output of the spawned processes. That certainly used to bite me back in the pre-ProcessBuilder, Runtime.exec() days.

Change your code snippet to the following pattern:

try {
    ProcessBuilder pb = new ProcessBuilder("/bin/chmod", "777", path);    
    pb.redirectErrorStream(true); // merge stdout, stderr of process

    Process p = pb.start();
    InputStreamReader isr = new InputStreamReader(p.getInputStream());
    BufferedReader br = new BufferedReader(isr);

    String lineRead;
    while ((lineRead = br.readLine()) != null) {
        // swallow the line, or print it out - System.out.println(lineRead);
    }

    int rc = p.waitFor();
    // TODO error handling for non-zero rc
}
catch (IOException e) {
    e.printStackTrace(); // or log it, or otherwise handle it
}
catch (InterruptedException ie) {
    ie.printStackTrace(); // or log it, or otherwise handle it
} 

(credit: this site) and see if that helps the situation.

Hangup answered 15/7, 2009 at 11:7 Comment(3)
Tried this, same exceptions are occurring. – Faultless
I think I've solved it; check my answer in a few mins, just waiting on tests to verify. – Faultless
Ok, picked yours as the answer as it needed this in the solution; check my post, it will include the extra lines needed. – Faultless

Thanks for the help guys, this should sort out a load of weirdness going on elsewhere because of it.

Using your (Vinay's) example plus the stream closings:

Process p = null; // declared outside the try so the finally block can see it
try {
  fw.close(); // fw is the FileWriter (from snipped code) that wrote the key/value file

  ProcessBuilder pb = new ProcessBuilder("/bin/chmod", "777", path);

  pb.redirectErrorStream(true); // merge stdout, stderr of process
  p = pb.start();

  InputStreamReader isr = new InputStreamReader(p.getInputStream());
  BufferedReader br = new BufferedReader(isr);

  String lineRead;
  while ((lineRead = br.readLine()) != null) {
    // swallow the line, or print it out - System.out.println(lineRead);
  }

} catch (Exception ioe) {
  Logger.logException(Logger.WARN, ioe.getMessage(), ioe);
} finally {
  try {
    if (p != null) {
      p.waitFor(); // here as there is some snipped code that was causing a different
                   // exception which stopped it from getting processed

      // missing these was causing the mass amounts of open 'files'
      p.getInputStream().close();
      p.getOutputStream().close();
      p.getErrorStream().close();
    }

  } catch (Exception ioe) {
    Logger.logException(Logger.WARN, ioe.getMessage(), ioe);
  }
}

Got the idea from John B Mathews post.
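
One robustness note on the finally block: if one close() throws, the closes after it are skipped. A tiny helper along these lines (a sketch, not part of the original code; the name closeQuietly is my own) keeps each close independent:

```java
import java.io.Closeable;
import java.io.IOException;

public class CloseQuietlyDemo {
    // Hypothetical helper: each close gets its own try, so one failing
    // stream doesn't prevent the others from being closed.
    static void closeQuietly(Closeable c) {
        if (c == null) return;
        try {
            c.close();
        } catch (IOException ignored) {
            // swallow; we only care that close was attempted
        }
    }

    public static void main(String[] args) {
        Closeable failing = new Closeable() {
            public void close() throws IOException {
                throw new IOException("boom");
            }
        };
        closeQuietly(failing); // does not propagate the exception
        closeQuietly(null);    // safe on null
        System.out.println("all closes attempted");
    }
}
```

You would call closeQuietly on each of the three process streams in turn, so a failure on the input stream can't leave the output or error stream dangling.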

Faultless answered 15/7, 2009 at 23:39 Comment(4)
note: it still doesn't make sense why having the waitFor and closing the input streams wouldn't do just fine, but I guess it is java... – Faultless
Good catch, jim, but I still see a problem in your finally. I think you need to have each of the close calls in its own try/catch, otherwise if an exception occurs when doing p.getInputStream().close(), you'll fail to close the others. The problem may appear to have gone away now, but could come back later. – Hangup
Do you really need to close both the input and error streams even though they're merged with redirectErrorStream? And do you really need to close the output stream even though you never used it? – Unrest
You would need to test both and find out. This post is really old; I don't think the code is even live anymore. – Faultless

It seems unlikely that the process would actually complete without closing the files. Could this be happening in a very large # of threads? Or perhaps some of them are not actually completing (ie, it is hanging at waitFor in some cases)?

Otherwise, I think you will be stuck with increasing the open-files limit. Assuming this is a Unix-like system, the "ulimit" command (specifically ulimit -n) is probably what you are looking for.
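
For example, on a Debian-ish shell (the asker's platform), checking and raising the soft limit might look like this; 8192 is an arbitrary example value, and the hard limit caps what an unprivileged user may set:

```shell
# Show the current soft limit on open file descriptors for this shell
ulimit -n

# Show the hard limit (the ceiling an unprivileged user may raise to)
ulimit -Hn

# Raise the soft limit for this shell session and its children;
# fails harmlessly if the hard limit is lower than the requested value
ulimit -n 8192 2>/dev/null || echo "raise refused (hard limit too low)"
```

Note that this only affects the current shell and its children, so the JVM must be launched from a shell (or init script) where the limit was raised.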

Iata answered 15/7, 2009 at 5:11 Comment(0)

If you're using Java 6, you could also try the new permission setters on the File object (setReadable, setWritable, setExecutable). Might be slower, but it should work.
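
A minimal sketch of what that might look like (the method name makeWorldAccessible is my own; the setters themselves are the real Java 6 java.io.File API). Since no subprocess is spawned, no extra pipes or file descriptors are consumed per call:

```java
import java.io.File;
import java.io.IOException;

public class ChmodInJava {
    // Hypothetical replacement for the /bin/chmod 777 subprocess.
    // The second argument false means "apply to everyone, not just the
    // owner", which roughly approximates chmod 777.
    static boolean makeWorldAccessible(File f) {
        boolean ok = f.setReadable(true, false);
        ok &= f.setWritable(true, false);
        ok &= f.setExecutable(true, false);
        return ok;
    }

    public static void main(String[] args) throws IOException {
        File f = File.createTempFile("kvpair", ".tmp");
        f.deleteOnExit();
        System.out.println(makeWorldAccessible(f) ? "permissions set" : "failed");
    }
}
```

Note the setters can't express every mode chmod can (e.g. group-only permissions), but for a blanket 777 they cover the same ground.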

Idiom answered 15/7, 2009 at 6:18 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.