Stdout of Node.js child_process exec is cut short

In Node.js I'm using the exec command of the child_process module to call a Java algorithm that writes a large amount of text to standard out, which I then parse and use. I'm able to capture most of it, but once it exceeds a certain number of lines, the content is cut off.

exec("sh target/bin/solver " + fields.dimx + " " + fields.dimy, function (error, stdout, stderr) {
    // do stuff with stdout
});

I've tried using setTimeout and callbacks without success, though I suspect this is occurring because I'm referencing stdout in my code before it has been retrieved completely. I have tested that stdout is in fact where the data loss first occurs; it's not an asynchronous issue further down the line. I've also tested this on my local machine and on Heroku, and the exact same issue occurs, truncating at exactly the same line number every time.

Any ideas or suggestions as to what might help with this?

Corporeity answered 17/1, 2014 at 14:54 Comment(1)
See also: Stdout buffer issue using node child_process – Doall

Edited: I have tried this with dir /s on my computer (Windows) and got the same problem (it looks like a bug). This code solved the problem for me:

var exec = require('child_process').exec;

function my_exec(command, callback) {
    var proc = exec(command);

    var list = [];
    proc.stdout.setEncoding('utf8');

    proc.stdout.on('data', function (chunk) {
        list.push(chunk);
    });

    proc.stdout.on('end', function () {
        callback(list.join('')); // join with '' — the default separator is ','
    });
}

my_exec('dir /s', function (stdout) {
    console.log(stdout);
});
Ornithine answered 17/1, 2014 at 15:1 Comment(2)
Why do you think it looks like a bug? Please have a look at my answer – Metronymic
@Metronymic You can see the history of this answer; I increased the buffer size, but it did not work for me. – Ornithine

I had exec.stdout.on('end') callbacks hang forever with @damphat's solution.

Another solution is to increase the buffer size via the options of exec (see the child_process documentation):

{ encoding: 'utf8',
  timeout: 0,
  maxBuffer: 200*1024, //increase here
  killSignal: 'SIGTERM',
  cwd: null,
  env: null }

To quote the documentation: maxBuffer specifies the largest amount of data allowed on stdout or stderr; if this value is exceeded, the child process is killed. I now use the following; unlike the accepted solution, it does not require manually collecting and joining the chunks from stdout.

exec('dir /b /O-D ^2014*', {
    maxBuffer: 2000 * 1024 // quick fix: 2 MB instead of the 200 KB default
}, function (error, stdout, stderr) {
    var list_of_filenames = stdout.split('\r\n'); // adapt to your line-ending character
    console.log("Found %s files in the replay folder", list_of_filenames.length);
});
Metronymic answered 15/4, 2014 at 12:14 Comment(0)

The real (and best) solution to this problem is to use spawn instead of exec. As stated in this article, spawn is better suited to handling large volumes of data:

child_process.exec returns the whole buffer output from the child process. By default the buffer size is set at 200k. If the child process returns anything more than that, your program will crash with the error message "Error: maxBuffer exceeded". You can fix that problem by setting a bigger buffer size in the exec options. But you should not do it, because exec is not meant for processes that return HUGE buffers to Node. You should use spawn for that. So what do you use exec for? Use it to run programs that return result statuses, instead of data.

spawn requires a different syntax than exec:

var proc = spawn('sh', ['target/bin/solver', fields.dimx, fields.dimy]);

proc.on("exit", function(exitCode) {
    console.log('process exited with code ' + exitCode);
});

proc.stdout.on("data", function(chunk) {
    console.log('received chunk ' + chunk);
});

proc.stdout.on("end", function() {
    console.log("finished collecting data chunks from stdout");
});
Troika answered 1/2, 2015 at 7:16 Comment(1)
Thanks for this answer! It helped me understand that my exec was returning an error due to 1 MB of output. Unfortunately, the error object of exec only contained the command name, without detailing the reason for the error: {"cmd":"./read_mail.sh"} – Henriettehenriha
