When I run my code, Node.js throws a "RangeError: Maximum call stack size exceeded" exception caused by too many recursive calls. I tried to increase the Node.js stack size with sudo node --stack-size=16000 app, but Node.js crashes without any error message. When I run this again without sudo, Node.js prints 'Segmentation fault: 11'. Is there a way to solve this without removing my recursive calls?
You should wrap your recursive function call in a setTimeout, setImmediate or process.nextTick function to give Node.js the chance to clear the stack. If you don't do that, and there are many loops without any real async function call, or if you do not wait for the callback, your RangeError: Maximum call stack size exceeded will be inevitable. (A setImmediate variant is sketched after the examples below.)
There are many articles concerning "Potential Async Loop". Here is one.
Now some more example code:
// ANTI-PATTERN
// THIS WILL CRASH
var condition = false, // potential means "maybe never"
    max = 1000000;
function potAsyncLoop( i, resume ) {
    if( i < max ) {
        if( condition ) {
            someAsyncFunc( function( err, result ) {
                potAsyncLoop( i+1, resume );
            });
        } else {
            // this will crash after some rounds with
            // "stack exceeded", because control is never given back
            // to the event loop
            // -> no GC and the process appears "dead" ... "VERY BAD"
            potAsyncLoop( i+1, resume );
        }
    } else {
        resume();
    }
}
potAsyncLoop( 0, function() {
    // code after the loop
    // ...
});
This is right:
var condition = false, // potential means "maybe never"
    max = 1000000;
function potAsyncLoop( i, resume ) {
    if( i < max ) {
        if( condition ) {
            someAsyncFunc( function( err, result ) {
                potAsyncLoop( i+1, resume );
            });
        } else {
            // Now the event loop gets control back after every
            // round, so the stack can be cleared.
            // Afterwards the loop continues.
            setTimeout( function() {
                potAsyncLoop( i+1, resume );
            }, 0 );
        }
    } else {
        resume();
    }
}
potAsyncLoop( 0, function() {
    // code after the loop
    // ...
});
Now your loop may become too slow, because we lose a little time (one event-loop roundtrip) per round. But you do not have to call setTimeout in every round. Normally it is OK to do it every 1000th time. But this may differ depending on your stack size:
var condition = false, // potential means "maybe never"
    max = 1000000;
function potAsyncLoop( i, resume ) {
    if( i < max ) {
        if( condition ) {
            someAsyncFunc( function( err, result ) {
                potAsyncLoop( i+1, resume );
            });
        } else {
            if( i % 1000 === 0 ) {
                setTimeout( function() {
                    potAsyncLoop( i+1, resume );
                }, 0 );
            } else {
                potAsyncLoop( i+1, resume );
            }
        }
    } else {
        resume();
    }
}
potAsyncLoop( 0, function() {
    // code after the loop
    // ...
});
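Since the question is about Node.js, here is a minimal self-contained sketch of the same batched pattern using setImmediate, which yields to the event loop without setTimeout's minimum timer delay; the name asyncLoop and the bound of 1000000 are only illustrative:
// Batched variant of the loop above using setImmediate instead of setTimeout.
function asyncLoop( i, max, resume ) {
    if( i >= max ) return resume();
    if( i % 1000 === 0 ) {
        // yield to the event loop every 1000th round so the stack can unwind
        setImmediate( function() {
            asyncLoop( i + 1, max, resume );
        });
    } else {
        asyncLoop( i + 1, max, resume );
    }
}

asyncLoop( 0, 1000000, function() {
    console.log( 'done' ); // code after the loop
});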
I found a dirty solution:
/bin/bash -c "ulimit -s 65500; exec /usr/local/bin/node --stack-size=65500 /path/to/app.js"
It just increases the call stack limit. I think this is not suitable for production code, but I needed it for a script that runs only once.
In some languages this can be solved with tail call optimization, where the recursive call is transformed under the hood into a loop, so no maximum-call-stack-size error can occur.
But in JavaScript the current engines don't support this; it was foreseen for the new version of the language, ECMAScript 6.
Node.js has some flags to enable ES6 features, but tail calls are not yet available.
So you can refactor your code to implement a technique called trampolining, or refactor it to transform the recursion into a loop.
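For illustration, here is a minimal trampolining sketch; the names trampoline and sumBelow are made up for this example and are not part of any library:
// Trampolining: the recursive function returns a thunk (a zero-argument
// function) instead of calling itself, and a small driver loop keeps
// invoking thunks until a plain (non-function) value comes back.
function trampoline(fn) {
    return function(...args) {
        let result = fn(...args);
        while (typeof result === 'function') {
            result = result(); // unwind one "recursive call" per iteration
        }
        return result;
    };
}

const sumBelow = trampoline(function sum(n, acc = 0) {
    return n === 0 ? acc : () => sum(n - 1, acc + n);
});

console.log(sumBelow(1000000)); // finishes without exceeding the call stack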
I had a similar issue to this. I had a problem with using multiple Array.map() calls in a row (around 8 maps at once) and was getting a Maximum call stack size exceeded error. I solved this by changing the maps into 'for' loops.
So if you are using a lot of map calls, changing them to for loops may fix the problem.
Edit
Just for clarity (and probably-not-needed-but-good-to-know info): using .map() causes the array to be prepped (resolving getters, etc.) and the callback to be cached, and it also internally keeps an index of the array (so the callback is provided with the correct index/value). This stacks with each nested call, and caution is advised when not nested as well, as the next .map() could be called before the first array is garbage collected (if at all).
Take this example:
var cb;                  // some callback function
var arr1, arr2, arr3;    // each holds some large data set

arr1.map(v => {
    // do something
});

cb(arr1);

arr2.map(v => {
    // do something -- even though v is overwritten, and the first array
    // has been passed through, it is still in memory
    // because of the cached calls to the callback function
});
If we change this to:
for (const v of arr1) {   // var/let/const and for...in/for...of all work here
    // do something
}

cb(arr1);

for (const v of arr2) {
    // do something -- here there is no callback function to
    // store a reference for, and the array has
    // already been passed off (gone out of scope),
    // so the garbage collector has an opportunity
    // to remove the array if it runs low on memory
}
I hope this makes some sense (I don't have the best way with words) and helps a few people avoid the head-scratching I went through.
If anyone is interested, here is also a performance test comparing map and for loops (not my work).
https://github.com/dg92/Performance-Analysis-JS
For loops are usually better than map, but not reduce, filter, or find
A note first:
In my case the max-call-stack error wasn't caused by my code. It ended up being a different issue that caused congestion in the flow of the application: I was trying to add too many items to MongoDB without any configuration changes, so the call stack issue kept popping up, and it took me a few days to figure out what was going on. That said:
Following up on what @Jeff Lowery answered: I enjoyed this answer so much, and it sped up the process of what I was doing by at least 10x.
I'm new to programming, but I attempted to modularize the answer. Also, I didn't like the error being thrown, so I wrapped it in a do/while loop instead. If anything I did is incorrect, please feel free to correct me.
module.exports = function(object) {
  // max may be a BigInt (hence the 'n' suffix); fn is the work to repeat.
  const { max = 1000000000n, fn } = object;
  let counter = 0;
  let running = true;
  Error.stackTraceLimit = 100;

  // Two functions that call fn and then point `flipper` at each other,
  // so every iteration goes through a function reference instead of
  // a recursive call.
  const A = (fn) => {
    fn();
    flipper = B;
  };
  const B = (fn) => {
    fn();
    flipper = A;
  };
  let flipper = B;

  const then = process.hrtime.bigint();
  do {
    counter++;
    if (counter > max) {
      const now = process.hrtime.bigint();
      const nanos = now - then;
      console.log({ 'runtime(sec)': Number(nanos) / 1000000000.0 });
      running = false;
    }
    flipper(fn);
  } while (running);
};
Check out this gist to see my files and how to call the loop: https://gist.github.com/gngenius02/3c842e5f46d151f730b012037ecd596c
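If you just want a quick idea of how such a module might be called, here is a hypothetical usage sketch; it assumes the code above is saved as ./loop.js (the gist has the actual files):
// Hypothetical usage of the module above, assumed to be saved as ./loop.js.
const loop = require('./loop');

loop({
    max: 1000000n,                           // BigInt iteration count
    fn: () => { /* per-iteration work */ },  // called once per loop round
});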
If you don't want to implement your own wrapper, you can use a queue system, e.g. async.queue or queue.
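As an illustration, here is a minimal sketch assuming the async library from npm (v3-style API); the task shape and the bound of 1000000 are made up:
// Each queued task replaces one level of recursion, so the work is drained
// iteratively by the queue and the call stack never grows.
const async = require('async');

const q = async.queue((task, done) => {
    // ... process task.i here ...
    if (task.i + 1 < 1000000) {
        q.push({ i: task.i + 1 }); // enqueue the "next recursion step"
    }
    done();
}, 1); // concurrency 1 keeps the original sequential order

q.drain(() => {
    console.log('all iterations finished'); // code after the loop
});

q.push({ i: 0 }); // kick off the first step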
Regarding increasing the max stack size, on 32 bit and 64 bit machines V8's memory allocation defaults are, respectively, 700 MB and 1400 MB. In newer versions of V8, memory limits on 64 bit systems are no longer set by V8, theoretically indicating no limit. However, the OS (Operating System) on which Node is running can always limit the amount of memory V8 can take, so the true limit of any given process cannot be generally stated.
However, V8 makes the --max_old_space_size option available, which allows control over the amount of memory available to a process; it accepts a value in MB. Should you need to increase the memory allocation, pass this option with the desired value when spawning a Node process.
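For example, here is a minimal sketch of spawning a child Node process with a larger old-space limit; the 2048 MB value and the app.js path are placeholders:
// 2048 (MB) and 'app.js' are placeholders, not values from the question.
const { spawn } = require('child_process');

const child = spawn(process.execPath, ['--max_old_space_size=2048', 'app.js'], {
    stdio: 'inherit', // forward the child's output to this terminal
});

child.on('exit', (code) => console.log('child exited with code ' + code));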
It is often an excellent strategy to reduce the available memory allocation for a given Node instance, especially when running many instances. As with stack limits, consider whether massive memory needs are better delegated to a dedicated storage layer, such as an in-memory database or similar.
I thought of another approach using function references that limits call stack size without using setTimeout()
(Node.js, v10.16.0):
testLoop.js
let counter = 0;
const max = 1000000000n // 'n' signifies BigInteger
Error.stackTraceLimit = 100;
const A = () => {
fp = B;
}
const B = () => {
fp = A;
}
let fp = B;
const then = process.hrtime.bigint();
for(;;) {
counter++;
if (counter > max) {
const now = process.hrtime.bigint();
const nanos = now - then;
console.log({ "runtime(sec)": Number(nanos) / (1000000000.0) })
throw Error('exit')
}
fp()
continue;
}
output:
$ node testLoop.js
{ 'runtime(sec)': 18.947094799 }
C:\Users\jlowe\Documents\Projects\clearStack\testLoop.js:25
throw Error('exit')
^
Error: exit
at Object.<anonymous> (C:\Users\jlowe\Documents\Projects\clearStack\testLoop.js:25:11)
at Module._compile (internal/modules/cjs/loader.js:776:30)
at Object.Module._extensions..js (internal/modules/cjs/loader.js:787:10)
at Module.load (internal/modules/cjs/loader.js:653:32)
at tryModuleLoad (internal/modules/cjs/loader.js:593:12)
at Function.Module._load (internal/modules/cjs/loader.js:585:3)
at Function.Module.runMain (internal/modules/cjs/loader.js:829:12)
at startup (internal/bootstrap/node.js:283:19)
at bootstrapNodeJSCore (internal/bootstrap/node.js:622:3)
Please check that the function you are importing and the one you have declared in the same file do not have the same name. I will give you an example of this error. In Express (using ES6), consider the following scenario:
import { getAllCall } from '../../services/calls';

let getAllCall = () => {
    return getAllCall().then(res => {
        // do something here
    });
};

module.exports = {
    getAllCall
};
The above scenario will cause the infamous RangeError: Maximum call stack size exceeded error, because the function keeps calling itself until it runs out of call stack.
Most of the time the error is in your code (like the one above). Another way of resolving it is to manually increase the call stack size; this works for certain extreme cases, but it is not recommended.
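For illustration, one way to fix the snippet above is simply to rename the local wrapper so it no longer shadows the import; fetchAllCalls is a made-up name, and the mixed import/module.exports style just mirrors the original snippet:
import { getAllCall } from '../../services/calls';

// Renamed so it no longer shadows the imported getAllCall.
const fetchAllCalls = () => {
    return getAllCall().then(res => {
        // do something here
    });
};

module.exports = {
    fetchAllCalls
};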
Hope my answer helped you.
Even if you increase the maximum stack size when starting Node, the error may still appear. So make sure to stop the existing Node process and start it again with the newly set stack size.
Also make sure that the Node instances are started by the same user that was used when upgrading the Node version.
Try not using a recursive call and use a while loop instead.
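For example, here is a minimal sketch of the same idea; countDown and the bound are made up, not from the question:
// The same work expressed iteratively instead of recursively.
function countDown(n) {
    while (n > 0) {
        // ... per-step work here ...
        n--; // replaces the recursive call countDown(n - 1)
    }
}

countDown(1000000); // no recursion, so the call stack stays flat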
You can use a for loop.
var items = [1, 2, 3];
for (var i = 0; i < items.length; i++) {
    if (i == items.length - 1) {
        res.ok(i);
    }
}
var items = {1, 2, 3} is not valid JS syntax. How is this related to the question at all? – Fortin
[1, 2, 3] – Edgebone
Segmentation fault: 11 usually means a bug in node. – Collarbone