Doing a small check, it looks like neither V8 nor SpiderMonkey unrolls loops, even when the trip count is completely obvious (a literal as the loop condition, the counter declared locally):
const f = () => {
  let counter = 0;
  for (let i = 0; i < 100_000_000; i++) {
    counter++;
  }
  return counter;
};
const g = () => {
  let counter = 0;
  // manually unrolled by a factor of 10: same 100,000,000 increments as f
  for (let i = 0; i < 100_000_000; i += 10) {
    counter++;
    counter++;
    counter++;
    counter++;
    counter++;
    counter++;
    counter++;
    counter++;
    counter++;
    counter++;
  }
  return counter;
};
let start = performance.now();
f();
let mid = performance.now();
g();
let end = performance.now();
console.log(
  `f took ${(mid - start).toFixed(2)}ms, g took ${(end - mid).toFixed(2)}ms, ` +
  `g was ${((mid - start) / (end - mid)).toFixed(2)} times faster.`
);
Is there any reason for this? These engines perform considerably more complex optimizations. Are standard for-loops just so uncommon in JavaScript that unrolling them isn't worth it?
Edit: Just as a note: one might argue that the optimization is merely delayed. That doesn't seem to be the case, although I am not an expert here. I ran node --allow-natives-syntax --trace-deopt, triggered optimization manually, and observed no deoptimization happening (posted as a snippet only so it can be collapsed; it is not actually runnable in a browser):
const { performance } = require('perf_hooks');

const f = () => {
  let counter = 0;
  for (let i = 0; i < 100_000_000; i++) {
    counter++;
  }
  return counter;
};

// collect type feedback, then force optimization
%PrepareFunctionForOptimization(f); // newer V8 versions require this before forcing optimization
f(); f();
%OptimizeFunctionOnNextCall(f);
f();

const start = performance.now();
f();
console.log(performance.now() - start);
I did this with both the normal and the unrolled version; the same effect prevails.
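For reference, the unrolled measurement is the same snippet with g swapped in (sketched here for completeness, mirroring the code above; run as a separate script):

const g = () => {
  let counter = 0;
  // unrolled by 10, same total work as f
  for (let i = 0; i < 100_000_000; i += 10) {
    counter++; counter++; counter++; counter++; counter++;
    counter++; counter++; counter++; counter++; counter++;
  }
  return counter;
};

%PrepareFunctionForOptimization(g);
g(); g();
%OptimizeFunctionOnNextCall(g);
g();

const start = performance.now();
g();
console.log(performance.now() - start);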