So I was trying to understand JavaScript's behavior when dealing with large numbers. Consider the following (tested in Firefox and Chrome):
console.log(9007199254740993) // 9007199254740992
console.log(9007199254740994) // 9007199254740994
console.log(9007199254740995) // 9007199254740996
console.log(9007199254740996) // 9007199254740996
console.log(9007199254740997) // 9007199254740996
console.log(9007199254740998) // 9007199254740998
console.log(9007199254740999) // 9007199254741000
Now, I'm aware of why it's outputting the 'wrong' numbers: it converts each literal to a floating point representation and rounds it to the nearest representable value. But I'm not entirely sure why it picks these particular numbers. My guess is that it's rounding ties to the nearest 'even' number, and since 9007199254740996 is divisible by 4 while 9007199254740994 is not, it considers 9007199254740996 to be the more 'even' of the two.
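A quick sketch of the spacing, using only standard Number features (you can paste this into any modern JS console; nothing here is specific to the snippet above):

// Above 2^53 = 9007199254740992, consecutive doubles are 2 apart,
// so only even integers are exactly representable.
console.log(Number.MAX_SAFE_INTEGER)   // 9007199254740991, i.e. 2^53 - 1
console.log(9007199254740992 + 1)      // 9007199254740992, the +1 lands on a tie and is lost
console.log(9007199254740994 % 4)      // 2, significand is odd
console.log(9007199254740996 % 4)      // 0, significand is even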
- What algorithm is it using to determine the internal representation? My guess is that it's an extension of regular midpoint rounding (round to even is the default rounding mode in IEEE 754 arithmetic); a small check is sketched after this list.
- Is this behavior specified as part of the ECMAScript standard, or is it implementation dependent?
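For what it's worth, here is a small sketch (plain BigInt, nothing beyond standard JavaScript) that converts each odd literal above exactly and shows which neighbouring double the conversion picks; the results match round-to-nearest, ties-to-even:

// Number(bigint) rounds the exact integer to the nearest double,
// the same way parsing a numeric literal does.
for (const n of [9007199254740993n, 9007199254740995n, 9007199254740997n, 9007199254740999n]) {
  console.log(`${n} -> ${BigInt(Number(n))}`)
}
// 9007199254740993 -> 9007199254740992
// 9007199254740995 -> 9007199254740996
// 9007199254740997 -> 9007199254740996
// 9007199254740999 -> 9007199254741000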
console.log(9007199254740993)
involves two conversions: one to convert the decimal literal in the JavaScript code to the internal binary floating-point format, and a second conversion from the internal binary format to a decimal string for console output when executing the log method. Each of those conversions needs quite a sophisticated algorithm (if it's done well). So it's not really true to say that there's no algorithm. – Flaxman
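A minimal sketch of those two conversions, assuming only standard Number and String behaviour (the variable names are just for illustration):

// Conversion 1: decimal text -> binary64 double (same rounding as parsing the literal)
const parsed = Number("9007199254740993")   // nearest representable double: 9007199254740992
// Conversion 2: binary64 double -> decimal string (the shortest digits that round-trip)
const printed = String(parsed)              // "9007199254740992", which is what console.log displays
console.log(parsed === 9007199254740992, printed)   // true 9007199254740992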