It is explained in the JavaScript documentation:
According to the ECMAScript standard, there is only one number type: the double-precision 64-bit binary format IEEE 754 value (numbers between -(2^53 - 1) and 2^53 - 1). There is no specific type for integers.
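This is easy to see in a console: integer-looking and fractional values share the same single type. A quick sketch (plain JavaScript, runnable in any browser console or Node):

    // Both literals are IEEE 754 double-precision values; there is no integer type.
    console.log(typeof 42);    // "number"
    console.log(typeof 42.5);  // "number"
    console.log(42 === 42.0);  // true - the very same double value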
The Wikipedia page about the double-precision floating-point format explains:
Between 2^52 = 4,503,599,627,370,496 and 2^53 = 9,007,199,254,740,992 the representable numbers are exactly the integers. For the next range, from 2^53 to 2^54, everything is multiplied by 2, so the representable numbers are the even ones, etc. (All integer numbers smaller than 2^52 are represented exactly.)
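The doubling of the gap is easy to observe: just above 2^53 the odd integers no longer exist, so adding 1 is silently lost. A quick sketch:

    console.log(2 ** 53);                  // 9007199254740992
    console.log(2 ** 53 + 1);              // 9007199254740992 - rounded back down
    console.log(2 ** 53 === 2 ** 53 + 1);  // true
    console.log(2 ** 53 + 2);              // 9007199254740994 - even numbers still work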
1234567890123456789 and 1234567890123456799 are larger than 2^60 = 1,152,921,504,606,846,976. At this magnitude the representable doubles are 256 apart, so only one integer in every 256 (about 0.4%) is stored exactly using the double-precision floating-point format. These two cannot be stored exactly. They are both rounded to 1234567890123456800.
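This can be confirmed directly in a JavaScript console; both literals collapse to the same double:

    console.log(1234567890123456789 === 1234567890123456799);  // true
    console.log(1234567890123456789);  // 1234567890123456800
    console.log(1234567890123456799);  // 1234567890123456800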
The JavaScript documentation also explains how to tell if an integer is stored exactly:
[...] and starting with ECMAScript 6, you are also able to check if a number is in the double-precision floating-point number range using Number.isSafeInteger() as well as Number.MAX_SAFE_INTEGER and Number.MIN_SAFE_INTEGER. Beyond this range, integers in JavaScript are not safe anymore and will be a double-precision floating point approximation of the value.
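Applied to the numbers above:

    console.log(Number.MAX_SAFE_INTEGER);                    // 9007199254740991, i.e. 2^53 - 1
    console.log(Number.isSafeInteger(9007199254740991));     // true
    console.log(Number.isSafeInteger(1234567890123456789));  // false - beyond the safe range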