Converting a base-36 string to an integer: inconsistencies between languages

I have noticed some inconsistencies between Python and JavaScript when converting a base-36 string to an integer.

Python Method:

>>> print int('abcdefghijr', 36)

Result: 37713647386641447

Javascript Method:

<script>
    document.write(parseInt("abcdefghijr", 36));
</script>

Result: 37713647386641450

What causes the different results between the two languages? What would be the best approach to produce the same result regardless of the language?

Thank you.

Hyacinthie asked 17/10, 2012 at 13:50 Comment(1)
See #5812596; JavaScript doesn't have arbitrary-precision big integers like Python does. (Just typing 37713647386641447 into the JavaScript console will give you the same rounded value.) – Digamma

That number takes 56 bits to represent. JavaScript's numbers are actually double-precision binary floating-point numbers, or doubles for short. These are 64 bits in total and can represent a far wider range of values than 64-bit integers, but because of how they achieve that (they represent a number as mantissa * 2^exponent), they cannot represent every number in that range, only those that are a multiple of 2^exponent where the multiple fits into the mantissa (since 2^0 = 1, that includes every integer the mantissa can hold directly). The mantissa is 53 bits, which is not enough for this number, so it gets rounded to the nearest number that can be represented.
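
You can see the rounding with a quick check in any JavaScript console (the second output value is the one from the question):

    // 2^53 - 1 is the largest integer a double holds exactly
    // (exposed as Number.MAX_SAFE_INTEGER in newer engines).
    console.log(Math.pow(2, 53) - 1);                      // 9007199254740991
    console.log(parseInt("abcdefghijr", 36));              // 37713647386641450 (rounded)
    // The literal itself gets rounded to the same double:
    console.log(37713647386641447 === 37713647386641450);  // true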

What you can do is use an arbitrary-precision number type provided by a third-party library such as gwt-math or Big.js. Such numbers aren't hard to implement if you remember your school arithmetic, as sketched below. Doing it efficiently is another matter, and an area of extensive research, but that's not your problem if you use an existing library.
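
A rough sketch of that school-arithmetic idea (not how gwt-math or Big.js work internally; the name parseBase36Exact is just illustrative): keep the running value as an array of decimal digits and do the multiply-by-36 and carry steps by hand, so no intermediate value ever exceeds what a double stores exactly.

    // Minimal sketch of exact base-36 parsing via "school arithmetic".
    // The running value is an array of decimal digits, least-significant first.
    function parseBase36Exact(str) {
        var digits = [0];
        for (var i = 0; i < str.length; i++) {
            var carry = parseInt(str.charAt(i), 36);   // value of one base-36 digit (0-35)
            for (var j = 0; j < digits.length; j++) {
                var v = digits[j] * 36 + carry;        // multiply-and-add, one decimal digit at a time
                digits[j] = v % 10;
                carry = Math.floor(v / 10);
            }
            while (carry > 0) {                        // push any remaining carry digits
                digits.push(carry % 10);
                carry = Math.floor(carry / 10);
            }
        }
        return digits.reverse().join("");
    }

    // parseBase36Exact("abcdefghijr") returns "37713647386641447",
    // matching Python's exact result digit for digit.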

Radish answered 17/10, 2012 at 13:55 Comment(0)
