This might be a very simple question for some, but I would like to know the meaning of "arbitrary precision," which appears in the first line of the Javadoc for BigInteger:
Immutable arbitrary-precision integers.
The term fixed precision means that there are only a certain number of significant digits retained in the internal representation. This means that you would not be able to represent every integer with a magnitude greater than some threshold.
With arbitrary precision integers, the integers can be as large as you need ("arbitrarily large") and the library will keep all the digits down to the least significant unit. (This is obviously limited by the amount of memory in your computer.)
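The contrast between the two can be sketched in a few lines of Java (a minimal demo, not taken from the question):

```java
import java.math.BigInteger;

public class PrecisionDemo {
    public static void main(String[] args) {
        // Fixed precision: int silently wraps around past its threshold.
        int fixed = Integer.MAX_VALUE;           // 2147483647
        System.out.println(fixed + 1);           // prints -2147483648 (overflow)

        // Arbitrary precision: BigInteger keeps every digit.
        BigInteger big = BigInteger.valueOf(Integer.MAX_VALUE);
        System.out.println(big.add(BigInteger.ONE)); // prints 2147483648
    }
}
```

The int addition wraps to the most negative value, while the BigInteger result retains the full, exact magnitude.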
It means that BigInteger uses as much space as is needed to store the whole value.

Take int as an example. It has a fixed number of bits available, with which you can store values between -2,147,483,648 and 2,147,483,647 (inclusive). So it is a fixed-precision type, not an arbitrary-precision type: it cannot store values outside of that range.

With BigInteger, you don't have that problem, because once the currently allocated bits are no longer enough to store the exact value, BigInteger simply uses more bits so it can represent the value again.

"Arbitrary" is not literally true, because there are limits imposed by the finite amount of memory available. That limit is not set by the BigInteger class itself but by the environment (VM/hardware/OS).
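To illustrate this answer's point about BigInteger simply taking more bits as needed, here is a small sketch (illustrative only) that computes a value far beyond what any primitive type can hold and inspects how many bits it occupies:

```java
import java.math.BigInteger;

public class GrowingDemo {
    public static void main(String[] args) {
        // long tops out at 2^63 - 1; BigInteger just allocates more bits.
        BigInteger huge = BigInteger.valueOf(2).pow(200); // 2^200

        // bitLength() reports the minimal two's-complement size minus the sign bit.
        System.out.println(huge.bitLength()); // prints 201
        System.out.println(huge);             // the full 61-digit decimal value
    }
}
```

No overflow occurs at any point; the internal representation simply grows to fit the result, bounded only by available memory.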