We found an interesting case in our business logic that completely breaks our calculations, and we don't understand why NSDecimalNumber and Decimal behave the way they do.
My playground for these cases is as follows:
import Foundation
let pQuantity = Decimal(string: "0.2857142857142857")!
let pPrice = Decimal(string: "7.00000000000000035")!
let calced = NSDecimalNumber(decimal: pQuantity * pPrice * Decimal(integerLiteral: 100)) // 200
let decimal = calced.decimalValue // 199.9999999999999999999999999999995
let integer = calced.intValue // 0
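// Minimal repro: intValue returns wildly different results depending on
// how many fractional digits the decimal carries.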
NSDecimalNumber(decimal: Decimal(string: "199.9999999999999999999999999999995")!).intValue // 0
NSDecimalNumber(decimal: Decimal(string: "199.9999999999999995")!).intValue // 199
NSDecimalNumber(decimal: Decimal(string: "199.99999999999999995")!).intValue // 15
NSDecimalNumber(decimal: Decimal(string: "199.999999999999999995")!).intValue // -2
In the playground code above, the return values are shown in the trailing comments, in case you don't want to run it yourself.
We need to temporarily convert our raw decimal values, quantity and price, to integers while calculating how to evenly split these quantities into nice-looking prices. In this case we can't, because for some reason the initial conversion step fails, producing 0 instead of 200 (and yes, the current code would then produce 199 rather than 200, which is a bug in itself).
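For context, the intended flow looks roughly like this. This is a hypothetical sketch rather than our actual code; the three-way split and the cent scaling are chosen purely for illustration:

import Foundation

let pQuantity = Decimal(string: "0.2857142857142857")!
let pPrice = Decimal(string: "7.00000000000000035")!

// Scale the total to whole cents so the split can be done in integer math.
let total = pQuantity * pPrice * Decimal(100)
let cents = NSDecimalNumber(decimal: total).intValue // expected 200, actually 0

// Split evenly as integers; the leftover cents go to the first `remainder` parts.
let parts = 3
let base = cents / parts
let remainder = cents % parts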
Why does NSDecimalNumber's intValue return these weird values, ranging from -2 to 199, depending on the number of decimal digits?
Our workaround would be to round the inner calculation before wrapping it in NSDecimalNumber, but we'd like to know the cause to begin with. Is this a bug, or is it expected behavior that one simply has to be aware of?
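For reference, the rounding workaround we have in mind looks roughly like this. It is a minimal sketch using Foundation's NSDecimalRound; the scale of 0 and the .plain rounding mode are our assumptions about the right policy for our prices:

import Foundation

let pQuantity = Decimal(string: "0.2857142857142857")!
let pPrice = Decimal(string: "7.00000000000000035")!

// Round the intermediate product to scale 0 (whole units) before handing it
// to NSDecimalNumber, so the conversion never sees more significant digits
// than intValue can cope with.
var product = pQuantity * pPrice * Decimal(100)
var rounded = Decimal()
NSDecimalRound(&rounded, &product, 0, .plain)

NSDecimalNumber(decimal: rounded).intValue // 200

With this, intValue behaves as expected, but it feels like papering over the underlying conversion issue, hence the question.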