I have an NSString which I want to convert into an NSDecimalNumber. The string is received from a server and is always formatted using the en_US locale, like XXX.YYY and never like XXX,YYY. I want to create an NSDecimalNumber that accepts XXX.YYY regardless of the user's locale. The number is never displayed to the user; it's only used for internal math.
Normally you'd do something like this:
NSDecimalNumber *n = [NSDecimalNumber decimalNumberWithString:@"1.234"];
However, if the user is running the fr_FR locale of Mac OS X, that will break. en_US will interpret the string as one point two three four, whereas fr_FR will interpret it as one thousand two hundred thirty-four, two very different numbers.
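To make the failure concrete, here's a quick comparison (just a sketch; it assumes both locale identifiers resolve, which they should on any stock install):

NSLocale *us = [[[NSLocale alloc] initWithLocaleIdentifier:@"en_US"] autorelease];
NSLocale *fr = [[[NSLocale alloc] initWithLocaleIdentifier:@"fr_FR"] autorelease];
// Same string, two locales, two different results:
NSDecimalNumber *a = [NSDecimalNumber decimalNumberWithString:@"1.234" locale:us]; // one point two three four
NSDecimalNumber *b = [NSDecimalNumber decimalNumberWithString:@"1.234" locale:fr]; // not 1.234, since fr_FR does not use "." as its decimal separator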
The obvious solution is to use +decimalNumberWithString:locale:. But I'm not sure what to pass for the locale: parameter that will work on all international versions of Mac OS X. My best guess is to do this:
NSLocale *l = [[[NSLocale alloc] initWithLocaleIdentifier:@"en_US"] autorelease];
NSDecimalNumber *n = [NSDecimalNumber decimalNumberWithString:@"1.234" locale:l];
Is that the best way to do this, and is it safe? Will the @"en_US" locale identifier be available on all international versions of Mac OS X, or might it return a nil NSLocale in some cases?
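In the meantime I'm guarding against that defensively; it's probably belt-and-suspenders, since I've never seen it return nil, but it's cheap:

NSLocale *l = [[[NSLocale alloc] initWithLocaleIdentifier:@"en_US"] autorelease];
NSAssert(l != nil, @"expected the en_US locale to be available");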
UPDATE 1: This appears to work well, and is more explicit:
NSDictionary *l = [NSDictionary dictionaryWithObject:@"." forKey:NSLocaleDecimalSeparator];
NSDecimalNumber *n = [NSDecimalNumber decimalNumberWithString:@"1.234" locale:l];
NSLocaleDecimalSeparator is a key constant defined by NSLocale, which responds to -objectForKey: just like NSDictionary does, which is why a plain dictionary works here as a stand-in locale. You can feed it any decimal separator you want, like @",".
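For example, to parse strings that use a comma as the decimal mark instead:

NSDictionary *l = [NSDictionary dictionaryWithObject:@"," forKey:NSLocaleDecimalSeparator];
NSDecimalNumber *n = [NSDecimalNumber decimalNumberWithString:@"1,234" locale:l]; // one point two three four

And since the dictionary never changes for my server strings, I've wrapped the whole thing in a small helper (the function name is just one I made up):

// Hypothetical convenience wrapper; parses server numbers that always use "." as the decimal mark.
static NSDecimalNumber *DecimalNumberFromServerString(NSString *string)
{
    NSDictionary *l = [NSDictionary dictionaryWithObject:@"." forKey:NSLocaleDecimalSeparator];
    return [NSDecimalNumber decimalNumberWithString:string locale:l];
}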
I think this is likely the best answer to the question, unless anyone else has a better idea?