I seem to be encountering a strange issue in Objective-C converting a float to an NSNumber (wrapping it for convenience) and then converting it back to a float.
In a nutshell, a class of mine has a property red, which is a float from 0.0 to 1.0:
@property (nonatomic, assign) float red;
This object is comparing itself to a value that is loaded from disk, for synchronization purposes. (The file can change outside the application, so it checks periodically for file changes, loads the alternate version into memory, and does a comparison, merging differences.)
Here's an interesting snippet where the two values are compared:
if (localObject.red != remoteObject.red) {
    NSLog(@"Local red: %f Remote red: %f", localObject.red, remoteObject.red);
}
Here's what I see in the logs:
2011-10-28 21:07:02.356 MyApp[12826:aa63] Local red: 0.205837 Remote red: 0.205837
Weird, right? The two values print identically, so how is the body of that if statement being executed at all?
The actual value as stored in the file:
...red="0.205837"...
is converted to a float using:
currentObject.red = [[attributeDict valueForKey:@"red"] floatValue];
At another point in the code I was able to snag a screenshot from GDB. The value was printed to NSLog as follows (this is also the precision with which it appears in the file on disk):

2011-10-28 21:21:19.894 MyApp[13214:1c03] Local red: 0.707199 Remote red: 0.707199

But it appears in the debugger with far more precision:

(debugger screenshot not reproduced here)
How is this level of precision present at the property level, yet neither stored in the file nor printed by NSLog? And why does it seem to vary?
Comments:

Try %+0.16f instead of %f (or whatever precision you want instead of .16). If not, disregard this comment. – Signorina

If the value were 3.14159 exactly, the string would be @"3.14159" (using "%0.16f"), but if you had "%0.4f", then it would round to @"3.1416". – Signorina