Objective-C : BOOL vs bool

201

I saw the "new type" BOOL (YES, NO).

I read that this type is almost like a char.

For testing I did:

NSLog(@"Size of BOOL %zu", sizeof(BOOL));
NSLog(@"Size of bool %zu", sizeof(bool));

Good to see that both logs display "1" (in some environments a boolean is backed by an int, and its sizeof is 4).

So I was just wondering: are there any issues with the bool type, or something I'm missing?

Can I just use bool (that seems to work) without losing speed?

Carom answered 12/2, 2009 at 13:46 Comment(0)
209

From the definition in objc.h:

#if (TARGET_OS_IPHONE && __LP64__)  ||  TARGET_OS_WATCH
typedef bool BOOL;
#else
typedef signed char BOOL; 
// BOOL is explicitly signed so @encode(BOOL) == "c" rather than "C" 
// even if -funsigned-char is used.
#endif

#define YES ((BOOL)1)
#define NO  ((BOOL)0)

So, historically you could assume that BOOL is a char; on 64-bit iOS and on watchOS it is now the C99 bool, as the #if above shows. You can use the (C99) bool type, but all of Apple's Objective-C frameworks and most Objective-C/Cocoa code use BOOL, so you'll save yourself a headache if the typedef ever changes by just using BOOL.
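
If you'd rather not rely on an assumption at all, a minimal sketch like the following shows which branch of that #if your target took (it assumes an SDK where objc.h defines OBJC_BOOL_IS_BOOL / OBJC_BOOL_IS_CHAR, as the newer header quoted further down this page does):

#import <objc/objc.h>
#include <stdio.h>

int main(void) {
#if defined(OBJC_BOOL_IS_BOOL) && OBJC_BOOL_IS_BOOL
    printf("BOOL is the C99 bool\n");   // 64-bit iOS, watchOS
#else
    printf("BOOL is a signed char\n");  // macOS, 32-bit iOS (or an older SDK without the macro)
#endif
    printf("sizeof(BOOL) = %zu\n", sizeof(BOOL));
    return 0;
}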

Homy answered 13/2, 2009 at 0:43 Comment(4)
"all of Apple's frameworks" - not true. Take a look at CGGeometry.h, specifically: CG_INLINE bool __CGPointEqualToPoint(CGPoint point1, CGPoint point2) { return point1.x == point2.x && point1.y == point2.y; }Dias
@Dias You are correct. Many of the C frameworks (CoreFoundation, CoreGraphics, etc.) use C99 bool. All of the Objective-C frameworks use BOOL.Homy
@Cœur you have edited the definition of BOOL code sample, but the text below stays the same. It's a little bit confusing and is incorrect. See my answer.Promise
Got to know a different behaviour, take a look below: NSInteger progressTime = 2; //any value NSInteger totalTime = 1; //any value BOOL success = (progressTime >= totalTime); //it always gives NO. But once I received that (progressTime >= totalTime) value into a bool-typed success, it returned the correct result. I don't understand this behaviour. I am using Xcode 7.x and the iOS version was 8.x. @BarryWarkExempt
36

As mentioned above, BOOL is a signed char, while bool is the C99 standard type (_Bool, from stdbool.h), which can only hold 0 or 1.

BOOL - YES/NO. bool - true/false.

See examples:

bool b1 = 2;
if (b1) printf("REAL b1 \n");
if (b1 != true) printf("NOT REAL b1 \n");

BOOL b2 = 2;
if (b2) printf("REAL b2 \n");
if (b2 != YES) printf("NOT REAL b2 \n");

And the result is (on a platform where BOOL is a signed char):

REAL b1
REAL b2
NOT REAL b2

Note that bool != BOOL. The result below is only: ONCE AGAIN - REAL b2

b2 = b1;
if (b2) printf("ONCE AGAIN - REAL b2 \n");
if (b2 != true) printf("ONCE AGAIN - NOT REAL b2 \n");

If you want to convert a bool to a BOOL, use the following code:

BOOL b22 = b1 ? YES : NO; //and back - bool b11 = b2 ? true : false;

So, in our case:

BOOL b22 = b1 ? 2 : NO;
if (b22)    printf("ONCE AGAIN MORE - REAL b22 \n");
if (b22 != YES) printf("ONCE AGAIN MORE- NOT REAL b22 \n");

And so... what do we get now? :-)
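
As the comments below note, double negation is a common way to normalize a value before it lands in a BOOL; here is a small sketch of the options (nothing here is required by any API):

int n = 2;
BOOL viaCast    = n;             // stays 2 where BOOL is a signed char, so viaCast != YES
BOOL viaNegate  = !!n;           // !! collapses any nonzero value to 1
BOOL viaTernary = n ? YES : NO;  // equivalent to the conversion used in the answer above

printf("%d %d %d\n", viaCast, viaNegate, viaTernary); // "2 1 1" where BOOL is a signed char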

Seto answered 22/6, 2011 at 9:1 Comment(2)
You could, instead of using the ternary operator use !!b1. To convert between themReena
'NOT REAL b2' is not printed on my iPhone SE simulator.Ramsgate
13

At the time of writing, this is the most recent version of the BOOL definition in objc.h:

/// Type to represent a boolean value.
#if (TARGET_OS_IPHONE && __LP64__)  ||  TARGET_OS_WATCH
#define OBJC_BOOL_IS_BOOL 1
typedef bool BOOL;
#else
#define OBJC_BOOL_IS_CHAR 1
typedef signed char BOOL; 
// BOOL is explicitly signed so @encode(BOOL) == "c" rather than "C" 
// even if -funsigned-char is used.
#endif

It means that on 64-bit iOS devices and on watchOS, BOOL is exactly the same thing as bool, while on all other targets (OS X, 32-bit iOS) it is a signed char and cannot even be overridden by the compiler flag -funsigned-char.

It also means that this example code will run differently on different platforms (I tested it myself):

int myValue = 256;
BOOL myBool = myValue;
if (myBool) {
    printf("i'm 64-bit iOS");
} else {
    printf("i'm 32-bit iOS");
}

BTW, never assign things like array.count to a BOOL variable: where BOOL is a signed char, only the low byte is kept, so about 0.4% of possible values (every multiple of 256) silently become 0, i.e. NO.
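
If you do need a BOOL derived from a count, comparing against zero first avoids the truncation; a minimal sketch:

NSUInteger count = 256;        // e.g. array.count
BOOL bad  = count;             // becomes 0 (NO) where BOOL is a signed char: the low byte of 256 is 0
BOOL good = (count > 0);       // a comparison always yields 0 or 1, safe on every platform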

Annaleeannaliese answered 30/10, 2015 at 15:24 Comment(1)
the compiler will raise an error if you use bool as a parameter of a block that is defined to get a BOOL (UIView animation blocks, e.g.), when you compile for iOS 32 bits (iPhone 5C...). I use C++ bool everywhere in my code, and BOOL in the APIs that are defined with BOOLLabbe
8

The Objective-C type you should use is BOOL. There is nothing like a native boolean datatype (in pre-C99 C), therefore to be sure that the code compiles on all compilers, use BOOL. (It's defined in the Apple frameworks.)

Wafd answered 12/2, 2009 at 14:8 Comment(1)
This isn't strictly accurate. BOOL is defined by the Objective-C language (it's in one of the objc/*.h headers), not by the frameworks. Also, when compiling with C99 (which I think is the default), then there is a native Boolean type, _Bool (or bool if stdbool.h is included).Gully
5

Yup, BOOL is a typedef for a signed char according to objc.h.

I don't know about bool, though. That's a C++ thing, right? If it's defined as a signed char where 1 is YES/true and 0 is NO/false, then I imagine it doesn't matter which one you use.

Since BOOL is part of Objective-C, though, it probably makes more sense to use a BOOL for clarity (other Objective-C developers might be puzzled if they see a bool in use).

Boysenberry answered 12/2, 2009 at 14:2 Comment(1)
_Bool is defined in C99, and in the standard header stdbool.h, the macro bool is defined (which expands to _Bool) and true/false are defined here as well.Emendate
4

Another difference between bool and BOOL is that they do not convert to exactly the same kind of objects when you do key-value observing, or when you use methods like -[NSObject valueForKey:].

As everybody has said here, BOOL is (on most platforms) a char. As such, it is converted to an NSNumber holding a char. This object is indistinguishable from an NSNumber created from a regular char like 'A' or '\0'. You have totally lost the information that you originally had a BOOL.

However, bool is converted to a CFBoolean, which behaves the same as an NSNumber, but which retains the boolean origin of the object.
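
One way to see the two kinds of boxed objects this answer contrasts (this sketch creates the NSNumbers by hand rather than going through KVC, and assumes Foundation's usual behaviour where +numberWithBool: hands back the shared kCFBooleanTrue/kCFBooleanFalse instances):

#import <Foundation/Foundation.h>

NSNumber *fromChar = [NSNumber numberWithChar:1];   // how a char-backed BOOL ends up boxed
NSNumber *fromBool = [NSNumber numberWithBool:YES]; // boxed as a true boolean

NSLog(@"%d", fromChar == (__bridge NSNumber *)kCFBooleanTrue); // 0 — just a char-valued NSNumber
NSLog(@"%d", fromBool == (__bridge NSNumber *)kCFBooleanTrue); // 1 — keeps its boolean origin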

I do not think that this is an argument in a BOOL vs. bool debate, but this may bite you one day.

Generally speaking, you should go with BOOL, since this is the type used everywhere in the Cocoa/iOS APIs (designed before C99 and its native bool type).

Linkboy answered 8/5, 2012 at 8:12 Comment(0)
2

The accepted answer has been edited and its explanation had become a bit incorrect: the code sample was refreshed, but the text below it stayed the same. You cannot assume that BOOL is a char, since it depends on the architecture and platform. Thus, if you run your code on a 32-bit platform (for example, an iPhone 5) and print @encode(BOOL), you will see "c", which corresponds to the char type. But if you run your code on an iPhone 5s (64-bit), you will see "B", which corresponds to the bool type.
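
A one-liner is enough to reproduce this check on your own target (the encoding strings "c" and "B" are the ones quoted above):

NSLog(@"%s", @encode(BOOL)); // "c" where BOOL is a signed char, "B" where it is bool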

Promise answered 13/2, 2016 at 13:20 Comment(0)
2

As mentioned above, BOOL can be a signed char type depending on your architecture, while bool is the C99 _Bool type, which can only hold 0 or 1. A simple experiment will show the difference and why BOOL and bool can behave differently:

bool ansicBool = 64;
if(ansicBool != true) printf("This will not print\n");

printf("Any given vlaue other than 0 to ansicBool is evaluated to %i\n", ansicBool);

BOOL objcBOOL = 64;
if(objcBOOL != YES) printf("This might print depending on your architecture\n");

printf("BOOL will keep whatever value you assign it: %i\n", objcBOOL);

if(!objcBOOL) printf("This will not print\n");

printf("! operator will zero objcBOOL %i\n", !objcBOOL);

if(!!objcBOOL) printf("!! will evaluate objcBOOL value to %i\n", !!objcBOOL);

To your surprise, if(objcBOOL != YES) evaluates to true here, since YES is actually just the character code 1, and in the eyes of the compiler the character code 64 is of course not equal to the character code 1; thus the if statement evaluates to YES/true/1 and the following line will run. However, since a nonzero bool type always evaluates to the integer value 1, the above issue will not affect your code. Below are some good tips if you want to use the Objective-C BOOL type vs the ANSI C bool type:

  • Always assign the YES or NO value and nothing else.
  • Convert to BOOL by using the double negation (!!) operator to avoid unexpected results.
  • When checking for NO, use if(!myBool) instead of if(myBool != YES); it is much cleaner to use the not (!) operator, and it gives the expected result.
Eczema answered 11/12, 2017 at 22:51 Comment(0)
1

I go against convention here. I don't like typedefs to base types. I think it's a useless indirection that removes value.

  1. When I see the base type in your source I will instantly understand it. If it's a typedef I have to look it up to see what I'm really dealing with.
  2. When porting to another compiler or adding another library their set of typedefs may conflict and cause issues that are difficult to debug. I just got done dealing with this in fact. In one library boolean was typedef'ed to int, and in mingw/gcc it's typedef'ed to a char.
Fighter answered 5/4, 2010 at 17:15 Comment(8)
Well... you can be expected to know the standard typedef's of your language (think size_t), and both bool (C99) and BOOL (ObjC) fall into that category. And if your code failed because of a change of typedef, it's your code to blame since you apparently did not handle the typedef as an opaque thing but relied on its implementation on one platform. (Nothing to be ashamed of, it happens, but it's not the typedef to blame.)Abbacy
The "standard" typedefs don't seem to be very standard (for example for a while MS didn't support posix standards, etc). If you don't use typedefs then the problem with typedefs changing or being different on different compilers is eliminated.Fighter
-1, typedefs in general serve two important purposes (among others): provide good semantics and provide some misdirection. Ideally you shouldn't need to know the base type a typedef refers to, unfortunately this system is not perfect and sometimes you have to know. My point is: you should follow the convention because even admitting that its not perfect its better than the alternative.Decaliter
@João PortelaL: "good semantics" is not objective. Adding misdirection helps how exactly?Fighter
@DevSolar: "You can be expected to know the standard typedef's of your language"?? Typedefs are not standardized. They're different for every developer, for every compiler implementation, and compiler version even. So I'm at fault because two different packages I didn't write used conflicting typedefs? Standards are great, everyone should have lots.Fighter
@Jay: Sorry, I should have explained why this "misdirection" is good. I will try to provide an example: if you use the typedef'd boolean instead of using an int or a char directly you're allowing a different type (that still works) to be used on each platform without breaking your code [reasons for this vary but we can imagine a platform where a char may be misaligned in memory and thus slower so an int may be used for a boolean instead].Decaliter
@Jay: By "good semantics" I mean that when you declare your boolean BOOL varname instead of char varname it is more obvious that the two valid values for that variable are true/YES or false/NO.Decaliter
There are also binary compatibility reasons why you absolutely MUST use typedefs when system API has them. While on iOS 64 bit, BOOL may map to bool, that's not true for 32-bit or Mac (for historical reasons) and using bool may cause actual errors. There are also maintenance reasons. E.g. code by Classic developers who used SInt32 instead of long still compiles correctly today on 64-bit. Those who ignored that typedef used by system API suddenly got bugs serializing their data etc.Il
1

Also, be aware of differences in casting, especially when working with bitmasks, due to casting to signed char:

bool a = 0x0100;
a == true;  // expression true

BOOL b = 0x0100;
b == false; // expression true on !((TARGET_OS_IPHONE && __LP64__) || TARGET_OS_WATCH), e.g. macOS
b == true;  // expression true on (TARGET_OS_IPHONE && __LP64__) || TARGET_OS_WATCH

If BOOL is a signed char instead of a bool, the cast of 0x0100 to BOOL simply drops the set bit, and the resulting value is 0.
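
A defensive idiom for this case is to reduce the mask test to 0 or 1 before it ever touches a BOOL; a sketch using the same 0x0100 value:

unsigned int flags = 0x0100;
BOOL lossy = (BOOL)(flags & 0x0100);   // 0 where BOOL is a signed char: the set bit is above the low byte
BOOL safe  = ((flags & 0x0100) != 0);  // the comparison yields 0 or 1 on every platform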

Rarefaction answered 29/8, 2019 at 20:29 Comment(0)
0

The BOOL Objective-C type is in fact either a type alias to bool (from C's stdbool.h) or to signed char. In case you are curious, it turns out that which one is chosen where is rather complicated and depends on the target platform and architecture.

While the Objective-C runtime is capable of making its own decisions about it, at least for Darwin, LLVM is always explicit about which one to choose:

  • On 64-bit ARM, BOOL is always a type alias to bool
  • On 32-bit ARM, BOOL is a signed char for all platforms except for watchOS (where it is bool)
  • On 32-bit Intel, BOOL is a signed char for all platforms except for watchOS (where it is bool)
  • On 64-bit Intel, BOOL is a signed char for all platforms except for iOS (where it is bool)

You can read more about my investigation here: https://www.jviotti.com/2024/01/05/is-objective-c-bool-a-boolean-type-it-depends.html.

Ravishing answered 10/1 at 12:19 Comment(0)
