I was trying to write some macros for type-safe use of _Bool, and then stress-test my code. For evil testing purposes, I came up with this dirty hack:
_Bool b=0;
*(unsigned char*)&b = 42;
Given that _Bool is 1 byte on the implementation (sizeof(_Bool) == 1), I don't see how this hack violates the C standard. It shouldn't be a strict aliasing violation, since character types may alias any object.
Yet when running this program through various compilers, I get problems:
#include <stdio.h>

int main(void)
{
    _Static_assert(sizeof(_Bool) == 1, "_Bool is not 1 byte");
    _Bool b = 0;
    *(unsigned char*)&b = 42;
    printf("%d ", b);
    printf("%d", b != 0);
    return 0;
}
(The code relies on printf's implicit default argument promotion to int.)
Some versions of gcc and clang give the output 42 42, others give 0 0, even with optimizations disabled. I would have expected 42 1.
It would seem that the compilers assume that a _Bool can only hold 1 or 0, yet at the same time they happily print 42 in the first case.
Q1: Why is this? Does the above code contain undefined behavior?
Q2: How reliable is sizeof(_Bool)? C17 6.5.3.4 does not mention _Bool at all.
How can the output be 42 42? The second printf can only print 1 or 0. – Helmut

b != 0 for a _Bool can be optimized to simply b. I'm scratching my head still though. – Helmut

The _Bool optimization combines with optimizations for transforming branches into arithmetic operations, producing strange-looking results for very natural code. The optimization of if (b) x++; into x += (the representation of) b; confirms that Clang treats _Bool representations other than 0 and 1 as trap values triggering UB. gcc.godbolt.org/z/wPq4zq – Attar

_Bool semantics is a hack. It would have been fine to impose all this headache on implementors if they had also added boolean and/or bit-field arrays, but, as specified, it does not provide any real improvement over enum { false, true }; typedef unsigned char _Bool; – Damaging

sizeof(_Bool) is at least 1. C17 6.7.2.1/4 footnote 124 says: "While the number of bits in a _Bool object is at least CHAR_BIT, the width (number of sign and value bits) of a _Bool may be just 1 bit." (Of course, _Bool has no sign bit, only value bits and padding bits.) C23 final draft n3054 6.2.6.1/2 says: "The type bool shall have one value bit and (sizeof(bool)*CHAR_BIT) - 1 padding bits." (Spot the minor typographical error in the text: they used a hyphen character instead of a minus sign character.) – Huckaback