The program
#include <stdio.h>

int main(void) {
    printf("sizeof( char ) = %zu, sizeof 'a' = %zu.\n", sizeof( char ), sizeof 'a' );
    return 0;
}
outputs the following:
sizeof( char ) = 1, sizeof 'a' = 4.
I'm compiling with gcc (clang gives the same result) and these flags:
gcc -Wall -Wextra -Wswitch -pedantic -ansi -std=c11 -DDEBUG -ggdb3 -o
Section 6.5.3.4, paragraph 4 of the C11 draft standard (N1570) at http://www.open-std.org/jtc1/sc22/wg14/www/docs/n1570.pdf says:
When sizeof is applied to an operand that has type char, unsigned char, or signed char, (or a qualified version thereof) the result is 1.
So I would expect sizeof 'a' to be 1, since the type of 'a' is char. Or is it being automatically "promoted" to an int, or something similar? (I notice that if I cast 'a' to char, then sizeof( (char)'a' ) is 1.)
Or am I looking at the wrong standard?
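
One quick way to test the "promoted to int" suspicion is to compare sizeof 'a' against sizeof( int ) directly. A minimal sketch, assuming the same hosted C11 setup as above:

#include <stdio.h>

int main(void) {
    /* If 'a' really has type int, these two sizes should match. */
    printf("sizeof 'a'          = %zu\n", sizeof 'a');
    printf("sizeof( int )       = %zu\n", sizeof( int ));
    /* Casting to char yields 1, per C11 6.5.3.4 paragraph 4. */
    printf("sizeof( (char)'a' ) = %zu\n", sizeof( (char)'a' ));
    return 0;
}

On a typical implementation where int is 4 bytes, this prints 4, 4, and 1, which matches the behavior in the question.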
'a' is exactly equivalent to 97 (on an ASCII-based machine). – Lavadalavage
The type of 'a' is int even before promotion considerations. – Gwendolyn
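
Gwendolyn's point can be checked mechanically with C11's _Generic, which selects a branch based on the static type of its controlling expression. A minimal sketch (the output strings are illustrative), compiled with -std=c11:

#include <stdio.h>

int main(void) {
    /* _Generic picks the branch matching the type of 'a';
       in C that is int, not char. */
    printf("'a' has type %s\n",
           _Generic('a', char: "char", int: "int", default: "other"));
    /* And, per the other comment, 'a' compares equal to 97
       on an ASCII-based machine. */
    printf("'a' == 97 is %s\n", 'a' == 97 ? "true" : "false");
    return 0;
}

This prints "'a' has type int": no promotion is involved, because in C a character constant is an int to begin with.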