The answer to this question depends on the historical context, since the language specification has changed over time, and this is one of the areas affected by those changes.
You said that you were reading K&R. The latest edition of that book (as of now) describes the first standardized version of the C language, C89/90. In that version of the language, writing one member of a union and then reading another member is undefined behavior. Not implementation-defined (which is a different thing), but undefined behavior. The relevant portion of the language standard in this case is 6.5/7.
Now, at some later point in the evolution of C (the C99 version of the language specification with Technical Corrigendum 3 applied), it became legal to use a union for type punning, i.e. to write one member of the union and then read another.
Note that attempting to do that can still lead to undefined behavior. If the value you read happens to be invalid (a so-called "trap representation") for the type you read it through, then the behavior is still undefined. Otherwise, the value you read is implementation-defined.
Your specific example is relatively safe for type punning from `int` to a `char[2]` array. It is always legal in C to reinterpret the contents of any object as a char array (again, 6.5/7).
However, the reverse is not true. Writing data into the `char[2]` array member of your union and then reading it as an `int` can potentially create a trap representation and lead to undefined behavior. The potential danger exists even if your char array has sufficient length to cover the entire `int`.
But in your specific case, if `int` happens to be larger than `char[2]`, the `int` you read will cover an uninitialized area beyond the end of the array, which again leads to undefined behavior.
`u.ch[0]=3;`? Why should you get a warning about that? `char` is only the shortest of the integer types; why should it be prevented from receiving values written in decimal? Nothing prevents you from using `int x='c';` either. Which of `signed char` or `unsigned char` should be reserved for ASCII codes, in your interpretation, and what use would there be for the other? – Norman
… `char` variables. In fact a character literal is an `int`. Try this, in both C and C++: `printf("sizeof literal char: %d\n", (int)sizeof 'X');` – Liatrice