I'm trying to subtract two unsigned ints and compare the result to a signed int (or a literal). With unsigned int the behavior is as expected. With uint16_t (from stdint.h) the behavior is not what I would expect. The comparison was done with gcc 4.5.
Given the following code:
#include <stdio.h>

int main(void) {
    unsigned int a;
    unsigned int b;
    a = 5;
    b = 20;
    printf("%d\n", (a - b) < 10); /* the < itself yields an int, so %d */
    return 0;
}
The output is 0, which is what I expected. Both a and b are unsigned, and b is larger than a, so the subtraction wraps around modulo UINT_MAX + 1, giving a large unsigned number (4294967281 with a 32-bit unsigned int) that is greater than 10. Now if I change a and b to type uint16_t:
#include <stdint.h>
#include <stdio.h>

int main(void) {
    uint16_t a;
    uint16_t b;
    a = 5;
    b = 20;
    printf("%d\n", (a - b) < 10); /* same comparison, now on uint16_t */
    return 0;
}
The output is 1. Why is this? Is the result of subtraction between two uint16_t types stored in an int in gcc? If I change the 10 to 10U the output is again 0, which seems to support this (if the subtraction result is stored as an int and the comparison is made against an unsigned int, then the subtraction result will be converted to an unsigned int).
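One way to confirm this is to inspect the type and value of the difference itself. A minimal sketch (the printed size assumes a typical platform where int is 32 bits; this snippet is illustrative, not part of the original question):

#include <stdint.h>
#include <stdio.h>

int main(void) {
    uint16_t a = 5;
    uint16_t b = 20;
    /* Both uint16_t operands are promoted to int before the
       subtraction, so a - b is an int holding -15. */
    printf("%zu\n", sizeof(a - b)); /* sizeof(int), e.g. 4 */
    printf("%d\n", a - b);          /* -15 */
    printf("%d\n", (a - b) < 10);   /* 1: -15 < 10 as signed ints */
    return 0;
}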
This is int subtraction, i.e. both uint16_t operands are converted to int before the subtraction even has a chance to begin. Read about integral promotions. – Stibine
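If 16-bit unsigned wraparound is the intended behavior, a common fix (a sketch, not the only option) is to cast the difference back to uint16_t before comparing, which reduces the promoted int result modulo 2^16:

#include <stdint.h>
#include <stdio.h>

int main(void) {
    uint16_t a = 5;
    uint16_t b = 20;
    /* The cast reduces the int value -15 modulo 2^16, giving 65521,
       which is not less than 10. */
    printf("%d\n", (uint16_t)(a - b) < 10); /* 0 */
    /* Comparing against an unsigned literal also prints 0, but for a
       different reason: -15 is converted to a large unsigned int. */
    printf("%d\n", (a - b) < 10U);          /* 0 */
    return 0;
}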