There are some limits, but the compiler author is free to choose the sizes of the standard C variable types (char, short, int, long, long long). Naturally char is going to be a byte for that architecture (on most architectures with C compilers, 8 bits). And the ordering rules must hold: something smaller cannot be bigger than something bigger, so long cannot be smaller than int. But certainly by 1999 we had seen the x86 16-to-32-bit transition, where for example int changed from 16 to 32 bits with a number of tools but long stayed 32. Later the 32-to-64-bit x86 transition happened, and depending on the tool there were types available to help.
The problem existed long before this, and the solution was not to fix the sizes of the types; those remain, within the rules, up to the compiler authors. Instead the compiler authors need to craft a stdint.h file that matches the tool and target (stdint.h is specific to a tool and target at a minimum, and can vary with the version of the tool, build options for that tool, etc.) so that, for example, uint32_t is always 32 bits. Some authors will map it to an int, others to a long, etc. in their stdint.h. The C language variable types remain limited to char, short, int, etc. per the language; uint32_t is not a variable type, it is mapped onto a variable type through stdint.h. This solution/workaround was a way to keep us all from going crazy and keep the language alive.
Authors will often choose, for example, to make int 16 bits if the GPRs are 16 bits, 32 bits if they are 32 bits, and so on, but they have some freedom.
Yes, this specifically means that there is no reason to assume that any two tools for a particular target (the computer you are reading this on, for example) use the same definitions for int and long in particular. If you want to write code for this platform that ports across the tools that support it, use the stdint.h types and not int, long, etc. And most certainly if you are crossing platforms (an msp430 mcu, an arm mcu, an arm linux machine, an x86 based machine), the types do not have the same definitions, even for the same "toolchain" (gnu gcc and binutils, for example). char and short tend to be 8 and 16 bits; int and long tend to vary the most, sometimes the same size as each other, sometimes different. The point is: do not assume.
It is trivial to detect the sizes for a given compiler version/target/command-line options, or just go the stdint.h route to minimize problems later.
8-byte long was not the norm in those days; 4-byte long was more common on 16- and 32-bit platforms (and still is). Of course 64-bit machines were rare in 1999. – Longsufferance

long is 4 bytes in Visual C and 8 in gcc, and for that reason I never use it. – Erose

What to use instead of long then? int? (doesn't that depend on the architecture too?) – Facetious

long long clearly. – Vesture

int64_t etc., although the scanf() and printf() formatting specifiers are awkward and a pain to remember. – Erose

long is 32-bit even in gcc (assuming the native MinGW rather than the Cygwin version, that is). en.wikipedia.org/wiki/64-bit_computing#64-bit_data_models – Tiddly

How would long long break compatibility, given code bases up to then did not have long long? Not being the widest integer type? I certainly see how requiring long as 64-bit would break things. – Pun

It wasn't long long itself; it was the C89 guarantee that long was the widest integer type. There was no guarantee that a 64-bit integer type would be available, and by the late 1990s it was blindingly obvious that 64-bit integer types were necessary and their existence must be guaranteed. Borrowing from the term "future-proof", that C89 guarantee "future-precluded" C and needed to be tossed. – Lw