Wouldn't it have made more sense to make long 64-bit and reserve long long until 128-bit numbers become a reality?
Yes, it would arguably make more sense, but Microsoft had its own reasons for defining "long" as 32 bits.
As far as I know, of all the mainstream 64-bit systems right now, Windows is the only OS where "long" is 32 bits (the LLP64 model). On 64-bit Unix and Linux, it's 64 bits (the LP64 model).
All compilers for Windows define "long" as 32 bits to maintain compatibility with Microsoft.
For this reason, I avoid using "int" and "long". Occasionally I'll use "int" for error codes and booleans (in C), but I never use them for any code that depends on the size of the type.
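For size-dependent code, here's a minimal sketch of the alternative (the variable names are just illustrations): C99's <stdint.h> provides exact-width types, so the code never has to guess what "long" means on a given platform:

```c
#include <stdint.h>

/* Exact-width types make the size dependency explicit instead of
   relying on the platform's definition of "int" or "long". */
int32_t  packet_length;  /* exactly 32 bits on every platform      */
uint64_t total_bytes;    /* exactly 64 bits, even where long is 32 */
int      error_code;     /* fine for error codes/booleans, where
                            the exact width doesn't matter         */
```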
I use "long" in cases where 32 bits is big enough and I don't want int_least32_t or my own typedef all over my code. It's probably best to make the dependency obvious and explicit, and if it's in a struct you'd probably use int32_t to avoid bloating it where "long" is bigger, but there does come a point of "can't be bothered with this". – Anarchic
"long" is 32 bits on plenty of mainstream systems too. Hardly "all the mainstream systems ... it's 64-bit". – Aun
The C standard does NOT specify the exact bit-width of the primitive data types, only their minimum widths, so implementations have some latitude. In deciding the width of each primitive type, a compiler designer has to weigh several factors, including the computer architecture.
Here are some references: http://en.wikipedia.org/wiki/C_syntax#Primitive_data_types
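As an illustration (a minimal sketch): printing the sizes shows what a particular implementation chose, and the output differs between, say, 64-bit Linux (LP64: "long" is 8 bytes) and 64-bit Windows (LLP64: "long" is 4 bytes):

```c
#include <stdio.h>

int main(void) {
    /* Sizes are implementation-defined; the standard only fixes minimums. */
    printf("char:      %zu\n", sizeof(char));
    printf("short:     %zu\n", sizeof(short));
    printf("int:       %zu\n", sizeof(int));
    printf("long:      %zu\n", sizeof(long));
    printf("long long: %zu\n", sizeof(long long));
    return 0;
}
```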
For the history, including why UNIX systems generally converged on LP64, why Windows did not (a big code base that assumed 16-bit "int" and 32-bit "long"), and the various arguments, see "The Long Road to 64 Bits - Double, double, toil and trouble" (Shakespeare, Macbeth): https://queue.acm.org/detail.cfm?id=1165766 (ACM Queue, 2006) or https://dl.acm.org/doi/pdf/10.1145/1435417.1435431 (CACM, 2009).
Note: I helped design the 64/32-bit MIPS R4000, suggested the idea that led to <inttypes.h>, and wrote the long long motivation section for C99.
For historical reasons. For a long time (pun intended), "int" meant 16-bit; hence "long" as 32-bit. Of course, times changed. Hence "long long" :)
PS:
GCC (and others) currently support 128-bit integers as "__int128" (and "unsigned __int128") on 64-bit targets; there is no standard "(u)int128_t".
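A minimal sketch of how that extension looks in practice (the function name is made up; this is a GCC/Clang extension, not standard C):

```c
/* Requires GCC or Clang on a 64-bit target. */
__int128 mul_full(long long a, long long b) {
    /* Widening first gives the full 128-bit product without overflow. */
    return (__int128)a * (__int128)b;
}
```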
PPS:
Here's a discussion of why the folks at GCC made the decisions they did:
http://www.x86-64.org/pipermail/discuss/2005-August/006412.html
Ever since the days of the first C compiler for a general-purpose reprogrammable microcomputer, it has often been necessary for code to make use of types that held exactly 8, 16, or 32 bits, but until 1999 the Standard didn't explicitly provide any way for programs to specify that. On the other hand, nearly all compilers for 8-bit, 16-bit, and 32-bit microcomputers define "char" as 8 bits, "short" as 16 bits, and "long" as 32 bits. The only difference among them is whether "int" is 16 bits or 32.
While a 32-bit or larger CPU could use "int" as a 32-bit type, leaving "long" available as a 64-bit type, there is a substantial corpus of code which expects that "long" will be 32 bits. While the C Standard added "fixed-sized" types in 1999, there are other places in the Standard which still use "int" and "long", such as "printf". While C99 added macros to supply the proper format specifiers for fixed-sized integer types, there is a substantial corpus of code which expects that "%ld" is a valid format specifier for int32_t, since it will work on just about any 8-bit, 16-bit, or 32-bit platform.
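For instance (a minimal sketch): the <inttypes.h> macros expand to whatever specifier the platform needs, while a hard-coded "%ld" only happens to work where int32_t is defined as long:

```c
#include <inttypes.h>
#include <stdio.h>

int main(void) {
    int32_t x = 42;
    /* PRId32 expands to "d" or "ld" depending on how the
       platform defines int32_t. */
    printf("x = %" PRId32 "\n", x);
    return 0;
}
```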
Whether it makes more sense to have "long" be 32 bits (out of respect for an existing code base going back decades) or 64 bits (so as to avoid the need for the more verbose "long long" or "int64_t" to identify the 64-bit types) is probably a judgment call. Given that new code should probably favor the use of specified-size types when practical, I'm not sure I see a compelling advantage to making "long" 64 bits unless "int" is also 64 bits (which would create even bigger problems with existing code).
Sizes of "long" and "long long" are implementation-defined; all we know are:
- minimum size guarantees
- relative sizes between the types
5.2.4.2.1 Sizes of integer types <limits.h>
gives the minimum sizes:
1 [...] Their implementation-defined values shall be equal or greater in magnitude (absolute value) to those shown [...]
- UCHAR_MAX 255 // 2^8 − 1
- USHRT_MAX 65535 // 2^16 − 1
- UINT_MAX 65535 // 2^16 − 1
- ULONG_MAX 4294967295 // 2^32 − 1
- ULLONG_MAX 18446744073709551615 // 2^64 − 1
6.2.5 Types then says:
8 For any two integer types with the same signedness and different integer conversion rank (see 6.3.1.1), the range of values of the type with smaller integer conversion rank is a subrange of the values of the other type.
and 6.3.1.1 Boolean, characters, and integers determines the relative conversion ranks:
1 Every integer type has an integer conversion rank defined as follows:
- The rank of long long int shall be greater than the rank of long int, which shall be greater than the rank of int, which shall be greater than the rank of short int, which shall be greater than the rank of signed char.
- The rank of any unsigned integer type shall equal the rank of the corresponding signed integer type, if any.
- For all integer types T1, T2, and T3, if T1 has greater rank than T2 and T2 has greater rank than T3, then T1 has greater rank than T3.
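These guarantees can be spelled out as compile-time checks; here's a minimal sketch using C11's _Static_assert (which will pass on any conforming implementation):

```c
#include <limits.h>

/* Minimum magnitudes from 5.2.4.2.1: */
_Static_assert(UCHAR_MAX  >= 255,                     "char is at least 8 bits");
_Static_assert(USHRT_MAX  >= 65535,                   "short is at least 16 bits");
_Static_assert(UINT_MAX   >= 65535,                   "int is at least 16 bits");
_Static_assert(ULONG_MAX  >= 4294967295UL,            "long is at least 32 bits");
_Static_assert(ULLONG_MAX >= 18446744073709551615ULL, "long long is at least 64 bits");

/* Relative ranges implied by 6.2.5p8 and the ranks in 6.3.1.1: */
_Static_assert(SHRT_MAX <= INT_MAX && INT_MAX <= LONG_MAX && LONG_MAX <= LLONG_MAX,
               "each range is a subrange of the next higher rank");
```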
Nothing guarantees that "long" is 32 bits, nor that "int" is 32 bits, nor that "long long" is 64 bits. This all depends very much on the compiler... So your question is based on a false premise. – Flitter
"long long" can't be "reserved", since the C99 standard guarantees its existence. On a 16-bit system with a 16-bit "int", 32-bit "long" and 64-bit "long long" they'd all be different, but those days are gone as far as desktop machines are concerned. We're not going to stick with 16-bit "int" just so that we don't feel there's a redundant type in the middle somewhere. – Anarchic