What is the size of an enum in C?

7

170

I'm creating a set of enum values, but I need each enum value to be 64 bits wide. If I recall correctly, an enum is generally the same size as an int; but I thought I read somewhere that (at least in GCC) the compiler can make an enum any width it needs to be to hold its values. So, is it possible to have an enum that is 64 bits wide?

Kandis answered 14/12, 2008 at 1:7 Comment(5)
So if I understand well, 2^32 enums are not enough for you? Or is it an alignment concern? Why do you need those to be 64 instead of 32? I'm very curious.Bolitho
@jokoon: I honestly don't remember anymore. I think I wanted the enums to contain values larger than 2^32-1.Kandis
One use would be if you needed a union between an enum and a pointer.Mixon
An important consideration in the size of enum is in fact memory use. Is memory optimization dead or something, or does everyone think the compiler is the center of the universe still and it automagically makes everything fast and optimal without any effort on the part of the programmer? It's absurd to use a larger data type than you need, and if I only need 256 values or less for my enum, then why do I need 16 or 32-bit words to store them? (Data model isn't an excuse. The values usually are quite easily sign-extended such as when stored in the registers.)Univalence
@Univalence Enum values don't need to be contiguous. In embedded systems it's common to use constants and/or enums to name single bits in a memory mapped I/O interface. So it's not at all unreasonable to want large enum values like 1<<31 and 1<<30 with huge gaps in between them.Archerfish
113

An enum is only guaranteed to be large enough to hold int values. The compiler is free to choose the actual type used based on the enumeration constants defined so it can choose a smaller type if it can represent the values you define. If you need enumeration constants that don't fit into an int you will need to use compiler-specific extensions to do so.

Allegraallegretto answered 14/12, 2008 at 1:16 Comment(6)
Your first sentence seems to conflict with your last. Is the constraint that an enum should be larger than an int or smaller? Following @MichaelStum's answer your first sentence should be "An enum is only guaranteed to fit into an int value."Huehuebner
As an ugly implementation-sensitive hack on two's complement platforms (is that all systems these days?), you can force an enum to be as large as an int by making sure it contains negative values. Not a recommended technique though.Integrant
This answer seems to suggest that an enum is as large as an int. Michael Stum's answer, which references C99, says that an enum may be as small as a char.Bedim
The first sentence of this answer is incorrect. The enum is only guaranteed to be large enough to hold the value of the largest enumerator in the enum.Musing
@M.M, he meant to hold a maximum of int values. It is a correct statement. Your statement is actually incorrect. An enum will not hold the largest enumerator if that enumerator is larger than an int. I learned this by having my 64-bit enumerator truncated to the maximum of int (which is 32-bit in my case).Pseudocarp
@jdk1.0 in Standard C all enumerators have type int. The behaviour you described is a compiler-specific extension.Musing
123

Taken from the current C Standard (C99): http://www.open-std.org/JTC1/SC22/WG14/www/docs/n1256.pdf

6.7.2.2 Enumeration specifiers
[...]
Constraints
The expression that defines the value of an enumeration constant shall be an integer constant expression that has a value representable as an int.
[...]
Each enumerated type shall be compatible with char, a signed integer type, or an unsigned integer type. The choice of type is implementation-defined, but shall be capable of representing the values of all the members of the enumeration.

Not that compilers are any good at following the standard, but essentially: If your enum holds anything other than an int, you're in deep "unsupported behavior that may come back biting you in a year or two" territory.

Update: The latest publicly available draft of the C Standard (C11): http://www.open-std.org/JTC1/SC22/WG14/www/docs/n1570.pdf contains the same clauses. Hence, this answer still holds for C11.

Identical answered 14/12, 2008 at 1:23 Comment(7)
Having only that, the following is valid I think: enum { LAST = INT_MAX, LAST1, LAST2 }; so LAST2 is not representable in an int, but there wasn't an expression defining it.Toehold
In the actual PDF it defines that: "The identifiers in an enumerator list are declared as constants that have type int[...]". I've omitted that to make it not too verbose.Identical
Note "a signed integer type, or an unsigned integer type". Not necessarily int. short and long are integer types too, and whatever the implementation picks, all values must fit ("shall be capable of representing the values of all the members of the enumeration").Primulaceous
Notable: enumeration constant and enumerated type is not the same thing. The former are the contents of the enum declaration list, the latter is the actual variable. So while the enumeration constants must be int, the actual enumeration variable could be another type. This is a well-known inconsistency in the standard.Fiscal
To clarify Lundin's point: For enum my_enum { my_value }, my_value will have type int, but enum my_enum can have an implementation defined type which must at least represent all the enumeration values. So my_value may have a narrowing conversion to enum my_enum, but it's guaranteed not to overflow.Breast
armcc gives the compiler option --enum_is_int.Viewpoint
This answer is more appropriate than the accepted one.Bumblebee
25

While the previous answers are correct, some compilers have options to break the standard and use the smallest type that will contain all values.

Example with GCC (documentation in the GCC Manual):

enum ord {
    FIRST = 1,
    SECOND,
    THIRD
} __attribute__ ((__packed__));
_Static_assert(sizeof(enum ord) == 1, "packed enum should be one byte");
Desai answered 5/6, 2014 at 18:18 Comment(2)
Actually, as far as I can see this does not break the standard. As explained in Michael Stum's answer, the standard allows the compiler to choose the actual type of the enums, as long as all values fit.Spall
I've worked with MacOS C++ compilers that do exploit the limited range of values in an enum to store them in smaller types. Can't remember if it was Metrowerks Codewarrior or XCode. This is within the C++ standard. You cannot assume sizeof(MyEnum) == sizeof(int) in general.Integrant
0

Just set the last value of the enum to a value large enough to make it the size you would like the enum to be; it should then be that size:

enum value{a=0,b,c,d,e,f,g,h,i,j,l,m,n,last=0xFFFFFFFFFFFFFFFF};
Ideate answered 14/1, 2020 at 22:21 Comment(3)
While this code may answer the question, providing additional context regarding how and/or why it solves the problem would improve the answer's long-term value.Uxoricide
This is false. Compiling your example with gcc 8.4.0: sizeof(a) is 4 and sizeof(last) is 8.Supplementary
typedef enum {a1=0,b1,c1,d1,f1,last1=0xFFFFFFFFFFFFFFFF} testing1; typedef enum {a2=0,b2,c2,d2,f2,last2=0xFFFFFFFF} testing2; sizeof a variable of type testing1 is 8 and sizeof a variable of type testing2 is 4.Ideate
-1

We have no control over the size of an enum variable; it is entirely implementation-defined. The compiler simply gives us the option to name integer values with an enum, so the enum follows the size of an int.

Pheasant answered 23/1, 2017 at 9:53 Comment(0)
-2

In the C language, an enum is guaranteed to be the size of an int. There is a compile-time option (-fshort-enums) to make it as short as possible (this is mainly useful if the values are not more than 64K). There is no compile-time option to increase its size to 64 bits.

Sikhism answered 29/12, 2017 at 14:2 Comment(1)
not a short, just sometimes shorterAcquirement
-11

Consider this code:

enum value{a,b,c,d,e,f,g,h,i,j,l,m,n};
value s;
std::cout << sizeof(s) << std::endl;

It will give 4 as output. So no matter the number of elements an enum contains, its size is always fixed.

Coffeepot answered 14/11, 2015 at 20:31 Comment(3)
Michael Stum's answer is correct. This is compiler specific. You can try it out yourself with IAR EWARM. IAR EWARM shows 1 for your example. If there is up to 255 items it still shows 1. After adding 256th item it goes up to 2.Menendez
The question is not about C++.Morley
important things to realise before writing any more C or C++: Just because it compiles doesn't mean it's legal per the Standard. Just because you get a given result doesn't mean the Standard says that you always will or that other users will when they run your code. Questions like this need an answer that references the Standard or at least implementation-defined spec for a given compiler/ABI. Simply compiling and running a program and seeing one result on one day conveys no lesson about such questions (and very little about anything else).Peddada
