g++, bitfields and ADL
g++ fails to compile the following code snippet:

namespace X {
  enum En {A, B};
  bool test(En e);
}

bool check() {
  union {
    struct {
      X::En y:16;
      X::En z:16; 
    } x;
    int z;
  } zz;
  return test(zz.x.y);
}

The error it gives is the following:

In function 'bool check()':
15 : error: 'test' was not declared in this scope
   return test(zz.x.y);
          ^
15 : note: suggested alternative:
3 : note: 'X::test'
   bool test(En e);
        ^~~~
Compilation failed

If I make y a regular member rather than a bitfield, the code compiles successfully. Calling test with explicit namespace qualification (X::test) works as well. Clang compiles the program as-is without any complaints.

Putting the bitfield business aside (I do not love it at all, but the codebase has it), and not focusing on whether an enum is guaranteed to fit into the 16-bit member, is there something special about bitfields that prevents ADL from kicking in as I expect?

Audiometer answered 1/8, 2016 at 19:3 Comment(3)
If you explicitly set the type, enum En : short, it compiles. – Gerdi
@hauron, interesting observation, thanks. – Audiometer
enum En : int doesn't compile either. A minimal example doesn't require the union to trigger the bug. The error is preceded by the warning warning: '<anonymous struct>::y' is too small to hold all values of 'enum X::En'. If the packed bitfield is large enough to hold the specified type (which defaults to unsigned int) without truncation, it succeeds. It appears as if gcc chooses to ignore parameters that have been subject to truncation as valid hints for ADL. – Girdle
The underlying type of plain enums is implementation-defined:

C++03 standard 7.2/5

The underlying type of an enumeration is an integral type that can represent all the enumerator values defined in the enumeration. It is implementation-defined which integral type is used as the underlying type for an enumeration except that the underlying type shall not be larger than int unless the value of an enumerator cannot fit in an int or unsigned int.

The representation of enum bit-fields is likewise implementation-defined:

C++03 standard 9.6/3

A bit-field shall have integral or enumeration type (3.9.1). It is implementation-defined whether a plain (neither explicitly signed nor unsigned) char, short, int or long bit-field is signed or unsigned.

So because the types of both X::En y:16 and X::En are implementation-defined, the implicit conversion between them is also implementation-defined, which I think explains the ADL difference that you're seeing across compilers.

Undervest answered 26/8, 2016 at 7:11 Comment(0)
