Why only define a macro if it's not already defined?

All across our C code base, I see every macro defined the following way:

#ifndef BEEPTRIM_PITCH_RATE_DEGPS
#define BEEPTRIM_PITCH_RATE_DEGPS                   0.2f
#endif

#ifndef BEEPTRIM_ROLL_RATE_DEGPS
#define BEEPTRIM_ROLL_RATE_DEGPS                    0.2f
#endif

#ifndef FORCETRIMRELEASE_HOLD_TIME_MS
#define FORCETRIMRELEASE_HOLD_TIME_MS               1000.0f
#endif

#ifndef TRIMSYSTEM_SHEARPIN_BREAKINGFORCE_LBS
#define TRIMSYSTEM_SHEARPIN_BREAKINGFORCE_LBS       50.0f
#endif

What is the rationale for doing these define checks instead of just defining the macros?

#define BEEPTRIM_PITCH_RATE_DEGPS                   0.2f
#define BEEPTRIM_ROLL_RATE_DEGPS                    0.2f
#define FORCETRIMRELEASE_HOLD_TIME_MS               1000.0f
#define TRIMSYSTEM_SHEARPIN_BREAKINGFORCE_LBS       50.0f

I can't find this practice explained anywhere on the web.

Carencarena answered 4/9, 2015 at 12:53 Comment(3)
This way, changing the constants somewhere else in the code is guaranteed to work: if someone defines one of those macros elsewhere, it won't get overwritten by the preprocessor when it parses this file.Tarsia
It's an example of the WET design principle.Trimly
Posted an answer with an example, try compiling it.Tarsia

This allows you to override the macros when you're compiling:

gcc -DMACRONAME=value

The definitions in the header file are used as defaults.
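
As a minimal sketch of that usage (the file names trim_config.h and main.c are invented for illustration; the macro and its default come from the question):

trim_config.h

#ifndef BEEPTRIM_PITCH_RATE_DEGPS
#define BEEPTRIM_PITCH_RATE_DEGPS   0.2f   /* default, used only if nothing else defined it */
#endif

main.c

#include <stdio.h>
#include "trim_config.h"

int main(void)
{
    printf("%f\n", BEEPTRIM_PITCH_RATE_DEGPS);
    return 0;
}

gcc main.c prints 0.200000, while gcc -DBEEPTRIM_PITCH_RATE_DEGPS=0.5f main.c prints 0.500000, because the command-line definition is already in place when the #ifndef is evaluated.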

Pentheus answered 4/9, 2015 at 12:58 Comment(0)

As I said in the comment, imagine this situation:

foo.h

#define FOO  4

defs.h

#ifndef FOO
#define FOO 6
#endif

#ifndef BAR
#define BAR 4
#endif

bar.c

#include "foo.h"
#include "defs.h"

#include <stdio.h>

int main(void)
{
    printf("%d%d", FOO, BAR);
    return 0;
}

Will print 44.

However, if the #ifndef guard were not there, the result would be a compilation warning about the macro redefinition, and the program would print 64.

$ gcc -o bar bar.c
In file included from bar.c:2:0:
defs.h:1:0: warning: "FOO" redefined [enabled by default]
 #define FOO 6
 ^
In file included from bar.c:1:0:
foo.h:1:0: note: this is the location of the previous definition
 #define FOO 4
 ^
Tarsia answered 4/9, 2015 at 13:2 Comment(14)
This is compiler-specific. Redefining an object-like macro is illegal unless the redefinition is "the same" (there's a more technical specification for that, but it's not important here). Illegal code requires a diagnostic and, having issued a diagnostic (here a warning), the compiler is free to do anything, including compile the code with implementation-specific results.Mcripley
If you have conflicting defs for the same macro, wouldn't you rather get the warning in most cases? Rather than silently using the first definition (because the 2nd one uses an ifdef to avoid redefining).Flatulent
@PeterCordes Most of the time, definitions under #ifndefs are used as "fallback" or "default" values. Basically, "if the user configured it, fine. If not, let's use a default value."Defeatism
@Angew: Ok, so if you have some #defines in a library header that are part of the library's ABI, you should not wrap them in #ifndef. (Or better, use an enum). I just wanted to make it clear that #ifndef is only appropriate when having a custom definition for something in one compilation unit but not another is ok. If a.c includes headers in a different order than b.c, they might get different definitions of max(a,b), and one of those definitions might break with max(i++, x), but the other might use temporaries in a GNU statement-expression. Still confusing at least!Flatulent
@PeterCordes What I like to do in that case is #ifdef FOO #error FOO already defined! #endif #define FOO x Kawasaki
@Pete it seems you are getting too clever. I am certain I read that C preprocessing has nothing to do with the compiler and is done before the compiler gets its hands on your code. In fact, the C preprocessor actually turns your source, headers and macro calls into what is called a translation unit. This output is then passed to the compiler. So I am uncertain what undefined behaviour you are talking about. I am really really tired of people using that term when I try to utilize the darn features.Adoree
@user13947194: Stack Overflow notified me instead of PeteBecker :/ I'd have to double check, but the ISO C standard does in fact specify the behaviour of the preprocessor, too. The fact that it's (at least logically) a separate pass over the source text doesn't preclude the possibility of compile-time (or preprocess-time) undefined behaviour. That just means the standard doesn't specify anything in particular that a preprocessor / compiler should do for a program that contains UB. Compile-time UB is a thing. #define foo 1 / #define foo 2 isn't a "feature", don't do it.Flatulent
@Pete Cordes firstly let me say I intentionally used just Pete to see what Stack Overflow would do. Now on the matter: I really would like to see that passage in the standard. Then I will ram my head into my monitor and break my keyboard in two, as that makes absolutely no sense. C macros are supposed to be simple, primitive caveman technology; what you see is what you get. I don't see what crime they could be preventing. But luckily, I never saw a compiler that adheres to their standards. Type punning is not allowed in C++ but every compiler I know supports it. The Win32 API is the definition of type punning.Adoree
@user13947194: type punning is allowed in ISO C++, using memcpy or C++20 std::bit_cast. All the mainstream C++ compilers support unions for type punning like in C99, but IIRC Sun's CC doesn't. GCC and clang explicitly don't support pointer-cast type punning, unless you compile with -fno-strict-aliasing. It often happens to work anyway in simple cases, but you can construct cases where it doesn't. (Especially with some older versions of GCC, which didn't try to do what the programmer probably meant.)Flatulent
(Re: GCC breaking code that violates strict-aliasing, see gcc, strict-aliasing, and horror stories). If you want a language that works the way you wish it did, C++ is not the right choice for you. Maybe try Rust; it's similarly low-level but sucks less and (mostly?) doesn't have UB. Do note that C or C++ implementations are allowed to define any behaviour they want which the ISO standard leaves undefined. For example, gcc -fno-strict-aliasing defines the behaviour of *(int32_t*) &flt. It may de-facto define what happens with conflicting #defines.Flatulent
@user13947194: The relevant paragraph in the current ISO C++ standard is eel.is/c++draft/cpp#replace.general-2 - it's not "undefined behaviour", instead "the program is ill-formed", if you re-#define a macro and the two definitions don't match exactly. That's what Pete meant by "Illegal code". Unlike UB, the compiler isn't allowed to do random nonsense, it's required to print a diagnostic. (And then it can do whatever it wants, even handling such a construct as an extension, e.g. taking the first or second definition). eel.is/c++draft/intro#compliance.general-8 Flatulent
@Pete Cordes I thank you for your hard research and the suggestion to try Rust. C++ is my first language and I have yet to witness this undefined behaviour from my compilers. As I said, big Microsoft, a compiler vendor, gives us an API that simply requires type punning. Same for Linux sockets. Then I should note, Timur Doumler says C++ does not support type punning, and C++20 bit_cast simply gives the same result as type punning. bit_cast is after all a function for copying bytes.Adoree
@user13947194: MSVC does define the behaviour of pointer-casting type punning, as an extension to ISO C++. It's also more forgiving of other kinds of unsafe code than GCC or clang. As I said, specific implementations are allowed to define behaviour that the standard leaves undefined. (Terminology note: "type punning" is the process of reinterpreting the bits of an object as another type. Pointer-casting is one way to implement it, one which ISO C++ leaves undefined. A safe portable implementation of bit_cast would use memcpy; only on MSVC could you define it as return (foo_t &)x.)Flatulent
@PeterCordes noted. I have had enough of C++ now and am going to try out Rust. I just downloaded MinGW Code::Blocks 20 using C++17, and code that used to compile now errors. The usual procedure of casting an HMENU to an integer ID is causing the error, saying that it loses precision. You are definitely right. However, I can still force the compiler to accept it by using the -fpermissive flag.Adoree

I do not know the context, but this can be used to give the user the ability to override the values set by those macro definitions. If the user explicitly defines a different value for any of those macros, it will be used instead of the value given here.

For instance in g++ you can use the -D flag during compilation to pass a value to a macro.
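
For instance (the source file name trim.cpp is invented here; the macro name comes from the question):

g++ -DFORCETRIMRELEASE_HOLD_TIME_MS=500.0f trim.cpp -o trim

Because the -D definition is processed before any header is included, the header's #ifndef sees the macro as already defined and the 500.0f from the command line is kept.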

Raconteur answered 4/9, 2015 at 12:57 Comment(0)

This is done so that the user of the header file can override the definitions from his/her own code or with the compiler's -D flag.
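
A rough sketch of the "from code" variant (the file names user_code.c and trim_defaults.h are invented; the macro is from the question): the user defines the macro before including the header, and the header's #ifndef then leaves that value in place.

user_code.c

#define BEEPTRIM_PITCH_RATE_DEGPS 0.4f   /* user's override, written before the include */
#include "trim_defaults.h"               /* its #ifndef sees the macro and keeps 0.4f */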

Anabatic answered 4/9, 2015 at 12:58 Comment(0)

Any C project consists of multiple source files. When working on a single source file, the checks seem to (and actually do) have no point, but when working on a large C project, it's good practice to check for existing defines before defining a constant. The idea is simple: you need the constant in that specific source file, but it may already have been defined in another.
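
As a sketch of that situation (the file and macro names are invented for illustration), two headers that both need the same constant can each guard their definition, so a source file can include them in any order without redefinition warnings:

units.h

#ifndef PI_F
#define PI_F 3.14159f
#endif

geometry.h

#ifndef PI_F
#define PI_F 3.14159f
#endif

main.c

#include "units.h"
#include "geometry.h"   /* the second definition is skipped because PI_F already exists */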

Petula answered 4/9, 2015 at 12:59 Comment(0)

Think of a framework/library that gives the user a default preset that allows them to compile and work with it. Those defines are spread across different files, and the final user is advised to include their own config.h file where they can configure these values. If the user forgot some define, the system can continue to work because of the preset.
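
A minimal sketch of that layout (the name framework_defaults.h is invented; config.h is the user-editable file the answer mentions):

framework_defaults.h

#include "config.h"                                   /* user's optional overrides; may be left empty */

#ifndef TRIMSYSTEM_SHEARPIN_BREAKINGFORCE_LBS
#define TRIMSYSTEM_SHEARPIN_BREAKINGFORCE_LBS 50.0f   /* preset used when config.h does not set it */
#endif

If config.h defines TRIMSYSTEM_SHEARPIN_BREAKINGFORCE_LBS, that value wins; if the user forgot it, the preset keeps the build working.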

For answered 4/9, 2015 at 13:8 Comment(0)

Using

#ifndef BEEPTRIM_PITCH_RATE_DEGPS
#define BEEPTRIM_PITCH_RATE_DEGPS                   0.2f
#endif

allows the user to define the value of the macro with a command-line argument (in gcc/clang/VS): -DBEEPTRIM_PITCH_RATE_DEGPS=0.3f.

There is another important reason: it is not valid to redefine a preprocessor macro differently. See this answer to another SO question. Without the #ifndef check, the compiler has to issue a diagnostic (gcc, for example, warns about the redefinition) if -DBEEPTRIM_PITCH_RATE_DEGPS=0.3f is passed on the command line of the compiler invocation.
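
A quick way to see the difference (the file name unguarded.c is invented):

unguarded.c

#define BEEPTRIM_PITCH_RATE_DEGPS 0.2f   /* no #ifndef guard around the definition */

int main(void) { return 0; }

Compiling with gcc -DBEEPTRIM_PITCH_RATE_DEGPS=0.3f unguarded.c triggers a redefinition warning just like the FOO warning shown in the answer above, whereas with the #ifndef guard in place the 0.3f from the command line is used silently.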

Burnout answered 9/9, 2015 at 0:34 Comment(0)
