Why should I always enable compiler warnings?
C

21

355

I often hear that when compiling C and C++ programs I should "always enable compiler warnings". Why is this necessary? How do I do that?

Sometimes I also hear that I should "treat warnings as errors". Should I? How do I do that?

Corvus answered 8/9, 2019 at 14:20 Comment(1)
All the answers assume good faith. For what can be done in bad faith, have a look at the Underhanded C contest. underhanded-c.orgLoraineloralee
392

Why should I enable warnings?

C and C++ compilers are notoriously bad at reporting some common programmer mistakes by default, such as:

  • forgetting to initialise a variable
  • forgetting to return a value from a function
  • arguments to the printf and scanf families of functions not matching the format string
  • using a function without declaring it beforehand (C only)

These can be detected and reported, just usually not by default; this feature must be explicitly requested via compiler options.
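As an illustration (a contrived sketch only; the file name, variable names, and function names are all made up), the following snippet commits every mistake on that list. Exactly which ones are reported without extra options varies by compiler and version, but something like gcc -Wall -Wextra warn_demo.c will typically flag most or all of them (the uninitialised variable may additionally need optimisation turned on, as discussed further down), and newer compilers may even reject the implicit declaration outright:

/* warn_demo.c -- deliberately buggy code, for demonstration only */
#include <stdio.h>

int scale(int x)
{
    if (x > 0)
        return 2 * x;
}                                    /* mistake: no return value when x <= 0 */

int main(void)
{
    int count;                       /* mistake: never initialised */

    printf("count = %ld\n", count);  /* mistake: %ld does not match an int argument */

    return helper(count);            /* mistake: helper() is used before any
                                        declaration is seen (C only; it is
                                        defined below) */
}

int helper(int n)
{
    return scale(n);
}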

How can I enable warnings?

This depends on your compiler.

Microsoft C and C++ compilers understand switches like /W1, /W2, /W3, /W4 and /Wall. Use at least /W3. /W4 and /Wall may emit spurious warnings for system header files, but if your project compiles cleanly with one of these options, go for it. These options are mutually exclusive.

Most other compilers understand options like -Wall, -Wpedantic and -Wextra. -Wall is essential and all the rest are recommended (note that, despite its name, -Wall only enables the most important warnings, not all of them). These options can be used separately or all together.

Your IDE may have a way to enable these from the user interface.

Why should I treat warnings as errors? They are just warnings!

A compiler warning signals a potentially serious problem in your code. The problems listed above are almost always fatal; others may or may not be, but you want compilation to fail even if it turns out to be a false alarm. Investigate each warning, find the root cause, and fix it. In the case of a false alarm, work around it — that is, use a different language feature or construct so that the warning is no longer triggered. If this proves to be very hard, disable that particular warning on a case by case basis.
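If you do reach that last resort, most compilers let you silence one specific diagnostic for one specific region of code instead of globally. The sketch below (function and parameter names are invented; check the exact warning name and the MSVC warning number against your compiler's documentation) shows the GCC/Clang diagnostic pragmas and the MSVC equivalent:

#include <stdio.h>

/* A callback whose signature is fixed by some external API, so the unused
   parameter is a known false alarm rather than a bug. */
#if defined(__GNUC__)                        /* GCC and Clang */
#pragma GCC diagnostic push
#pragma GCC diagnostic ignored "-Wunused-parameter"
#elif defined(_MSC_VER)                      /* MSVC */
#pragma warning(push)
#pragma warning(disable: 4100)               /* C4100: unreferenced formal parameter */
#endif

static void on_event(int event_id, void *user_data)
{
    printf("event %d\n", event_id);          /* user_data deliberately ignored */
}

#if defined(__GNUC__)
#pragma GCC diagnostic pop
#elif defined(_MSC_VER)
#pragma warning(pop)
#endif

int main(void)
{
    on_event(1, 0);
    return 0;
}

Often there is an even simpler workaround, such as a (void)user_data; cast or leaving the parameter unnamed (in C++); prefer those when they exist, and keep any pragma tightly scoped and commented.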

You don't want to just leave warnings as warnings even if all of them are false alarms. It could be OK for very small projects where the total number of warnings emitted is less than 7. Anything more, and it's easy for a new warning to get lost in a flood of old familiar ones. Don't allow that. Just get your whole project to compile cleanly.

Note this applies to program development. If you are releasing your project to the world in source form, then it might be a good idea not to supply -Werror or its equivalent in your released build script. People might try to build your project with a different version of the compiler, or with a different compiler altogether, which may have a different set of warnings enabled. You may want their build to succeed. It is still a good idea to keep the warnings themselves enabled, so that people who see warning messages can send you bug reports or patches.

How can I treat warnings as errors?

This is again done with compiler switches. /WX is for Microsoft; most others use -Werror. In either case, the compilation will fail if any warnings are produced.

Is this enough?

Probably not! As you crank up your optimisation level, the compiler starts looking at the code more and more closely, and this closer scrutiny may reveal more mistakes. Thus, do not be content with the warning switches alone: also use them when compiling with optimisations enabled (-O2 or -O3, or /O2 if using MSVC).
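As a hedged illustration of that point (read_sensor() is an invented stand-in for any function the compiler cannot see into, and behaviour varies between compiler versions), GCC's -Wmaybe-uninitialized depends on data-flow analysis that only runs when optimising, so code like this will often compile silently with -Wall alone but produce a warning with -Wall -O2:

/* maybe_uninit.c -- compile with: gcc -c -Wall -O2 maybe_uninit.c */
int read_sensor(int channel);   /* defined in some other translation unit */

int sample(int channel)
{
    int value;

    if (channel >= 0)
        value = read_sensor(channel);

    return value + 1;   /* 'value' may be used uninitialised when channel < 0;
                           typically reported only when optimisation is on   */
}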

Corvus answered 8/9, 2019 at 14:20 Comment(25)
I have posted this Q&A because I'm sick and tired of telling people to enable warnings. Now I can just point them here (or, if I'm in a particularly evil mood, close their question as a dupe). You are welcome to improve this answer or add your own!Corvus
You can also use clang's -WeverythingEphor
warnings (and errors) also get 'lost' due to limits on how many messages get reported, or because the compiler ignores duplicates of the same warning.Inaccurate
The only modifier I would add is that some warnings may not be helpful for your application. (I've seen warnings that the compiler added 2 bytes of padding between elements in a struct. The application was for prototyping, so a little wasted memory didn't bother us.) Treat all warnings as errors and then only disable a warning if you know why that warning won't help you.Soot
I would maybe add something on the distinction between "programmer errors" i.e., logical errors between what you meant and what you told the compiler via your code (which may otherwise be valid C/C++), and "language errors", i.e. things that aren't a valid use of the language.Strabismus
@KyleA it's possible that the app used char* to access that type somewhere and if it had padding but code wasn't assuming it, it could mean a problem.Sodamide
@Strabismus Language errors are normally reported as errors without any special flags.Corvus
These problems caused by ignoring warning symbols create nasal demonsCommendation
@n.m. yes, I know, I just thought it might be worth adding the distinction because the audience for this question (largely beginners) might not quite understand "...compilers are notoriously bad at reporting some common programmer errors..."Strabismus
@Ephor Using that warning is not recommended by the clang developers because its meaning changes, and may break correct builds.Leesaleese
"C and C++ compilers are notoriously bad at reporting some common programmer mistakes, such as:" Actually, they're notoriously good at it. That's what warnings are. Did you mean "The C and C++ languages notoriously don't make these mistakes hard errors"?Fleury
@LightnessRacesinOrbit There used to be "by default" somewhere in that sentence, but it was lost in editing. Adding it back.Corvus
Still seems like a strange way to word it but ok :)Fleury
I would add that you should also make an effort to eliminate as many warnings as possible, even the "harmless" ones, and here's why. Let's say you have a few dozen warnings that you know are minor things that don't matter. You're used to ignoring them, they happen every time. They start to pile up but you don't care because you're used to seeing that many "harmless" warnings every time. Now another, actually serious warning slips in. Would you be able to find it immediately? Or would it disappear amidst all the noise you've trained yourself to ignore all this time?Carol
The downside of treating warnings as errors for people following your default build instructions is that your code rots as compilers add new warnings. Users who download your code and try to build it in the future may be unable to, because their compiler is too new and issues a warning about some extra parentheses or something that your compiler didn't care about. The user who encounters the error isn't responsible for your code or your build system, and has no idea how to turn off treating warnings as errors and actually build your project.Misdirection
I'd add a note that -Werror and cousins make a lot of sense to do in CI, but not so much during daily development; For example, it's reasonable to warn about unused parameters in general but I don't need that to stop compilation when I'm adding a bunch of new functions and just sketching out the general structure.Immobility
@n.m. the fact that there are still so many disagreeing comments under the original question and your answer... shows that your question is well justified! some people still don't get it in the year 2019.Xerophagy
@Misdirection Yes, this happened to me a couple of times. No biggie. If you chose to build an unmaintained piece of software using a compiler it was never tested with, well, you better be prepared to do some maintenance yourself.Corvus
@n.m. Needing no maintenance is the goal state of code; if the requirements haven't changed, the code shouldn't need to. You don't want your environment changing under you and turning working code into broken code. That's why we have things like C99 and POSIX and package-lock.json. Asking the compiler to turn a standardized or versioned set of warnings into errors is a best practice, but telling it to fail the build for your users for an open-ended set of reasons is asking for trouble.Misdirection
@Misdirection A software release is inherently the product of not just the software itself, but its build time and run time environment as well. Someone trying to build the same code with a different compiler is modifying the project, and inherits whatever headaches they cause themselves. It's not a reason to stop ourselves from using good protections in the original build. (And if the new builder really believes the warnings aren't a big deal, they can change that build setting themselves.)Antisocial
@Misdirection In fact there is a lot of truth in what you are saying, it's just not so much relevant to the target audience. I will add a paragraph on it.Corvus
For MSVC, it's usually good to compile with -Wall every so often, but I personally wouldn't recommend combining it with -WX. It enables some warnings that can catch subtle little bugs like "Hey, this class has a virtual function, but you forgot to make the dtor virtual!", but also infomessage "warnings" like "Hey, I optimised this out since you never called it!" or "Hey, I didn't inline this!"Antidisestablishmentarianism
I think it's a good idea to mention selective treating of warnings as errors: e.g. -Werror=return-type -Werror=implicit-function-declaration. This may be good even for released versions of your code.Snooperscope
Because -Weverything -Werror gets you the Hard Mode achievement.Snake
I got thrown off by clang's -Weverything, so I asked a question, did some research, then answered my own question, writing about it here: https://mcmap.net/q/93965/-what-does-the-clang-compiler-39-s-weverything-option-include-and-where-is-it-documented. My final recommendation is -Wall -Wextra -Werror, as even clang recommends that we do not use -Weverything in general.Taxable
104

C is, famously, a rather low-level language as HLLs go. C++, though it might seem to be a considerably higher-level language than C, still shares a number of its traits. And one of those traits is that the languages were designed by programmers, for programmers — and, specifically, programmers who knew what they were doing.

(For the rest of this answer I'm going to focus on C. Most of what I'll say also applies to C++, though perhaps not as strongly. Although as Bjarne Stroustrup has famously said, "C makes it easy to shoot yourself in the foot; C++ makes it harder, but when you do it blows your whole leg off.".)

If you know what you are doing — really know what you are doing — sometimes you may have to "break the rules". But most of the time, most of us will agree that well-intentioned rules keep us all out of trouble, and that wantonly breaking those rules all the time is a bad idea.

But in C and C++, there are surprisingly large numbers of things you can do that are "bad ideas", but which aren't formally "against the rules". Sometimes they're a bad idea some of the time (but might be defensible other times); sometimes they're a bad idea virtually all of the time. But the tradition has always been not to warn about these things — because, again, the assumption is that programmers know what they are doing, they wouldn't be doing these things without a good reason, and they'd be annoyed by a bunch of unnecessary warnings.

But of course not all programmers really know what they're doing. And, in particular, every C programmer (no matter how experienced) goes through a phase of being a beginning C programmer. And even experienced C programmers can get careless and make mistakes.

Finally, experience has shown not only that programmers do make mistakes, but that these mistakes can have real, serious consequences. If you make a mistake, and the compiler doesn't warn you about it, and somehow the program doesn't immediately crash or do something obviously wrong because of it, the mistake can lurk there, hidden, sometimes for years, until it causes a really big problem.

So it turns out that, most of the time, warnings are a good idea, after all. Even the experienced programmers have learned that (actually, it's "especially the experienced programmers have learned that"), on balance, the warnings tend to do more good than harm. For every time you did something wrong deliberately and the warning was a nuisance, there are probably at least ten times you did something wrong by accident and the warning saved you from further trouble. And most warnings can be disabled or worked around for those few times when you really want to do the "wrong" thing.

(A classic example of such a "mistake" is the test if(a = b). Most of the time, this is truly a mistake, so most compilers these days warn about it — some even by default. But if you really want to both assign b to a and test the result, you can disable the warning by typing if((a = b)).)
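A minimal sketch of both forms, with arbitrary variable names; with warnings enabled, GCC and Clang typically suggest parentheses around the first one and accept the second one silently:

#include <stdio.h>

int main(void)
{
    int a = 0, b = 7;

    if (a = b)          /* probably meant ==; -Wparentheses (part of -Wall) flags this */
        puts("first");

    if ((a = b))        /* the extra parentheses say: yes, the assignment is intentional */
        puts("second");

    return 0;
}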

The second question is, why would you want to ask the compiler to treat warnings as errors? I'd say it's because of human nature, specifically, the all-too-easy reaction of saying "Oh, that's just a warning, that's not so important, I'll clean that up later." But if you're a procrastinator (and I don't know about you, but I'm a world-class procrastinator) it's easy to put off the necessary cleanup for basically ever — and if you get into the habit of ignoring warnings, it gets easier and easier to miss an important warning message that's sitting there, unnoticed, in the midst of all the ones you're relentlessly ignoring.

So asking the compiler to treat warnings as errors is a little trick you can play on yourself to get around this human foible, to force yourself to fix the warnings today, because otherwise your program won't compile.

Personally, I'm not as insistent about treating warnings as errors — in fact, if I'm honest, I can say that I don't tend to enable that option in my "personal" programming. But you can be sure I've got that option enabled at work, where our style guide (which I wrote) mandates its use. And I would say — I suspect most professional programmers would say — that any shop that doesn't treat warnings as errors in C is behaving irresponsibly, is not adhering to commonly-accepted industry best practices.

Vicegerent answered 8/9, 2019 at 14:54 Comment(11)
"programmers who knew what they were doing" - LOL; there's a "no true Scotsman" fallacy if ever I saw one :)Splenitis
@Splenitis LOL back atcha. I'm never quite sure I understand the No true Scotsman fallacy, but I like it, so this'll be a good exercise for me. I guess the application here is like this: "No C programmer would ever write if(a = b), therefore we don't need to warn about it." (Then someone produces a list of 10 critical bugs in 10 released products that result from this particular error.) "Okay, no experienced C programmer would ever write that..."Vicegerent
@SteveSummit but a really experienced C programmer may write if (returnCodeFromFoo = foo(bar)) and mean it, to capture and test the code in one place (Assume the only purpose of foo is to have side effects!) The fact that a really really experienced programmer may know this is not a good coding style is beside the point ;)Homograph
The thing is, most very experienced programmers enable most, if not all, warnings. If they do want to use something like if (returnCodeFromFoo = foo(bar)), then they put a comment in and turn off the warning (so that when the maintenance programmer looks at it 4 years later, he/she will realize that the code is intentional). That said, I worked with someone (in Microsoft C++ land) who insisted that combining /Wall with treating warnings as errors was the way to go. Uh, it isn't (unless you want to put in a lot of suppression comments).Chemise
As someone who does not write code on a daily basis, but when I do it tends to be bare metal (often on a board of my own design) I find warnings invaluable. When stuffing values into internal registers (for a DMA descriptor location is an example) the warning about conversion to pointer means I do a cast to clear the warning. It would not be an error, but if someone else (or even myself!) picks up that code in a few months it could well be confusing. Besides, I apply the no warnings present rule to the outputs of my CAD tools as well.Rambling
Just as a simple example of compilable C nonsense, last week I had a loop that wasn't terminating, and found it was due to a typo that works out to about a 2 pixel difference: for (int i=0;i<20,i++) See it? That's a comma, not a semicolon. Comma is an "operator" that returns the value of the right-hand expression, and ++ won't return a 0 for a very long time. Perfectly legit compilable code.Chromophore
The warnings in modern compilers were preceded by lint. It was imperative to run the tool daily or even more often as those older compilers had very few warnings available (particularly on various Unix boxen). As Henry Spencer stated in the 10 commandments for C programmers (annotated edition) "De-linting a program which has never been linted before is often a cleaning of the stables such as thou wouldst not wish on thy worst enemies."Rambling
@PeterSmith I was thinking of mentioning lint. Philosophical, separation-of-functionality issues aside, there was of course also a very practical motivation for having what amounted to a seven-pass compiler (cpp, c0, c1, c2, as, ld, lint): the 64k address space of the PDP-11. Eventually the gcc folks declared that having lint as a separate program was Wrong, that they intended to build full, lint-like capability into the compiler itself -- and they mostly did, except you have to ask for most of those warnings, they're not on by default. Thus this question.Vicegerent
Whereas after a lifetime of practice I am a really good procrastinator :)Woodall
@Chromophore Eye-opening! Although I'd expect that loop to terminate immediately before running the body even once, because it's a postfix-increment operator returning 0 for the first time. I also think it is ill-formed due to missing the second semicolon, so a conforming implementation is supposed to emit a diagnostic.Asshur
If you are outsmarting the compiler, you can typically hint it to silence the warnings . Said hint should be extremely well documented, to help any future maintainer who needs to be as smart as you to maintain your code. In my opinion a build should not present errors or warnings.Loraineloralee
42

Warnings consist of the best advice some of the most skilled C++ developers could bake into an application. They're worth keeping around.

C++, being a Turing complete language, has plenty of cases where the compiler must simply trust that you know what you are doing. However, there are many cases where the compiler can realize that you probably did not intend to write what you wrote. A classic example is printf() format codes which don't match the arguments, or std::strings passed to printf (not that that ever happens to me!). In these cases, the code you wrote is not an error. It is a valid C++ expression with a valid interpretation for the compiler to act on. But the compiler has a strong hunch that you simply overlooked something which is easy for a modern compiler to detect. These are warnings. They are things that are obvious to a compiler, using all the strict rules of C++ at its disposal, that you might have overlooked.

Turning warnings off, or ignoring them, is like choosing to ignore free advice from those more skilled than you. It’s a lesson in hubris that ends either when you fly too close to the sun and your wings melt, or a memory corruption error occurs. Between the two, I'll take falling from the sky any day!

"Treat warnings as errors" is the extreme version of this philosophy. The idea here is that you resolve every warning the compiler gives you -- you listen to every bit of free advice and act on it. Whether this is a good model for development for you depends on the team and what kind of product you are working on. It's the ascetic approach that a monk might have. For some, it works great. For others, it does not.

On many of my applications we do not treat warnings as errors. We do this because these particular applications need to compile on several platforms with several compilers of varying ages. Sometimes we find it is actually impossible to fix a warning on one side without it turning into a warning on another platform. So we are merely careful. We respect warnings, but we don't bend over backwards for them.

Manizales answered 9/9, 2019 at 5:27 Comment(19)
What has C++ being Turing complete have to do with that. A lot of languages are turing complete and do not trust you if you do something wrong....Youngster
@KamiKaze every language will have idiomatic mistakes (e.g. Java can't stop you writing an inconsistent equals / hashCode), and it's a quality of implementation issue which of those are reported.Brannen
@KamiKaze The Turing completeness bit comes in to show that there are cases where the compiler cannot prove that your code will fail to work as planned. This is important because compilers cannot make all "wrong" code an error. Errors can only be reserved for behaviors that the language designers are certain will always be "wrong." (typically because it leads down paths that are inconsistent).Manizales
Which also points to the challenge with "all warnings are errors." Warnings are, by design, more opportunistic, triggering on some potentially correct code in exchange for triggering on wrong code more often. Warnings as errors leads to you not being able to exercise the full language's capabilities.Manizales
If you want to disprove that "all warnings are errors", just compile something like constexpr unsigned v = std::integral_constant<unsigned, -1 + 1u>::value; (or more usefully, even just constexpr auto AllSet = -1u;) on MSVC, with -W2.Antidisestablishmentarianism
@CortAmmon 'Errors can only be reserved for behaviors that the language designers are certain will always be "wrong".' Not true. That is the design philosophy C and C++ have chosen, but it is perfectly possible to have a language where things are an error unless the compiler can prove the code is safe (or an explicit escape hatch is used). See Rust for example.Byrnes
@MartinBonner That's an interesting approach I've not looked into before. Does that mean that some Rust code may compile on "smarter" compilers and not compile on others?Manizales
@CortAmmon No. The language standard defines the rules for what counts as "safe".Byrnes
@MartinBonner Ahh. I see what you're getting at. Yes, in other languages, you may wish to use a different word which is closer to capturing that specific language's mindset. The more important part is that it is semantic information about the code being compiled, and thus can invoke Rice's theorem to show that you cannot completely capture this entire class of undesirable programs and refuse to compile them. At best you can capture a subset (such as "safe"). The remainder must be unhandled or handled heuristically (such as warnings).Manizales
@CortAmmon I am a programmer, not a computer scientist, but can't you reject the entire class of unsafe programs provided you are prepared to reject some safe programs?Byrnes
@MartinBonner Yes, if we are careful. We have to reject something syntactic rather than semantic. The dividing line is very precise, but informally we have to be able to reject something based on the text of the code rather than rejecting it based on what the code does when executing. As long as you stick to the text of the code ("reject code if it contains a function call with a recursive path") rather than reject it on what it does ("reject code which causes a stack overflow"), you can do it with compiler errors.Manizales
However, in general, programmers want a language which does more than a "safe" thing. We want a language which does the thing we thought we told it to do. Thus, warnings remain important because the actual class of things we want the computer to do is a semantic class. The compiler can pick away at it by defining "incorrect" or "unsafe," but in the end you still have a superclass of the behaviors the programmer wanted the program to do. Warnings help narrow that superclass down.Manizales
@CortAmmon I think the main problem here is that you give the impression that the warnings a C++ compiler spits out is typical for any Turing complete language, which is certainly not the case. The real reason why C and C++ emits so many warnings is because the standard says that so many things are simply undefined behavior instead of either requiring them to be an error or specify what the result should be. And this is a philosophy that plenty of other languages does not share. And the biggest - if not the only - reason C++ does this is its history of being backwards compatible with C.Bernicebernie
@CortAmmon Another thing is that you give the impression that "Turing completeness" is something that some languages has and some has not. While this is technically true, it paints a false picture. Because it's very uncommon for a language not being Turing complete. And those few languages that are not are usually VERY limited and domain specific. Examples that comes to mind is regular expressions, SQL and HTML. I cannot think of any multi purpose language that is not Turing complete.Bernicebernie
@Bernicebernie I don't think it is just C and C++ that emit warnings this way. If it were, then linters would not have been developed for so many non-C languages. Even famously safe languages like Rust have linters written to add warnings to the language.Manizales
@Bernicebernie Actually there are a lot of languages which are not Turing by complete, way more than just the 3 you mention. That being said, I agree that the general definition of "multi purpose language" is Turing complete, and that is actually why I bring up Turing completeness as the bar. It shows just how far you must go in order to avoid this annoying corner of languages. It isn't just a little thing to be fixed by going from C to Java or Go or Ruby. It's a fundamental mathematical aspect of the wide reaching class of languages we want to use.Manizales
@CortAmmon I'm just saying that C and C++ are pretty extreme when it comes to this. At least with modern standards. Also, I'm just saying that the way you have phrased this answer you really give the impression that it's fairly common with multi purpose languages that are not Turing complete. And there are Turing complete languages that emits maybe 1% of the amounts of warnings a C++ compiler would do. To a VERY large degree, the warnings is about UB.Bernicebernie
Perhaps it is means that the c++ templates are Turing complete, and not as such what the compiled c++ programs can do? https://mcmap.net/q/24093/-c-templates-turing-complete If so you can only see what a given program will compile to by actually compiling it. Static analysis cannot be exhaustive (Halting problem).Loraineloralee
@ThorbjørnRavnAndersen I think the runtime turing completeness is sufficient for the topic of warnings. My go to case for this would be printf("%d", someString) which many compilers give a warning for, while printf(makePercentD(), someString) may be too difficult for static analysis to demonstrate that makePercentD always returns "%d" Turing completness of templates is just a devil in that you can't prove a file is even compilable, much less compile to something that does what you want it to do.Manizales
22

Not only does handling the warnings make better code, it makes you a better programmer. Warnings will tell you about things that may seem little to you today, but one day that bad habit will come back and bite your head off.

Use the correct type, return that value, evaluate that return value. Take time and reflect "Is this really the correct type in this context?" "Do I need to return this?" And the biggie; "Is this code going to be portable for the next 10 years?"

Get into the habit of writing warning-free code in the first place.

Gull answered 9/9, 2019 at 7:29 Comment(0)
21

Non-fixed warnings will, sooner or later, lead to errors in your code.


Debugging a segmentation fault, for instance, requires the programmer to trace the root cause of the fault, which is usually located earlier in the code than the line that eventually triggered the segmentation fault.

It's very typical that the root cause is a line for which the compiler had issued a warning that you ignored, while the line that caused the segmentation fault is merely where the error eventually surfaced.

Fixing the warning leads to fixing the problem... A classic!

A demonstration of the above... Consider the following code:

#include <stdio.h>

int main(void) {
  char* str = "Hello, World!!";
  int idx;

  // Colossal amount of code here, irrelevant to 'idx'

  printf("%c\n", str[idx]);

  return 0;
}

which, when compiled with the "-Wextra" flag passed to GCC, gives:

main.c: In function 'main':
main.c:9:21: warning: 'idx' is used uninitialized in this function [-Wuninitialized]
    9 |   printf("%c\n", str[idx]);
      |                     ^

which I could ignore and execute the code anyway... And then I would witness a "grand" segmentation fault, as my IP Epicurus professor used to say:

Segmentation fault

In order to debug this in a real world scenario, one would start from the line that causes the segmentation fault and attempt to trace the root of the cause... They would have to search for what has happened to idx and str inside that colossal amount of code over there...

Until, one day, they find themselves in the situation where they discover that idx is used uninitialized, thus it has a garbage value, which results in indexing the string (way) out of its bounds, which leads to a segmentation fault.

If only they hadn't ignored the warning, they would have found the bug immediately!

Grillroom answered 9/9, 2019 at 20:20 Comment(7)
To your title: not necessarily. For instance, a warning suggesting to use parentheses in a formula that really does not need them points to a non-problem that will never ever cause an error. Operator precedences in a given programming language do not change. Ever.Sterile
@MarcvanLeeuwen The instance you quote can turn into error tho, for example if the programmer who doesn't remember the operator precedence correctly modifies the formula a little. The warning tells you: "it might be unclear to someone at some point, add some parentheses to make it more clear". Although one must agree that the title of the original post is not always true.Steven
^ Anything can be turned into an error. It's just as easy to introduce a bug into partially parenthesized code as into fully parenthesized code.Succoth
...As an aside, I have questions for the debugger whose first reaction to "Oh, it segfaulted from str[idx]" isn't "Okay, where are str and idx defined?`Antidisestablishmentarianism
Except if you know 200% what you are doing, non-fixed warnings will, sooner or later, lead to errors in your code!Grillroom
You're lucky if you get a segmentation fault. If you are less lucky, you might by chance have idx happen to be the value you expected on your test (not too unlikely if the expected value is 0), and actually happen to point to some sensitive data that should never be printed when deployed.Nanceynanchang
What "IP" in "IP Epicurus professor"? The closest is intellectual property, but that does not fit the context. "P" for "philosophy"? "P" for "program" or "programming"? "Internet programming"? "IoT programming"? Do you mean PI (principal investigator)? Or something else?Birdiebirdlike
20

The other answers are excellent and I don't want to repeat what they have said.

One other aspect to "why enable warnings" that hasn't properly been touched on is that they help enormously with code maintenance. When you write a program of significant size, it becomes impossible to keep the whole thing in your head at once. You typically have a function or three that you're actively writing and thinking about, and perhaps a file or three on your screen that you can refer to, but the bulk of the program exists in the background somewhere and you have to trust that it keeps working.

Having warnings on, and having them as energetic and in your face as possible, helps to alert you if something you change makes trouble for something that you can't see.

Take for example, the Clang warning -Wswitch-enum. That triggers a warning if you use a switch on an enum and miss out one of the possible enum values. It's something you might think would be an unlikely mistake to make: you probably at least looked at the list of enum values when you wrote the switch statement. You might even have an IDE that generated the switch options for you, leaving no room for human error.

This warning really comes into its own when, six months later you add another possible entry to the enum. Again, if you're thinking about the code in question you'll probably be fine. But if this enum is used for multiple different purposes and it's for one of those that you need the extra option, it's very easy to forget to update a switch in a file you haven't touched for six months.
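A contrived sketch of exactly that situation (the enum and function are invented for illustration); once the new enumerator is added, Clang's -Wswitch-enum, or the -Wswitch warning that -Wall enables in both GCC and Clang, points at every switch that was not updated:

/* CHERRY was added six months after the switch below was written */
enum fruit { APPLE, BANANA, CHERRY };

const char *fruit_name(enum fruit f)
{
    switch (f) {        /* warning (roughly): enumeration value 'CHERRY' not handled in switch */
    case APPLE:  return "apple";
    case BANANA: return "banana";
    }
    return "unknown";
}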

You can think of warnings in the same way as you'd think of automated test cases: they help you make sure that the code is sensible and doing what you need when you first write it, but they help even more to make sure that it keeps doing what you need while you prod at it. The difference is that test cases work very narrowly to the requirements of your code and you have to write them, while warnings work broadly to sensible standards for almost all code, and they're very generously supplied by the boffins who make the compilers.

Pupillary answered 9/9, 2019 at 21:52 Comment(2)
The other way they help with maintenance is when you are looking at someone else's code and can't tell whether a side-effect was intentional. With warnings on, you know they were at least aware of the issue.Elaina
Or in my case, you import a file from an embedded system that contains a 3000+ line switch statement over an enum with several thousand values. The "falls through" warnings (avoided by using goto) masked a number of "not handled" bugs... the embedded compiler did not emit either of those, but the bugs were important nonetheless.Woodall
16

Treating warnings as errors is just a means of self-discipline: you were compiling a program to test that shiny new feature, but you can't until you fix the sloppy parts. There is no additional information -Werror provides. It just sets priorities very clearly:

Don't add new code until you fix problems in the existing code

It's really the mindset that's important, not the tools. Compiler diagnostic output is a tool. MISRA C (for embedded C) is another tool. It doesn't matter which one you use, but arguably compiler warnings are the easiest tool you can get (it's just one flag to set) and the signal-to-noise ratio is very high. So there's no reason not to use them.

No tool is infallible. If you write const float pi = 3.14;, most tools won't tell you that you defined π with a bad precision which may lead to problems down the road. Most tools won't raise an eyebrow on if(tmp < 42), even if it's commonly known that giving variables meaningless names and using magic numbers is a way to disaster in big projects. You have to understand that any "quick test" code you write is just that: a test, and you have to get it right before you move on to other tasks, while you still see its shortcomings. If you leave that code as is, debugging it after you spend two months adding new features will be significantly harder.

Once you get into the right mindset, there is no point in using -Werror. Having warnings as warnings will allow you to take an informed decision whether it still makes sense to run that debug session you were about to start, or to abort it and fix the warnings first.

Cinchonine answered 9/9, 2019 at 8:30 Comment(3)
For better or for worse, the clippy linting tool for Rust will actually warn about the constant "3.14". It's actually an example in the docs. But as you might guess from the name, clippy takes pride in being aggressively helpful.Amperage
@Amperage Thanks for this example, perhaps I should rephrase my answer in a never say "never" kind of way. I didn't mean to say that checking for imprecise π values is impossible, just that merely getting rid of warnings doesn't guarantee decent code quality.Cinchonine
One thing warnings as errors gives you is that automated builds will fail, thus alerting you that something has gone wrong. Automated builds also allow for automating linting (explosion is 3...2...1.. :)Woodall
10

As someone who works with legacy embedded C code, enabling compiler warnings has helped show a lot of weaknesses and areas to investigate when proposing fixes. In GCC, using -Wall and -Wextra and even -Wshadow has become vital. I'm not going to go over every single hazard, but I'll list a few that have popped up and helped show code issues.

Variables being left behind

This one can easily point to unfinished work and areas that might not be using all of the passed variables which could be an issue. Let's look at a simple function that may trigger this:

int foo(int a, int b)
{
   int c = 0;

   if (a > 0)
   {
        return a;
   }
   return 0;
}

Just compiling this without -Wall or -Wextra returns no issues. -Wall will tell you though that c is never used:

foo.c: In function ‘foo’:

foo.c:9:20: warning: unused variable ‘c’ [-Wunused-variable]

-Wextra will also tell you that your parameter b doesn't do anything:

foo.c: In function ‘foo’:

foo.c:9:20: warning: unused variable ‘c’ [-Wunused-variable]

foo.c:7:20: warning: unused parameter ‘b’ [-Wunused-parameter]
 int foo(int a, int b)

Global Variable shadowing

This one bit hard and did not show up until -Wshadow was used. Let's modify the example above to simply add the two parameters, but there happens to be a global with the same name as a local, which causes a lot of confusion when trying to use both.

int c = 7;

int foo(int a, int b)
{
   int c = a + b;
   return c;
}

When -Wshadow is turned on, it's easy to spot this issue:

foo.c:11:9: warning: declaration of ‘c’ shadows a global declaration [-Wshadow]

foo.c:1:5: note: shadowed declaration is here

Format strings

This doesn't require any extra flags in GCC, but it has still been the source of problems in the past. A simple function that tries to print data but has a formatting error could look like this:

#include <stdio.h>

void foo(const char * str)
{
    printf("str = %d\n", str);
}

This doesn't print the string since the formatting flag is wrong and GCC will happily tell you this is probably not what you wanted:

foo.c: In function ‘foo’:

foo.c:10:12: warning: format ‘%d’ expects argument of type ‘int’, but argument 2 has type ‘const char *’ [-Wformat=]


These are just three of the many things the compiler can double check for you. There are a lot of others like using an uninitialized variable that others have pointed out.

Cumings answered 9/9, 2019 at 22:4 Comment(2)
In the embedded world, the warnings that worry me most are "possible loss of precision" and "comparison between signed and unsigned" warnings . I find it difficult to grasp how many "programmers" ignore these (in fact, I am not really sure why they are not errors)Edelstein
In the latter case, @Mawg, I believe the primary reason it's not an error is that the result of sizeof is unsigned, but the default integer type is signed. The sizeof result type, size_t, is typically used for anything related to type size, such as, e.g., alignment or array/container element count, while integers in general are intended to be used as "int unless otherwise required". Considering just how many people are thus taught to use int to iterate over their containers (comparing int to size_t), making it an error would break roughly everything. ;PAntidisestablishmentarianism
7

This is a specific answer to C, and why this is far more important to C than to anything else.

#include <stdio.h>

int main()
{
   FILE *fp = "some string";
}

This code compiles with a warning. What are and should be errors in just about every other language on the planet (barring assembly language) are warnings in C. Warnings in C are almost always errors in disguise. Warnings should be fixed, not suppressed.

With GCC, we do this as gcc -Wall -Werror.

This was also the reason for the high rantiness about some Microsoft non-secure API warnings. Most people programming C have learned the hard way to treat warnings as errors, and then this stuff appeared that just wasn't the same kind of thing and demanded non-portable fixes.

Saskatchewan answered 9/9, 2019 at 21:34 Comment(0)
7

Compiler warnings are your friend

I work on legacy Fortran 77 systems. The compiler tells me valuable things: argument data type mismatches on a subroutine call, use of a local variable before a value has been set into it, and variables or subroutine arguments that are never used. These are almost always errors.

When my code compiles cleanly, 97% of the time it works. The other guy I work with compiles with all warnings off, spends hours or days in the debugger, and then asks me to help. I just compile his code with the warnings on and tell him what to fix.

Pick answered 4/10, 2019 at 1:7 Comment(0)
6

You should always enable compiler warnings because the compiler can often tell you what's wrong with your code. To do this, you pass -Wall -Wextra to the compiler.

You should usually treat warnings as errors because the warnings usually signify that there's something wrong with your code. However, it's often very easy to ignore these errors. Therefore, treating them as errors will cause the build to fail so you can't ignore the errors. To treat warnings as errors, pass -Werror to the compiler.

Sting answered 9/9, 2019 at 12:10 Comment(0)
6

I once worked for a large (Fortune 50) company that manufactured electronic testing equipment.

The core product of my group was an MFC program that, over the years, came to generate literally hundreds of warnings. Which were ignored in almost all cases.

This is a frigging nightmare when bugs occur.

After that position, I was lucky enough to be hired as the first developer in a new startup.

I encouraged a 'no warning' policy for all builds, with compiler warning levels set to be pretty noisy.

Our practice was to use #pragma warning - push/disable/pop for code that the developer was sure was really fine, along with a log statement at the debug level, just in case.

This practice worked well for us.

Palaeozoic answered 4/10, 2019 at 15:51 Comment(3)
Seconded. #pragma warning doesn't just suppress warnings, it serves the dual purposes of quickly communicating to other programmers that something is intentional and not accidental, and acts as a search tag for quickly locating potentially problematic areas when something breaks but fixing the errors/warnings doesn't fix it.Antidisestablishmentarianism
You're right Justin, that is exactly how I viewed #pragma warningPalaeozoic
Re "Fortune 50": Do you mean Fortune 500?Birdiebirdlike
5

The compiler warnings in C++ are very useful for some reasons.

  1. They show you where you may have made a mistake that can affect the final result of your operations. For example, if you didn't initialize a variable, or if you used "=" instead of "==" (these are just examples).

  2. They also show you where your code does not conform to the C++ standard. This is useful because code that conforms to the current standard will be easier to move to another platform, for example.

In general, warnings are very useful for showing you the mistakes in your code that can affect the result of your algorithm, and for preventing errors the user would otherwise hit when running your program.

Bristling answered 24/9, 2019 at 12:29 Comment(0)
5

A warning is an error waiting to happen. So you must enable compiler warnings and tidy up your code to remove any warning.

Benia answered 2/10, 2019 at 8:24 Comment(0)
5

Ignoring warnings means you have left sloppy code that could not only cause problems for someone else in the future, but will also make important compiler messages harder for you to notice.

The more compiler output there is, the less anyone will notice or bother. The cleaner the better. It also shows that you know what you are doing. Leaving warnings in is unprofessional, careless, and risky.

Amero answered 2/10, 2019 at 17:20 Comment(0)
4

There's only one problem with treating warnings as errors: When you're using code coming from other sources (e.g., Microsoft libraries, open source projects), they didn't do their job right, and compiling their code generates tons of warnings.

I always write my code so it doesn't generate any warnings or errors, and clean it up until it compiles without generating any extraneous noise. The garbage I have to work with appalls me, and I'm astounded when I have to build a big project and watch a stream of warnings go by where the compilation should only be announcing which files it processed.

I also document my code, because I know the real lifetime cost of software comes mostly from maintenance, not from writing it initially, but that's a different story...

Extrusive answered 3/10, 2019 at 7:18 Comment(2)
Don't knock it, there's good money in consulting work for people who can read compiler warnings out loud to clients.Woodall
Code from other sources generating warnings does not necessary mean that the authors were sloppy. It may also mean that they compiled the code with a different compiler that generated a different set of warnings. Code can compile without warnings on one compiler, and generate warnings on another. Or maybe it's just a different set of warning options; e.g. they used -Wall and you use -Wall -Wextra.Nanceynanchang
3

Some warnings may mean a possible semantic error in the code or possible UB. E.g. a stray ; after if(), an unused variable, a global variable masked by a local one, or a comparison of signed and unsigned. Many warnings are related to the static code analyzer in the compiler or to breaches of the ISO standard detectable at compile time, which "require diagnostics". While those occurrences may be legal in one particular case, they are the result of design issues most of the time.
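As a small hedged sketch of the first of those (the function names are made up), the stray semicolon silently changes the meaning of the code, and GCC and Clang can report it with -Wempty-body (enabled by -Wextra in GCC; check your compiler's documentation):

#include <stdio.h>

static void fire_alarm(void) { puts("alarm!"); }

int main(void)
{
    int sensor_ok = 1;

    if (!sensor_ok);     /* stray ';': the if now controls an empty statement
                            (-Wempty-body)                                    */
        fire_alarm();    /* runs unconditionally, despite the indentation     */

    return 0;
}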

Some compilers, e.g., GCC, have a command-line option to activate "warnings as errors" mode. It's a nice, if cruel, tool to educate novice coders.

Sodamide answered 8/9, 2019 at 16:59 Comment(0)
3

The fact that C++ compilers accept code that obviously results in undefined behavior at all is a major flaw in the compilers. The reason they don't fix this is that doing so would probably break some usable builds.

Most of the warnings should be fatal errors that prevent the build from completing. The default of just displaying a message and doing the build anyway is wrong; if you don't override it to treat warnings as errors, and you leave some warnings in, you will likely end up with your program crashing and doing random things.

Coextend answered 4/10, 2019 at 5:35 Comment(3)
Ironically, a lot of undefined behaviour doesn't actually cause warnings, but silently compiles just fine into a nasty little time bomb. ;PAntidisestablishmentarianism
The problem is that if the standard demands an error message, that error message must be issued in all cases where the problem occurs, but never if the problem does not occur. But in cases like undefined behaviour, that may be impossible to decide. For example, consider the following code: int i; if (fun1()) i=2; if (fun2()) i=3; char s="abcde"[i]; This code exhibits undefined behaviour if and only if both fun1() and fun2() can return false on the same function execution. Which may or may not be true, but how is the compiler to tell?Nanceynanchang
While it's not unproblematic with UB, it's still a feature in some sense. It allows the compiler to make optimizations it otherwise could not do. Java code needs to perform bounds checking every time you access an array, resulting in slower code. However, I do agree that MOST of UB should be respecified to force a crash.Bernicebernie
0

All percentages below deviate from reality and are not meant to be taken seriously.

99% of warnings are completely useless for correctness. However, the remaining 1% make your code not work (often in rare cases). Here are some important points the other answers miss.

  1. The warnings come from the compiler developers. There is a 'C' standard and conformance, but the warnings are a sign from the compiler developers about problems in the code you are giving them. I.e., these can be things the compiler writers know lead to inefficient or erroneous constructs. It is like ignoring a plumber who says you cannot put a toilet there and telling them to do it anyway.

  2. The next person who enables warnings will think you are incompetent because you didn't enable warnings. They have no idea that 99% of the code is correct and think that only 50% is.

  3. Another issue often caught by warnings is dead code. Ie, code that can never do anything. This is likely a reason that people hate inheriting code with warnings. 75% of what they are looking at is probably useless.

Warning-free code gives other people confidence that the code is portable and adaptable to tooling, code updates, and general bit rot. Warning-free code gives other developers confidence that the code they are looking at is not crazy spaghetti or subtle baloney. They might also just catch an error or two.

Aliciaalick answered 19/7, 2023 at 22:28 Comment(0)
-2

You should definitely enable compiler warnings as some compilers are bad at reporting some common programming mistakes, including the following:

  • Initialising a variable gets forgotten
  • Returning a value from a function gets missed
  • Arguments in the printf and scanf families not matching the format string
  • A function is used without being declared beforehand, though this happens in C only

These mistakes can be detected and reported, just usually not by default; this feature must be explicitly requested via compiler options.

Magma answered 1/10, 2019 at 12:6 Comment(0)
-37

Take it easy: you don't have to; it is not necessary. -Wall and -Werror were designed by code-refactoring maniacs for themselves: they were invented by compiler developers to avoid breaking existing builds after compiler or programming-language updates on the user side. The feature is nothing but a decision about whether or not to break the build.

It is totally up to your preference to use it or not. I use it all the time because it helps me to fix my mistakes.

Reedbuck answered 9/9, 2019 at 5:3 Comment(6)
Although it is not mandatory, it is highly recommended to use themAerify
-Wall and -Werror was designed by code-refactoring maniacs for themselves. [citation needed]Newborn
It seems like you're contradicting yourself. If you "use it all the time because it helps to fix [your] mistakes," isn't it worth teaching to newer programmers so that they'll be doing it everywhere from the get go? I don't think this question is asking whether or not it's possible to compile without -Wall and -Werror, it's just asking if it's a good idea. Which, from your last sentence, it sounds like you're saying it is.Moa
When you get more experience with maintaining code not written by you, revisit this answer.Loraineloralee
This is not a helpful answer. There are 4 question marks in the OPs question. How many does this reply answer?Dehnel
Re "was designed by code-refactoring maniacs for themselves": Are you trolling? Extreme programming is from about year 2000. Surely, compilers had warnings before then? Don't they rely on unit tests, not compiler warnings?Birdiebirdlike
