The problem is exactly what your compiler is telling you: you're not allowed to initialise VLAs. Zack gave the obvious solution in the comments: remove the initialisation. You'll find working examples in this answer, some of which permit an initialisation and others which don't; there's more information about that in the comments. The examples are ordered from most sensible (IMHO) to least sensible (the ones involving malloc) for allocating storage for decimal digit sequences representing numbers.
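For context, a minimal sketch of the kind of declaration that provokes the diagnostic (the variable names here are made up for illustration):
int n = 13;
char bad[n] = ""; /* error: n isn't a constant expression, so bad is a VLA, and a VLA can't have an initialiser */
char ok[n];       /* fine: the same VLA without an initialiser */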
I suggest using the same trick to determine how many bytes are necessary to store an int value as decimal digits as you'd use for octal: divide the total number of bits in an int by 3 and add room for the sign and the NUL terminator. digit_count could be written as a preprocessor macro like so:
#include <limits.h>
#include <stddef.h>
#include <stdio.h>
#define digit_count(num) (1 /* sign */ \
    + sizeof (num) * CHAR_BIT / 3 /* digits */ \
    + (sizeof (num) * CHAR_BIT % 3 > 0) /* remaining digit */ \
    + 1) /* NUL terminator */
int main(void) {
    short short_number = -32767;
    int int_number = 32767;
    char short_buffer[digit_count(short_number)] = { 0 }; /* initialisation permitted here */
    char int_buffer[digit_count(int_number)];
    sprintf(short_buffer, "%d", short_number);
    sprintf(int_buffer, "%d", int_number);
}
As you can see, one powerful benefit here is that digit_count can be used for any integer type without modification: char, short, int, long, long long, and the corresponding unsigned types.
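To illustrate that (a small sketch of my own, not from the original code; it assumes the macro above is in scope and would live inside main or another function):
long long big_number = -9223372036854775807LL;
unsigned int unsigned_number = 4294967295u;
char big_buffer[digit_count(big_number)];           /* sized for a 64-bit value, sign and NUL included */
char unsigned_buffer[digit_count(unsigned_number)]; /* works just as well for unsigned types */
sprintf(big_buffer, "%lld", big_number);
sprintf(unsigned_buffer, "%u", unsigned_number);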
One minor downside by comparison is that you waste a few bytes of storage, particularly for small values like 1. In many cases the simplicity of this solution more than makes up for that; the code required to count the decimal digits at runtime will occupy more space in memory than is wasted here.
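As a rough worked example, assuming a 32-bit int with 8-bit bytes: digit_count yields 1 + 32/3 + 1 + 1 = 13 bytes, while the string "1" only needs 2 of them, so at most around a dozen bytes sit unused per buffer.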
If you're prepared to throw away the simplicity and generic qualities of the above code, and you really want to count the number of decimal digits, Zack's advice applies: remove the initialisation. Here's an example:
#include <stddef.h>
#include <stdio.h>
size_t digit_count(int num) {
    return snprintf(NULL, 0, "%d", num) + 1;
}
int main(void) {
    int number = 32767;
    char buffer[digit_count(number)]; /* Erroneous initialisation removed as per Zack's advice */
    sprintf(buffer, "%d", number);
}
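If you'd rather not call snprintf just to measure the length, a plain division loop is a common alternative. This is only a sketch of that idea, not part of the original answer; digit_count_loop is a made-up name, and size_t comes from the <stddef.h> already included above:
size_t digit_count_loop(int num) {
    /* Work on the magnitude as unsigned so negating INT_MIN can't overflow. */
    unsigned int magnitude = num < 0 ? 0u - (unsigned int)num : (unsigned int)num;
    size_t count = 2; /* one byte for the first digit, one for the NUL terminator */
    if (num < 0)
        count++; /* room for the '-' sign */
    while (magnitude /= 10) /* each further division by 10 strips one more digit */
        count++;
    return count;
}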
In response to the malloc recommendations: the least horrible way to solve this problem is to avoid unnecessary code (e.g. calls to malloc and, later on, free). If you don't have to return the object from a function, then don't use malloc! Otherwise, consider storing into a buffer provided by the caller (via arguments) so that the caller can choose which kind of storage to use. It's very rare that this isn't an appropriate alternative to using malloc.
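To illustrate the caller-provided-buffer style (a sketch only; format_number is a made-up name, and the snprintf-based body is just one way to fill the buffer):
int format_number(char *buffer, size_t size, int number) {
    /* The caller chooses the storage: an automatic array, a static buffer,
       or memory from malloc. size keeps the write within bounds. */
    return snprintf(buffer, size, "%d", number);
}
/* e.g. with an automatic array sized by either digit_count above: */
char local_buffer[digit_count(number)];
format_number(local_buffer, sizeof local_buffer, number);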
If you do decide to use malloc and free for this, however, do it the least horrible way: avoid typecasts on the return value of malloc and multiplications by sizeof (char) (which is always 1). The following code is an example; use either of the above methods to calculate the length:
char *buffer = malloc(digit_count(number)); /* initialisation of malloc'd bytes isn't possible */
sprintf(buffer, "%d", number);
... and don't forget to free(buffer); when you're done with it.
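For completeness, a slightly fuller sketch of the same approach; the NULL check and the puts call are my additions, and it assumes <stdlib.h> (for malloc and free) and <stdio.h> are included:
char *buffer = malloc(digit_count(number));
if (buffer != NULL) {
    sprintf(buffer, "%d", number);
    puts(buffer); /* use the string while it's still allocated */
    free(buffer); /* then release the storage */
}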
= ""
.snprintf
doesn't care what's in its output buffer beforehand. – Philippiansgcc prog.c -std=c99
or replace new version gcc – Wellpreservedchar fileSizeStr[5] = "";
or using malloc worked. – Potshotmalloc
would be suggested here. – Schonfield= ""
? – Philippianschar buffer[getDigits(number)+1];
, it returned this:Kvothe cripto-tdes-actual # ./zack 27 -> Number: 27
. I'll change my chosen answer to the one @undefined behaviour provided, because it uses yours, it's the most complete one, gave alternate getDigits (including the macro), gave a quite detailed explanation of why using malloc in this case was unnecessary, and adviced to avoid typecasts and multiplication by 1. – Potshot