Create char array of integer using digits as size
Asked Answered
P

5

7

I am trying to create a char array in C and fill it with the digits of an int, but the int can have any number of digits.

I'm using a function I wrote, getDigits(int num), which returns the number of digits the int has.

char buffer[getDigits(number)] = "";
snprintf(buffer, sizeof(buffer),"%d",number);

but when I compile with gcc, it reports:

error: variable-sized object may not be initialized

I've tried everything. When I declare it as char fileSizeStr[5] = "";, it works. I can see the problem arises when I try to declare the buffer size dynamically, but I would really like to know if there is a way of achieving this.

Potshot answered 7/6, 2013 at 0:25 Comment(9)
Use a C99 compiler, or use malloc. – Wellpreserved
Try removing the = "". snprintf doesn't care what's in its output buffer beforehand. – Philippians
gcc prog.c -std=c99, or upgrade to a newer version of gcc. – Wellpreserved
Thanks, using malloc worked great! – Potshot
@Zack I tried that before, and using sprintf too. But at that point only declaring the buffer as char fileSizeStr[5] = ""; or using malloc worked. – Potshot
@Zack Write an answer... It's a joke that malloc would be suggested here. – Schonfield
@Potshot Exactly what happened when you removed the = ""? – Philippians
Let us see if anyone can come up with a better answer than what I provided... – Schonfield
@Zack, I really don't see why it didn't work before, but your solution works now. Without the initialisation, and with the buffer declared as char buffer[getDigits(number)+1];, it returned this: Kvothe cripto-tdes-actual # ./zack 27 -> Number: 27. I'll change my chosen answer to the one @undefined behaviour provided, because it uses yours, it's the most complete one, gives alternative getDigits implementations (including the macro), gives a quite detailed explanation of why using malloc in this case was unnecessary, and advises avoiding typecasts and multiplying by 1. – Potshot
S
16

The problem is exactly what your compiler is telling you: you're not allowed to initialise VLAs. Zack gave the obvious solution in the comments: remove the initialisation. You'll find working examples in this answer, some of which do permit an initialisation and others which don't; there's more information about that in the comments. The following examples are ordered from most sensible (IMHO) to least sensible (the least sensible ones involve malloc) for allocating storage for decimal digit sequences representing numbers.


I suggest using the same trick you'd use for octal to determine how many bytes are necessary to store an int value as decimal digits: divide the total number of bits in the value by three (every three bits correspond to one octal digit, and the decimal representation is never longer than the octal one), then add room for a possible sign and the NUL terminator. digit_count can be written as a preprocessor macro like so:

#include <limits.h>
#include <stddef.h>
#include <stdio.h>

#define digit_count(num) (1                                /* sign            */ \
                        + sizeof (num) * CHAR_BIT / 3      /* digits          */ \
                        + (sizeof (num) * CHAR_BIT % 3 > 0)/* remaining digit */ \
                        + 1)                               /* NUL terminator  */

int main(void) {
    short short_number = -32767;
    int int_number = 32767;
    char short_buffer[digit_count(short_number)] = { 0 }; /* initialisation permitted here: digit_count expands to a constant expression, so this is not a VLA */
    char int_buffer[digit_count(int_number)];
    sprintf(short_buffer, "%d", short_number);
    sprintf(int_buffer, "%d", int_number);
}

As you can see, one powerful benefit here is that digit_count can be used for any type of integer without modification: char, short, int, long, long long, and the corresponding unsigned types.

One minor downside by comparison is that you waste a few bytes of storage, particularly for small values like 1. In many cases the simplicity of this solution more than makes up for that; the code required to count the decimal digits at runtime will occupy more space in memory than is wasted here.


If you're prepared to throw away the simplicity and generic qualities of the above code, and you really want to count the number of decimal digits, Zack's advice applies: remove the initialisation. Here's an example:

#include <stddef.h>
#include <stdio.h>

size_t digit_count(int num) {
    return snprintf(NULL, 0, "%d", num) + 1;
}

int main(void) {
    int number = 32767;
    char buffer[digit_count(number)]; /* Erroneous initialisation removed as per Zack's advice */
    sprintf(buffer, "%d", number);
}

In response to the malloc recommendations: the least horrible way to solve this problem is to avoid unnecessary code (e.g. calls to malloc and, later on, free). If you don't have to return the object from a function, then don't use malloc! Otherwise, consider storing into a buffer provided by the caller (via arguments), so that the caller can choose which kind of storage to use. It's very rare that this isn't an appropriate alternative to using malloc.
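
To illustrate the caller-provided-buffer pattern, here's a rough sketch; the format_int name and signature are purely illustrative (nothing standard), and it reuses the digit_count macro from above:

#include <limits.h>
#include <stddef.h>
#include <stdio.h>

#define digit_count(num) (1 + sizeof (num) * CHAR_BIT / 3 \
                          + (sizeof (num) * CHAR_BIT % 3 > 0) + 1)

/* Sketch only: the caller supplies the buffer and its size, so the caller
   decides whether it lives in automatic, static, or heap storage. */
static int format_int(char *buf, size_t size, int num) {
    return snprintf(buf, size, "%d", num);
}

int main(void) {
    int number = 32767;
    char buffer[digit_count(number)]; /* the caller chose automatic storage */
    format_int(buffer, sizeof buffer, number);
    puts(buffer);
}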

If you do decide to use malloc and free for this, however, do it the least horrible way. Avoid typecasts on the return value of malloc and multiplications by sizeof (char) (which is always 1). The following code is an example. Use either of the above methods to calculate the length:

char *buffer = malloc(digit_count(number)); /* Initialisation of malloc bytes not possible */
sprintf(buffer, "%d", number);

... and don't forget to free(buffer); when you're done with it.

Schonfield answered 7/6, 2013 at 2:45 Comment(0)
D
5

try something like:

 char* buffer =(char *)malloc(getDigits(number)*sizeof(char));

malloc and calloc are used for dynamic allocation.

Digitalism answered 7/6, 2013 at 0:32 Comment(2)
1. Don't cast malloc. 2. There is no point multiplying by sizeof (char), because sizeof (char) is always 1 in C. – Schonfield
Hmm, interesting. From what I've just searched, in C we really don't have to cast in this case, but in C++ programs we do. Thanks for the information! – Digitalism
M
4

For my money, there is one solution which has gone unmentioned but which is actually simpler than any of the above. There is a combined allocating version of sprintf called asprintf, available on Linux and most BSD variants. It determines the necessary size, mallocs the memory, and returns the filled string via its first argument.

#define _GNU_SOURCE   /* glibc declares asprintf only when this (or similar) is defined */
#include <stdio.h>
#include <stdlib.h>   /* for free */
char *a;
asprintf(&a, "%d", 132);
/* use a */
free(a);

Using a stack-allocated array certainly removes the need for the free, but asprintf completely obviates the need to ever calculate the size separately.

Manyplies answered 14/6, 2013 at 17:28 Comment(1)
note: on systems without asprintf, it's easy to roll your own in C99 using two calls to snprintf – Hilliary
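
For anyone curious, a rough sketch of that approach (the my_asprintf name is made up here, and vsnprintf stands in for snprintf so the format arguments can be forwarded):

#include <stdarg.h>
#include <stdio.h>
#include <stdlib.h>

/* Rough sketch of an asprintf substitute: one vsnprintf call measures the
   length, a second one formats into a freshly malloc'd buffer. Returns the
   length, or -1 on failure; the caller frees *strp on success. */
int my_asprintf(char **strp, const char *fmt, ...) {
    va_list ap;
    va_start(ap, fmt);
    int len = vsnprintf(NULL, 0, fmt, ap); /* measure */
    va_end(ap);
    if (len < 0 || (*strp = malloc((size_t)len + 1)) == NULL)
        return -1;
    va_start(ap, fmt);
    vsnprintf(*strp, (size_t)len + 1, fmt, ap); /* format */
    va_end(ap);
    return len;
}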
A
3

The code below may help:

char* buffer;
buffer = (char*)malloc(number * sizeof(char));
Appose answered 7/6, 2013 at 0:34 Comment(5)
1. Don't cast malloc. 2. There is no point multiplying by sizeof (char), because sizeof (char) is always 1 in C. – Schonfield
Why not cast malloc? I think that's the only way to allocate memory in old C. – Appose
@mmsc Why you shouldn't cast the value returned by malloc – Kindless
@NigelHarper, I think what I wrote is casting the return value. – Appose
@mmsc Yes, you're casting the return value of malloc, and you shouldn't, for the reasons explained in the answer I linked. – Kindless
S
2

You'll need to use malloc to allocate a dynamic amount of memory.

Initializing the way you did is allowed only if the size is known at compile time.
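
A minimal sketch of what that looks like in practice; snprintf(NULL, 0, ...) stands in for the question's getDigits so the example is self-contained:

#include <stdio.h>
#include <stdlib.h>

int main(void) {
    int number = 12345;
    int digits = snprintf(NULL, 0, "%d", number); /* stand-in for the question's getDigits */

    char fixed[5] = "";                /* compile-time size: initialiser allowed       */
    char *buffer = malloc(digits + 1); /* dynamic size: allocate, then fill at runtime */
    if (buffer != NULL) {
        snprintf(buffer, digits + 1, "%d", number);
        puts(buffer);
        free(buffer);
    }
    (void)fixed; /* silence unused-variable warnings */
}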

Senarmontite answered 7/6, 2013 at 0:29 Comment(1)
Thanks! I can't believe I completely forgot about using malloc. – Potshot
