Possible Duplicate:
Difference between different integer types
What is the difference between uint32 and uint32_t in C/C++? Are they OS-dependent? In which cases should I use one or the other?
uint32_t is standard, uint32 is not. That is, if you include <inttypes.h> or <stdint.h>, you will get a definition of uint32_t. uint32 is a typedef in some local code base, but you should not expect it to exist unless you define it yourself. And defining it yourself is a bad idea.
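For concreteness, a minimal sketch of the portable spelling, assuming a C++11-or-later toolchain; only <cstdint>, std::uint32_t, and UINT32_MAX come from the standard, the variable name is made up for this example:

#include <cstdint>    // declares std::uint32_t (exact-width type, optional in the standard)
#include <iostream>

int main() {
    std::uint32_t counter = 0;      // exactly 32 bits, unsigned, on platforms that provide it
    counter = UINT32_MAX;           // largest representable value, also from <cstdint>
    std::cout << counter << '\n';   // prints 4294967295
}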
Defining it yourself is a bad idea in the same way that #define writef printf is a bad idea: technically, things will work just fine, but it introduces unnecessary confusion. And you'd be surprised how frequently you'll see simple typedefs like this morph over time until a code base has things like typedef uint16_t uint32, at which point the universe explodes. – Hazing

Is typedef uint32_t uint32 really bad? Surely the argument that someone shouldn't use a sensible typedef because someone could create an insane typedef is wrong. E.g. we don't say pencils are evil because a crazy coworker could take a pencil and stab someone with it. – Duello
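To make the earlier comment's concern concrete, here is a hedged sketch of how a drifted alias can bite; the typedef and the value 70000 are invented for illustration and do not come from the thread:

#include <cstdint>
#include <iostream>

// Hypothetical legacy alias that drifted over time: the name says 32,
// but the underlying type is only 16 bits wide.
typedef std::uint16_t uint32;

int main() {
    uint32 value = 70000;           // silently wraps modulo 65536
    std::cout << value << '\n';     // prints 4464, not 70000
}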
uint32_t is defined in the standard, in the <cstdint> synopsis [cstdint.syn]:

namespace std {
    //...
    typedef unsigned integer type uint32_t; // optional
    //...
}

uint32 is not; it's a shortcut provided by some compilers (probably as typedef uint32_t uint32) for ease of use.
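If a code base insists on the short name anyway, the alias usually lives in a small project header along these lines; this is only an illustration, the file name compat_ints.hpp is invented and nothing here is mandated by any standard:

// compat_ints.hpp (hypothetical project header, not part of any standard)
#pragma once
#include <cstdint>

// Non-standard shorthand that some compilers and code bases provide.
// Prefer std::uint32_t in new code; the alias exists only so that
// sources already written against the short names keep compiling.
typedef std::uint32_t uint32;
typedef std::uint16_t uint16;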
<cstdint> was standard. – Agram
uint32_t. – Mellow