8 bits is called a "byte". What is 16 bits called? "Short"? "Word"?
And what about 32 bits? I know "int" is CPU-dependent; I'm interested in universally applicable names.
A byte is the smallest unit of data that a computer can work with. The C language defines char to be one "byte", which has CHAR_BIT bits. On most systems this is 8 bits.
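If you're curious what CHAR_BIT is on your own system, a minimal check (my own sketch, not part of the original answer) looks like this:

    #include <limits.h>
    #include <stdio.h>

    int main(void)
    {
        /* sizeof(char) is 1 by definition; CHAR_BIT says how many bits that byte holds */
        printf("CHAR_BIT = %d\n", CHAR_BIT);
        printf("sizeof(char) = %zu\n", sizeof(char));
        return 0;
    }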
A word, on the other hand, is usually the size of the values typically handled by the CPU. Most of the time, this is the size of the general-purpose registers. The problem with this definition is that it doesn't age well.
For example, the MS Windows WORD datatype was defined back in the early days, when 16-bit CPUs were the norm. When 32-bit CPUs came around, the definition stayed, and a 32-bit integer became a DWORD. And now we have 64-bit QWORDs.
Far from "universal", but here are several different takes on the matter:
BYTE
- 8 bits, unsignedWORD
- 16 bits, unsignedDWORD
- 32 bits, unsigned QWORD
- 64 bits, unsignedGDB:
uint8_t
- 8 bits, unsigneduint16_t
- 16 bits, unsigneduint32_t
- 32 bits, unsigneduint64_t
- 64 bits, unsigneduintptr_t
- pointer-sized integer, unsigned(Signed types exist as well.)
If you're trying to write portable code that relies upon the size of a particular data type (e.g. you're implementing a network protocol), always use <stdint.h>.
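For example, a protocol header declared with fixed-width types has the same layout intent on every platform (a minimal sketch; the struct and helper below are made up for illustration):

    #include <stdint.h>

    /* Hypothetical protocol header: every field has a known, fixed width. */
    struct packet_header {
        uint16_t version;   /* 16 bits on every platform */
        uint16_t flags;
        uint32_t length;    /* 32 bits on every platform */
    };

    /* Serialize a 32-bit value in big-endian (network) byte order,
       byte by byte, so host endianness does not matter either. */
    static void put_u32_be(uint8_t *out, uint32_t v)
    {
        out[0] = (uint8_t)(v >> 24);
        out[1] = (uint8_t)(v >> 16);
        out[2] = (uint8_t)(v >> 8);
        out[3] = (uint8_t)(v >> 0);
    }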
There's no universal name for 16-bit or 32-bit units of measurement.
The term 'word' is used to describe the number of bits processed at a time by a program or operating system. So, in a 16-bit CPU, the word length is 16 bits. In a 32-bit CPU, the word length is 32 bits. I also believe the term is a little flexible, so if I write a program that does all its processing in chunks of say, 10 bits, I could refer to those 10-bit chunks as 'words'.
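As a toy illustration of that last point (my own sketch, not from the answer): if a program really did process data in 10-bit "words", extracting them from a packed byte buffer might look like this:

    #include <stddef.h>
    #include <stdint.h>

    /* Return the n-th 10-bit "word" from a packed buffer, MSB-first bit order. */
    static uint16_t get_10bit_word(const uint8_t *buf, size_t n)
    {
        size_t bit = n * 10;
        uint16_t value = 0;
        for (int i = 0; i < 10; i++, bit++) {
            uint8_t b = (buf[bit / 8] >> (7 - (bit % 8))) & 1u;
            value = (uint16_t)((value << 1) | b);
        }
        return value;
    }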
And just to be clear: 'int' is not a unit of measurement for computer memory. It is just the data type used to store integer numbers (i.e. numbers with a fractional component of zero). So if you find a way to implement integers using only 2 bits (or whatever) in your programming language, that would still be an int.
Dr. Werner Buchholz coined the word byte to mean, "a unit of digital information to describe an ordered group of bits, as the smallest amount of data that a computer could process." Therefore, the word's actual meaning is dependent on the machine in question's architecture. The number of bits in a byte is therefore arbitrary, and could be 8, 16, or even 32.
For a thorough dissertation on the subject, refer to Wikipedia.
CHAR_BIT is 16 or 32 on some systems: https://mcmap.net/q/75761/-is-char_bit-ever-gt-8 – Trinitrocresol

short, word and int are all dependent on the compiler and/or architecture.
int is a datatype and is usually 32-bit on desktop 32-bit or 64-bit systems. I don't think it's ever larger than the register size of the underlying hardware, so it should always be a fast (and usually large enough) datatype for common uses.

short may be of smaller size than int; that's all you know. In practice, it's usually 16-bit, but you cannot depend on it.

word is not a datatype; it rather denotes the natural register size of the underlying hardware.

And regarding the names of 16 or 32 bits, there aren't any. There is no reason to label them.
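A quick way to see what these come out to for a given compiler and architecture (again my own sketch, not part of the answer):

    #include <stdio.h>

    int main(void)
    {
        /* All implementation-defined: the standard only mandates minimum
           ranges (short and int at least 16 bits, long at least 32). */
        printf("short : %zu bytes\n", sizeof(short));
        printf("int   : %zu bytes\n", sizeof(int));
        printf("long  : %zu bytes\n", sizeof(long));
        printf("void* : %zu bytes\n", sizeof(void *));
        return 0;
    }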
int has been required to be at least 16 bits (by the "at least" requirements on INT_MIN and INT_MAX). So on a sub-16-bit CPU, int will be larger than the registers. (At least, it will in strict-ANSI mode...) – Almsman
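That minimum is easy to check at compile time (my own illustration, assuming a C11 compiler):

    #include <limits.h>

    /* The C standard requires INT_MAX >= 32767 and INT_MIN <= -32767,
       so int must be able to represent at least a 16-bit range. */
    _Static_assert(INT_MAX >= 32767, "int narrower than the standard allows");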
(Nigh 11 years later...)
Wikipedia offers several names on its Units of Information page, including "doublet" and "chomp" for 16-bit quantities, and "quadlet" and "octlet" for 32- and 64-bit quantities.
Of note, many of these terms are cited to be from either manufacturers' documentation or from standards organizations (such as "IEEE Standard for a 32-bit Microprocessor Architecture", which defines doublets, quadlets, octlets, and hexlets).
Also of note, some of these terms are specified in terms of bytes rather than bits, so they are still technically imprecise, as multiple other answers have cautioned about.
I particularly liked the "sniff," the "snort," and the "tribble" for 1-, 2-, and 3-bit elements. I'll have to make an effort to work those into my conversations...
I used to hear them referred to as byte, word and long word. But as others have mentioned, it is dependent on the native architecture you are working on.
There aren't any universal terms for 16 and 32 bits. The size of a word is machine dependent.