The name of 16 and 32 bits

8 bits is called a "byte". What is 16 bits called? "Short"? "Word"?

And what about 32 bits? I know "int" is CPU-dependent; I'm interested in universally applicable names.

Epithelium answered 6/1, 2013 at 10:5 Comment(1)
"I'm interested in universally applicable names." Then the universally applicable names you should use are "16 bits" and "32 bits". If your context precludes using "byte" to mean specifically 8 bits, you can use "8 bits" or "octet". – Schaffel

A byte is the smallest unit of data that a computer can work with. The C language defines char to be one "byte", and a char has CHAR_BIT bits. On most systems this is 8 bits.

A word, on the other hand, is usually the size of the values the CPU typically handles. Most of the time, this is the size of the general-purpose registers. The problem with this definition is that it doesn't age well.

For example, the MS Windows WORD datatype was defined back in the early days, when 16-bit CPUs were the norm. When 32-bit CPUs came around, the definition stayed, and a 32-bit integer became a DWORD. And now we have 64-bit QWORDs.

Far from "universal", but here are several different takes on the matter:

Windows:

  • BYTE - 8 bits, unsigned
  • WORD - 16 bits, unsigned
  • DWORD - 32 bits, unsigned
  • QWORD - 64 bits, unsigned

GDB:

  • Byte
  • Halfword (two bytes)
  • Word (four bytes)
  • Giant word (eight bytes)

<stdint.h>:

  • uint8_t - 8 bits, unsigned
  • uint16_t - 16 bits, unsigned
  • uint32_t - 32 bits, unsigned
  • uint64_t - 64 bits, unsigned
  • uintptr_t - pointer-sized integer, unsigned

(Signed types exist as well.)

If you're trying to write portable code that relies upon the size of a particular data type (e.g. you're implementing a network protocol), always use <stdint.h>.
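
For example, here is a minimal sketch of that advice in practice (the packet layout below is invented purely for illustration, not taken from any real protocol):

    #include <stdint.h>
    #include <stdio.h>

    /* A hypothetical packet header -- the fields and layout are made up
       for illustration. The fixed-width types guarantee each field's
       exact size on every conforming platform. */
    struct packet_header {
        uint8_t  version;   /* exactly 8 bits  */
        uint16_t length;    /* exactly 16 bits */
        uint32_t sequence;  /* exactly 32 bits */
    };

    int main(void) {
        struct packet_header h = { 1, 512, 42 };
        printf("version:  %zu byte(s)\n", sizeof h.version);
        printf("length:   %zu byte(s)\n", sizeof h.length);
        printf("sequence: %zu byte(s)\n", sizeof h.sequence);
        printf("sequence value: %u\n", (unsigned)h.sequence);
        return 0;
    }

Note that sending such a struct over the wire verbatim raises separate questions (struct padding, byte order); <stdint.h> only pins down the field widths.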

Trinitrocresol answered 6/1, 2013 at 10:19 Comment(1)
Spot on. These terms are extremely important when dealing with computer architecture and assembly language (low-level programming). – Excellency

The correct name for a group of exactly 8 bits is really an octet. A byte may have more or fewer than 8 bits (although this is relatively rare).

Beyond this, there are no rigorously well-defined terms for 16 bits, 32 bits, etc., as far as I know.

Zechariah answered 6/1, 2013 at 10:7 Comment(0)

There's no universal name for 16-bit or 32-bit units of measurement.

The term 'word' is used to describe the number of bits processed at a time by a program or the operating system. So, on a 16-bit CPU the word length is 16 bits; on a 32-bit CPU it is 32 bits. I also believe the term is a little flexible: if I write a program that does all its processing in chunks of, say, 10 bits, I could refer to those 10-bit chunks as 'words'.

And just to be clear: 'int' is not a unit of measurement for computer memory. It is just the data type used to store integer numbers (i.e. numbers with no fractional part). So if you find a way to implement integers using only 2 bits (or whatever) in your programming language, those would still be ints.

Farleigh answered 6/1, 2013 at 10:18 Comment(0)

Dr. Werner Buchholz coined the word byte to mean "a unit of digital information to describe an ordered group of bits, as the smallest amount of data that a computer could process." The word's actual meaning therefore depends on the architecture of the machine in question. The number of bits in a byte is arbitrary, and could be 8, 16, or even 32.

For a thorough dissertation on the subject, refer to Wikipedia.

Arnulfo answered 6/1, 2013 at 10:14 Comment(2)
As the Wikipedia article says, the de facto modern usage is that a "byte" is always 8 bits (even for machines that use 32- or 64-bit words). Other definitions of byte did float around in the early days of computing, but those are effectively archaic usages now. – Lyricist
@Lyricist I wouldn't say "archaic"; there are DSP chips where CHAR_BIT is 16 or 32: https://mcmap.net/q/75761/-is-char_bit-ever-gt-8 – Trinitrocresol

short, word, and int are all dependent on the compiler and/or architecture.

  • int is a data type, usually 32 bits wide on 32-bit and 64-bit desktop systems. I don't think it's ever larger than the register size of the underlying hardware, so it should always be a fast (and usually large enough) data type for common uses.
  • short may be smaller than int; that's all you know. In practice, shorts are usually 16 bits, but you cannot depend on it.
  • word is not a data type; rather, it denotes the natural register size of the underlying hardware.

And regarding the names of 16 or 32 bits, there aren't any. There is no reason to label them.
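
For what it's worth, here's a quick sketch to print what your own compiler and architecture actually use (the output is implementation-defined; a typical 64-bit desktop build prints 8, 2, 4, 8):

    #include <limits.h>   /* CHAR_BIT */
    #include <stdio.h>

    int main(void) {
        /* All of these are implementation-defined; the C standard only
           guarantees minimum ranges, not exact sizes. */
        printf("CHAR_BIT      = %d bits per byte\n", CHAR_BIT);
        printf("sizeof(short) = %zu bytes\n", sizeof(short));
        printf("sizeof(int)   = %zu bytes\n", sizeof(int));
        printf("sizeof(long)  = %zu bytes\n", sizeof(long));
        return 0;
    }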

Enwreathe answered 6/1, 2013 at 10:17 Comment(1)
Since C89, int has been required to be at least 16 bits (by the "at least" requirements on INT_MIN and INT_MAX). So on a sub-16-bit CPU, int will be larger than the registers. (At least, it will in strict-ANSI mode...) – Almsman

(Nigh 11 years later...)

Wikipedia offers several names on its Units of Information page, including "doublet" and "chomp" for 16-bit quantities, and "quadlet" and "octlet" for 32- and 64-bit quantities.

Of note, many of these terms are cited as coming from manufacturers' documentation or from standards organizations (such as "IEEE Standard for a 32-bit Microprocessor Architecture", which defines doublets, quadlets, octlets, and hexlets).

Also of note, some of these terms are specified in terms of bytes rather than bits, so they are still technically imprecise, as several other answers have cautioned.

I particularly liked the "sniff," the "snort," and the "tribble" for 1-, 2-, and 3-bit elements. I'll have to make an effort to work those into my conversations...

Almsman answered 7/10, 2023 at 2:49 Comment(0)

I used to hear them referred to as byte, word, and long word. But as others mention, it is dependent on the native architecture you are working on.

Bowsprit answered 6/1, 2013 at 10:8 Comment(0)

They are called 2 bytes and 4 bytes.

Bloodstained answered 6/1, 2013 at 10:8 Comment(0)

There aren't any universal terms for 16 and 32 bits. The size of a word is machine-dependent.

Pugilism answered 6/1, 2013 at 10:9 Comment(0)
