I've seen the numerous questions about counting the number of set bits in a value of some type, but why is it useful?
For those looking for bit-counting algorithms, look here:
You can regard a string of bits as a set, with a 1 representing membership of the set for the corresponding element. The bit count therefore gives you the population count (cardinality) of the set.
Practical applications include compression, cryptography and error-correcting codes. See e.g. wikipedia.org/wiki/Hamming_weight and wikipedia.org/wiki/Hamming_distance.
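As a minimal sketch of both ideas in C (assuming a compiler such as GCC or Clang that provides the __builtin_popcount intrinsic; otherwise substitute any bit-counting routine from the linked questions): the population count of a word gives the cardinality of the set it encodes, and the population count of an XOR gives the Hamming distance between two words.

#include <stdio.h>

int main(void)
{
    unsigned int a = 0x2Cu;  /* bits 2,3,5 set -> the set {2, 3, 5} */
    unsigned int b = 0x29u;  /* bits 0,3,5 set -> the set {0, 3, 5} */

    /* Population count = cardinality of the set. */
    printf("|a| = %d\n", __builtin_popcount(a));            /* 3 */

    /* Hamming distance = number of bit positions where a and b differ. */
    printf("dist(a,b) = %d\n", __builtin_popcount(a ^ b));  /* 2 */

    return 0;
}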
If you're rolling your own parity scheme, you might want to count the number of bits. (In general, of course, I'd rather use somebody else's.) If you're emulating an old computer and want to keep track of how fast it would have run on the original, some had multiplication instructions whose speed varied with the number of 1 bits.
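For the parity case, a hedged sketch (again assuming __builtin_popcount; the function name parity_bit is just for illustration): a simple even-parity bit is the bit count modulo 2.

/* Even parity: the parity bit makes the total number of 1 bits even. */
unsigned int parity_bit(unsigned int data)
{
    return __builtin_popcount(data) & 1u;
}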
I can't think of any time I've wanted to do it over the past ten years or so, so I suspect this is more of a programming exercise than a practical need.
In an ironic sort of fashion, it's useful for an interview question because it requires some detailed low-level thinking and doesn't seem to be taught as a standard algorithm in comp sci courses.
Some people like to use bitmaps to indicate presence/absence of "stuff".
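A small illustration of that (the FEATURE_* names are invented for the example): each bit records whether one item is present, and the bit count tells you how many are present.

enum { FEATURE_A = 1u << 0, FEATURE_B = 1u << 1, FEATURE_C = 1u << 2 };

unsigned int enabled = FEATURE_A | FEATURE_C;   /* A and C present, B absent */
int has_b   = (enabled & FEATURE_B) != 0;       /* 0: B is absent            */
int n_flags = __builtin_popcount(enabled);      /* 2: how many are enabled   */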
There's a simple hack to find the least-significant 1 bit in a word: x XOR (x-1) turns that bit and all the bits below it into a field of ones, so you can find the bit's position by counting the 1-bits and subtracting one.
countbits(x XOR (x-1)) - 1;
Watch it work.
Let x       = 00101100
Then x-1    = 00101011
x XOR (x-1) = 00000111
Which has 3 bits set, so bit 2 was the least-significant 1-bit in the original word.
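A runnable sketch of the same trick (assuming GCC/Clang's __builtin_popcount; countbits above stands for whatever bit-count routine you prefer):

#include <stdio.h>

/* Index of the least-significant 1 bit, using the x ^ (x - 1) trick.  */
/* x ^ (x - 1) is a run of ones from bit 0 up to the lowest set bit,   */
/* so its population count is that bit's index plus one.               */
/* (Not meaningful for x == 0, which has no set bit.)                  */
int lowest_set_bit(unsigned int x)
{
    return __builtin_popcount(x ^ (x - 1)) - 1;
}

int main(void)
{
    printf("%d\n", lowest_set_bit(0x2Cu));  /* 0x2C = 00101100 -> prints 2 */
    return 0;
}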
Note that many CPUs have a dedicated hardware instruction for this (POPCNT or similar). – Muster