I'm having trouble understanding the bit logic in these two functions.
I don't understand why we check the condition (bitVector & mask) == 0. Also, why do we OR the bitVector with the mask when that condition holds, and AND it with ~mask otherwise? Finally, why does the property hold that one can "check that exactly one bit is set by subtracting one from the integer and ANDing it with the original integer"?
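For example, as far as I can trace it by hand: with index = 2, mask = 1 << 2 = 0b0100, so (bitVector & mask) == 0 seems to test bit 2 in isolation. And for the one-bit property: 8 = 0b1000 has exactly one bit set, 8 - 1 = 7 = 0b0111, and 8 & 7 = 0; but 6 = 0b0110 has two bits set, 6 - 1 = 5 = 0b0101, and 6 & 5 = 0b0100, which is nonzero. Is that the idea?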
Full code below:
/* Toggle the ith bit in the integer. */
public static int toggle(int bitVector, int index) {
    if (index < 0) return bitVector;
    int mask = 1 << index;
    if ((bitVector & mask) == 0) {
        bitVector |= mask;   // bit was 0: OR with the mask sets it to 1
    } else {
        bitVector &= ~mask;  // bit was 1: AND with ~mask clears it to 0
    }
    return bitVector;
}
/* Check that exactly one bit is set by subtracting one from the
 * integer and ANDing it with the original integer.
 * (Note: as written, this also returns true for 0, which has no bits set.) */
public static boolean checkExactlyOneBitSet(int bitVector) {
    return (bitVector & (bitVector - 1)) == 0;
}
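To make the behavior concrete, here is a small driver I sketched myself (just my own test, assuming both methods sit in the same class so main can call them directly):

public static void main(String[] args) {
    int v = 0b1010;  // bits 1 and 3 set
    System.out.println(Integer.toBinaryString(toggle(v, 0)));  // prints 1011 (bit 0 was 0, now set)
    System.out.println(Integer.toBinaryString(toggle(v, 1)));  // prints 1000 (bit 1 was 1, now cleared)
    System.out.println(checkExactlyOneBitSet(0b0100));         // prints true  (exactly one bit set)
    System.out.println(checkExactlyOneBitSet(0b0110));         // prints false (two bits set)
}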
In the toggle function, would return bitVector ^ (~bitVector ^ mask); do the job? – Milo