I was expecting the following comparison to give an error:
var A = B = 0;
if (A == B == 0)
    console.log(true);
else
    console.log(false);
but strangely it logs false.
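Breaking the condition into steps seems to show what is going on; here is a small check I tried (the variable name firstResult is just mine, for illustration):

var A = B = 0;
var firstResult = (A == B);     // true, since 0 == 0
console.log(firstResult);       // true
console.log(firstResult == 0);  // false, true appears to be coerced to 1 before comparing with 0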
Even more strangely,
console.log((A == B == 1));
logs true.
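The same breakdown seems to explain the second case (again, firstResult is just an illustrative name):

var A = B = 0;
var firstResult = (A == B);     // true
console.log(firstResult == 1);  // true, because true coerces to the number 1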
How does this "ternary"-style comparison actually work?
if (A == 0 && B == 0) – Dogfish
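For reference, the comment's suggestion written out as a full statement (just my reading of it):

var A = B = 0;
if (A == 0 && B == 0)
    console.log(true);   // logs true: each variable is compared to 0 separately
else
    console.log(false);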